Hi guys.
Just curious: when you are running DNC, do you enter the "DNC" or "DNCX" command?
I've been using DNC for a couple of years without issue, and I'm wondering if there is any benefit in going to DNCX (with Xmodem smarts)?
Never heard of DNCX.
Let me know of any benefits as well.
www.integratedmechanical.ca
Shizzlemah--
The DNCX command is an old command that was required on the early machines. The machines of today need only the DNC command: if you send Xmodem protocol to them, they will recognize it and adjust automatically. The control recognizes only the 128-byte-packet Xmodem.
For all DNC I highly recommend using only Xmodem.
Neal
Neal,
First off, I *really* appreciate your contributions here. There were some comments in the manual about DNCX only being used with the -3 CPU, and that confused me thoroughly, especially since "MU" on my -5 shows both DNC and DNCX.
My DNC SW doesn't support Xmodem, so I tried a couple of terminal programs yesterday but didn't hit the right combination. Do I want 7E1 for PU, TA, DNC w/o Xmodem, #SPRINT, etc., and 8N1 only for DNC w/Xmodem? With Xmodem, do I need to start sending at the PC prior to initiating the receive at the Fadal?
Will anything impact my ability to use #SPRINT when not in DNC (or DNCX) mode?
My current procedure is to enter "CD,9" then "DNC" at the Fadal, load the program into the DNC SW, and send it to the Fadal using standard SW handshaking. Does this have to change at all if I send using Xmodem?
Thanks much!
Darebee,
In standard serial comm, there is a parity bit with every byte. It isn't super reliable, as it will not detect two (or four, or six) flipped bits, but when it detects a bad byte it will abort communications. Your DNC will stop.
With Xmodem, 128 characters are sent at a time with one checksum for the lot. Again, it can miss errors that happen to cancel out. But the 128-character block is also sent with a sequence number, and if the checksum doesn't match, the 128-char chunk is resent up to ten times (possibly more). This lets the comm link recover on its own.
Xmodem/128 was the earliest of the x/y/zmodem protocols. X-1k came later with a (drum roll please!) 1kbyte string.
Using a 19,200 baud rate is fine as long as you do not data-starve the control. Always use 7E1 with the Fadal control. When not in DNC mode, the #SPRINT statement requires no special settings.
Neal
Do I still use 7E1 with Xmodem? I thought Xmodem required 8 data bits and conventionally used no parity bit, but you said "Always use 7E1 with the Fadal Control."
The manual, chap. 14, "Software," "b" (pg 333), says 8-N-1 with Xmodem. Is that true? I'm certainly not confident; I still haven't got the magic combination yet!
071220-0951 EST USA
Shizzlemah:
You are correct that Xmodem requires 8, N, 1 or 2. Xmodem uses all 8 data bits. There is no parity because this would cause a parity error and interfere with the simple algorithm used in Xmodem. They could have used parity as a secondary check but did not.
Recent UARTs based on the 16550 always assume 1 stop bit on receive. Thus, the stop bit selection only affects the transmit direction, and using 2 stop bits slows your transfer time by 10%.
Xmodem on the Fadal control does NOT require 8N1. You may continue to use 7E1; you are not required to make any changes when moving from Xon/Xoff to Xmodem.
Neal
071221-0755 EST USA
Neal:
Your statement means that on receiving at the CNC under Xmodem you ignore the 8th bit, which for 7E1 is the parity bit, or you use it as parity and as supplemental information to the checksum. This might mean you set your UART to 8N1 and mask off the 8th bit, or the UART is set to 7E1 and you do something with the parity error if it occurs.
What happens if you use a generic Xmodem program at the computer end and send data from the CNC to the computer when the computer is set to 7E1?
gar--
We use the 128-byte packet with the header and footer, for a total 132-byte packet, and the 7E1 protocol is hard-coded in the EPROMs. Unless the internal software makes this change in the background when it receives Xmodem, the 7E1 remains in effect. If this change does occur in the background, it is against what I was led to believe by our Engineering group years ago.
I am having problems setting up communication between my computer and the 1988 Fadal 2040 I just acquired. Can someone explain the procedure? I am trying to use NCfadal, but I keep getting the message that no connection was found.
After countless attempts to figure out why xmodem wasn't working on my 1993 3016 (1400-4 controller board), I used 8N1 instead of 7E1 and it worked the very first time I tried.
Until trying 8N1, I kept getting "Input Xmodem Transmit Block Missed Error" when I attempted to transfer a file via xmodem.
I have this formula working on Linux:
(Linux) Download xtransfer.c from this Gist (thanks go to v3l0c1r4pt0r): https://gist.github.com/v3l0c1r4pt0r...le-xtransfer-c
(Linux) Compile with: gcc -o xtransfer xtransfer.c
(Fadal) Enter: CD,10
(Fadal) Enter: DNCX
(Fadal) Press Start
(Linux) Run: sudo ./xtransfer -t -d /dev/ttyS4 -b 38400 -p 8N1 -v /path/to/an/nc-file.nc
(Fadal) Once finished, press Slide Hold then Manual to exit DNC/DNCX mode.
At-Man turned me onto the idea of changing the connection settings. The PLC-based NAS/drip-feeder he describes works in two modes: 7E1 for commands (anything you would type after "ENTER NEXT COMMAND") and 8N1 for Xmodem transfer (file upload/download or DNC): https://www.youtube.com/watch?v=UJjYKtqog5Y
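As an alternative to the custom xtransfer.c, the stock lrzsz package's sx tool can do the same Xmodem send. A sketch only, assuming the same /dev/ttyS4 port and 38400 baud as above, with the control already sitting in DNCX waiting to receive:

```shell
# Configure the port: 38400 baud, 8 data bits, no parity, 1 stop bit, raw mode.
# (/dev/ttyS4, the baud rate, and the file path are assumptions; match your setup.)
stty -F /dev/ttyS4 38400 cs8 -parenb -cstopb raw

# Send the file with classic Xmodem (128-byte blocks); sx talks to the
# serial line through stdin/stdout redirection. -vv is verbose progress.
sx -vv /path/to/an/nc-file.nc < /dev/ttyS4 > /dev/ttyS4
```

Untested against a Fadal on my end, but it matches the 8N1 settings that worked for the poster above.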