I'm building a stepper driver using 5804B ICs, which are rated at 1.25A. The steppers I have are rated 5.1 Volts, 1.3 Amps, which I think means they have a winding resistance of 5.1/1.3 = 3.923 Ohms (I'll call this 4 Ohms).
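To show my working, here's the winding-resistance estimate as a quick sketch (the 5.1 V / 1.3 A figures are just the nameplate ratings from my motors):

```python
# Estimate winding resistance from the motor's nameplate ratings.
rated_voltage = 5.1   # volts, from the motor label
rated_current = 1.3   # amps, from the motor label

# Ohm's law: R = V / I
winding_resistance = rated_voltage / rated_current
print(round(winding_resistance, 3))  # ~3.923 ohms, which I round up to 4
```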
I have run one of the steppers from the +5V rail of a computer PSU and it worked OK, but it really needs a higher voltage.
If I use a 12V supply the motor would draw 3 Amps, blowing both itself and the 5804 up, so I need a current-limiting resistor. I could use a 6.8 Ohm resistor in series with the motor for a total of 10.8 Ohms: 12/10.8 = 1.11 Amps (nice and safe). But how do I calculate the power rating of the required resistor? Do I use the total resistance of 10.8 Ohms?
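The series-resistor calculation above, sketched out (the 6.8 Ohm value is just the resistor I happen to be considering, and 4 Ohms is my rounded estimate of the winding resistance):

```python
supply_voltage = 12.0      # volts
winding_resistance = 4.0   # ohms, rounded estimate from the motor ratings
series_resistance = 6.8    # ohms, the proposed current-limiting resistor

# Resistances in series simply add.
total_resistance = winding_resistance + series_resistance  # 10.8 ohms

# Ohm's law again: I = V / R
current = supply_voltage / total_resistance
print(round(current, 2))  # ~1.11 A, safely under the 5804's 1.25 A rating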
e.g. 12V / 10.8 Ohms = 1.11A → 12V * 1.11A = 13.32 Watts
Or just the resistance of the resistor itself?
e.g. 12V / 6.8 Ohms = 1.765A → 12V * 1.765A = 21.18 Watts
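Here are the two candidate calculations side by side, exactly as I've set them up above. Neither may be the right formula, which is exactly what I'm asking, but this is how the two figures come out:

```python
supply_voltage = 12.0      # volts
total_resistance = 10.8    # ohms, winding plus series resistor
series_resistance = 6.8    # ohms, the series resistor alone

# Candidate 1: supply voltage times the current through the whole loop.
candidate_1 = supply_voltage * (supply_voltage / total_resistance)

# Candidate 2: supply voltage times a current worked out from the
# resistor's own resistance only.
candidate_2 = supply_voltage * (supply_voltage / series_resistance)

print(round(candidate_1, 1), round(candidate_2, 1))  # ~13.3 W vs ~21.2 W
```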
If I go higher with the voltage, the difference between the two calculations gets much larger, along with the cost of the resistors.
Maybe I have done this totally wrong, but I'd really appreciate any help on the subject.
I plan on eventually building a PWM circuit for current limiting, but that's a long way off yet (if ever).