A constant voltage of 10.00 V has been observed over a certain time interval across a 2.50 H inductor. The current through the inductor, measured as 2.00 A at the beginning of the time interval, was observed to increase at a constant rate to a value of 9.00 A at the end of the time interval. How long was this time interval?
The time interval was 7/4 s, i.e. 1.75 s. I got this the following way. The voltage across an inductor is

V = L * di/dt

where V is the voltage, L is the inductance, and di/dt is the rate of change of the current. Solving for the rate of change:

di/dt = V / L = 10.00 / 2.50 = 4 A/s

Taking the antiderivative gives

I(t) = 4t + C

where C is a constant (the initial value of the current in this problem), so

I(t) = 4t + 2

Now, at what time was the current 9 A? Solving 9 = 4t + 2 gives t = 7/4 s. And at what time was the current 2 A? Solving 2 = 4t + 2 gives t = 0. The difference between these times, 7/4 s, is how long it took the inductor's current to ramp up from 2 A to 9 A.
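The arithmetic above can be checked with a few lines of code. This is just a sketch of the same calculation, using the relation V = L * di/dt rearranged to Δt = L * ΔI / V; all variable names are my own.

```python
L = 2.50   # inductance in henries
V = 10.00  # constant voltage in volts
I0 = 2.00  # initial current in amperes
I1 = 9.00  # final current in amperes

di_dt = V / L            # rate of change of current: 4.0 A/s
dt = (I1 - I0) / di_dt   # elapsed time for the current to ramp from I0 to I1

print(f"di/dt = {di_dt} A/s, time interval = {dt} s")
```

This prints a rate of 4.0 A/s and a time interval of 1.75 s, matching the 7/4 s found above.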