For a first-order reaction, the rate constant (k) is 0.02 s^-1. Calculate the time it takes for the concentration of the reactant to decrease to 10% of its initial value.
The integrated rate law for a first-order reaction is:
ln([A]t/[A]0) = -kt
Where [A]t is the concentration of reactant at time t, [A]0 is the initial concentration of reactant, k is the rate constant, and t is the time.
We want to find the time it takes for the concentration of the reactant ([A]t) to decrease to 10% of its initial value ([A]0). In other words, we want to find the value of t when [A]t/[A]0 = 0.1.
Substituting these values into the integrated rate law gives:
ln(0.1) = -0.02t
Solving for t:
t = ln(0.1)/(-0.02 s^-1)
Evaluating this, t ≈ 115.13 seconds (since ln(0.1) ≈ -2.3026).
Therefore, it takes approximately 115 seconds for the concentration of the reactant to decrease to 10% of its initial value.
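As a quick numerical check, here is a minimal Python sketch of the same calculation (the variable names k and fraction are illustrative, not part of any library):

```python
import math

# First-order kinetics: ln([A]t/[A]0) = -k * t
# Solve for t when the remaining fraction [A]t/[A]0 is 0.1 (10%).
k = 0.02          # rate constant, in s^-1
fraction = 0.10   # remaining fraction of reactant

t = math.log(fraction) / (-k)
print(f"t = {t:.2f} s")  # prints t = 115.13 s
```

Note that ln(2)/k = 0.693/0.02 ≈ 34.66 s is the half-life of this reaction, not the time to reach 10%; reaching 10% takes roughly 3.3 half-lives.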