A confidence interval is an interval estimate of an unknown parameter that also conveys, through its width, the precision (or uncertainty) of the estimation. The wider a confidence interval, the lower the estimation precision.
A confidence interval carries a certain level of confidence. For example, a 95% confidence interval has a 95% chance of containing the true parameter value. Therefore, the higher the confidence level, the more assured we are about where the true parameter value lies. However, while a 100% confidence interval may sound appealing, it is not practically useful (Why?). The most common choice of confidence level is 95%, but 90% or 99% may also be used in certain applications.
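The trade-off between confidence level and interval width can be seen in a small computation. The sketch below, using a hypothetical sample and a z-based (normal-approximation) interval for the mean, shows that raising the confidence level from 95% to 99% widens the interval; the sample values and function name are illustrative, not from the text.

```python
from statistics import NormalDist, mean, stdev
from math import sqrt

# Hypothetical sample data, for illustration only
sample = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0, 4.9, 5.1]

def confidence_interval(data, confidence=0.95):
    """Approximate z-based confidence interval for the mean,
    assuming the sampling distribution is roughly normal."""
    n = len(data)
    m = mean(data)
    se = stdev(data) / sqrt(n)                      # standard error of the mean
    z = NormalDist().inv_cdf((1 + confidence) / 2)  # critical value, e.g. ~1.96 for 95%
    return m - z * se, m + z * se

lo95, hi95 = confidence_interval(sample, 0.95)
lo99, hi99 = confidence_interval(sample, 0.99)
# A higher confidence level gives a wider interval (lower precision)
print(hi99 - lo99 > hi95 - lo95)
```

For small samples one would normally use a t critical value instead of z; the normal approximation is used here only to keep the sketch self-contained.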