What Does the Voltage Rating on a Capacitor Mean?
The voltage rating is the maximum voltage that a capacitor is designed to be exposed to and can safely store. A good rule when choosing a capacitor is not to pick a voltage rating that exactly matches the voltage the power supply will apply to it; always leave some headroom as a safety margin in case the supply voltage ever rises for any reason. Meaning, if you want a capacitor to hold 25 volts, don't choose a capacitor rated at exactly 25 volts. For example, if you measure a fresh 9V battery, you will notice it reads above 9 volts when it's new and at full charge. A capacitor rated at exactly 9 volts would then be exposed to more than its maximum specified voltage. In a case such as this, a small overshoot usually isn't a problem, but leaving a margin is nevertheless good engineering practice.

You can't really go wrong choosing a capacitor rated for a higher voltage than the supply will apply, but you can definitely go wrong choosing one rated lower. If you charge a capacitor above its voltage rating, you risk the capacitor exploding and becoming defective and unusable. So never expose a capacitor to a voltage higher than its voltage rating.

A common rule of thumb is to choose a capacitor whose voltage rating is double the supply voltage that will charge it. So if a capacitor is going to be exposed to 25 volts, to be on the safe side, it's best to use a 50 volt-rated capacitor.
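To make the rule of thumb concrete, here is a minimal Python sketch that picks the smallest standard rating at or above double the supply voltage. The list of standard ratings reflects values commonly offered for electrolytic capacitors, and the function name and 2x default margin are illustrative assumptions, not a fixed standard; check the series your supplier actually stocks.

```python
# Common electrolytic capacitor voltage ratings, in volts
# (illustrative list; real product lines vary).
STANDARD_RATINGS_V = [6.3, 10, 16, 25, 35, 50, 63, 100, 160, 200, 250, 400, 450]

def recommended_rating(supply_voltage: float, margin: float = 2.0) -> float:
    """Return the smallest standard rating >= supply_voltage * margin.

    The 2.0 default follows the 'double the supply voltage' rule of
    thumb described above.
    """
    target = supply_voltage * margin
    for rating in STANDARD_RATINGS_V:
        if rating >= target:
            return rating
    raise ValueError(f"No standard rating covers {target} V; use a higher-voltage series.")

# A 25 V supply: target is 50 V, so a 50 volt-rated capacitor.
print(recommended_rating(25))    # 50
# A fresh 9 V battery can read about 9.5 V: target is 19 V, so a 25 V rating.
print(recommended_rating(9.5))   # 25
```

Note that the margin is applied to the highest voltage the capacitor will actually see (such as the 9.5 volts of a fresh battery), not just the nominal supply value.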