Hello! Noob but 79 yo and trying to learn new stuff.
I need to determine how to size a power supply for a USB charging station that I want to make. I plan to start with a 13-port USB hub, but I will not use it for data, only for charging USB devices, including smartphones, iPads, and other USB devices that must be charged.
From what I've read on the Internet, smart USB devices (phones, iPads, etc.) will draw 10 Watts and 'regular' USB devices will draw 2.5 Watts, all at 0.5VDC. Further, the information I found said that smart devices can require up to 2100 mA (milliamperes) each and regular USB devices want 500 mA. If that information is NOT correct, please tell me where I should look for more accurate data.
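A quick check of those figures using P = V × I, sketched here in Python (with the caveat that standard USB runs at 5.0 V DC, not 0.5 V, so 5.0 V is assumed below):

```python
# Sanity check of the quoted figures using P = V * I.
# Standard USB is 5.0 V DC (the 0.5 V in the post looks like a typo);
# at 5.0 V the quoted currents line up with the quoted wattages.
USB_VOLTS = 5.0   # assumed nominal USB voltage

smart_ma = 2100    # mA quoted for smart devices
regular_ma = 500   # mA quoted for regular devices

smart_watts = USB_VOLTS * smart_ma / 1000.0      # 10.5 W, close to the 10 W quoted
regular_watts = USB_VOLTS * regular_ma / 1000.0  # 2.5 W, matches the 2.5 W quoted
print(smart_watts, regular_watts)
```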
The power supply that came with my USB hub takes 120-240 VAC at 50-60 Hertz and outputs 0.5VDC at 2500 mA. I would like to supply enough power to the modified hub (I plan to short the data terminals on each port) to be able to charge up to 6 smart devices and 7 regular devices at once, safely.
If I remember my lessons on household electricity circuits, one sets the maximum amperage draw in order to choose a circuit breaker (or, in my early days, a fuse). That limit is determined by adding up all the power (wattage) that will be consumed at one time on a circuit that carries 110-120VAC at 60 Hz. IF I REMEMBER THAT INCORRECTLY, please advise. Also, if the analogy of household current to DC current changes things, please educate me.
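That household rule of thumb can be sketched in Python with illustrative numbers (these example loads are not from the post):

```python
# Household breaker sizing rule: amps = total watts / line voltage.
# Example loads below are purely illustrative.
line_volts = 120.0
total_watts = 1500.0 + 300.0   # e.g. a space heater plus a few lamps

amps = total_watts / line_volts
print(amps)  # 15.0 -- would fully load a 15 A breaker
```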
Again, if I add up the power requirements as wattage using the information in the paragraph above, my charger will have to handle a total of 77.5 watts. I used to add about 50% for household current: should I do the same for this DC charger? Please advise and explain, as I am close to being off my reservation. Therein lies my dilemma: I don't see power supplies telling me anything about wattage, only amperage.
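That 77.5-watt total, plus the 50% margin mentioned, works out like this (a Python sketch using the figures from the post):

```python
# Total wattage for 6 smart devices at 10 W and 7 regular devices at 2.5 W,
# then the ~50% headroom the post mentions for household circuits.
smart_count, smart_watts = 6, 10.0
regular_count, regular_watts = 7, 2.5

total_watts = smart_count * smart_watts + regular_count * regular_watts
with_margin = total_watts * 1.5
print(total_watts, with_margin)  # 77.5 116.25
```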
How should I go about calculating the spec I should shop for in a replacement power supply? From what I know - or suspect - the input remains 120-240 VAC at 50-60 Hz, and the output must remain at 0.5VDC because that's what USB devices expect. The question, I think, becomes how many mA the power supply must provide.
Here is my thinking; correct me if I'm wrong. A charger is supposed to put power into storage in the devices being charged, and it does this over time. I think - again, correct me if I'm wrong - USB devices are built with circuits that limit the total power they will store. That leads me to believe that the power flowing from the power supply to the charger will be shared by all the ports, with each device being charged signaling the charger when it is fully charged so that it draws no further power.

Does this mean that I need to add up all the mA? I think NOT, but I know next to nothing about electronics. My guess is that the amperage spec for a charger in this application is very unlike the amperage for a circuit breaker in a household circuit. Again, using the data provided above, the cumulative amperage for all ports simultaneously would be 16,100 mA (6 × 2100 plus 7 × 500), or just over 16 amps. Should I be looking for a power supply that provides approximately 17 or more amps (17,000 mA) at 0.5VDC?
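Working out that worst-case sum from the per-device figures quoted earlier (a quick Python sketch):

```python
# Worst case: all 13 ports drawing their maximum at once,
# using the per-device currents quoted earlier in the post.
smart_ma = 6 * 2100    # 12600 mA for the smart devices
regular_ma = 7 * 500   # 3500 mA for the regular devices

total_ma = smart_ma + regular_ma
print(total_ma, total_ma / 1000.0)  # 16100 16.1 -- i.e. 16.1 A total
```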
Please help this old man understand what he's dealing with here.
TIA
rh
The best power supply for your PC build is the one that provides the right amount of wattage to all components simultaneously. Manually calculating this requires that you multiply each component's amps by its volts and then sum the results. The total is the wattage that your PC build requires.
Sourav Gupta
That's a lot of information. Well, I believe that is not 0.5VDC but 5.0VDC. However, as far as I understand, the main issue is with charging a device (phones, iPads, etc.). Those devices draw current up to their rating, about 2A on average. Modern smartphones still use 5.0VDC, but they also support higher voltages such as 9V, 11V, etc. For an easier way to understand it, though, 5.0VDC at 2A is supported by almost all phones. Now, who decides how much current flows? It is decided by both the device and the charger. If a 1A charger is connected to a 2A load, the charger will go into over-current mode. On the other side of the coin, if you connect a device to a charger that could provide, let's say, 8A, the device will heat up and be destroyed only if it actually pulls that 8A. Thus, you should provide a slightly larger amount of current headroom when designing a charger; it is best to build a 2.2A output charger for a 2A charging device.
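The headroom rule above can be sketched as a tiny Python helper (the 10% margin mirrors the 2.2 A-for-2 A suggestion; the function name is mine, not from the thread):

```python
# Sketch of the ~10% current headroom rule suggested above:
# rate the charger output a bit above the device's maximum draw.
def charger_rating_ma(device_ma, margin=0.10):
    """Return a suggested charger current rating in mA (hypothetical helper)."""
    return device_ma * (1.0 + margin)

print(charger_rating_ma(2000))  # about 2200 -- a 2.2 A charger for a 2 A device
```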
Now, there are different ways to design a charger. It depends on the ports, the total current, and many other things. A good approach is to make one power supply that provides the total accumulated wattage, with an individual DC-DC converter providing the required charge current at each port.
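To put rough numbers on that architecture, here is a Python sketch of sizing the bulk supply feeding per-port DC-DC converters; the 85% converter efficiency is an assumed ballpark figure, not something stated in the thread:

```python
# One bulk supply feeding a per-port DC-DC converter on each port.
# The 85% converter efficiency is an ASSUMED ballpark, not from the thread.
EFFICIENCY = 0.85

port_watts = [10.0] * 6 + [2.5] * 7    # 6 smart + 7 regular ports
output_watts = sum(port_watts)          # 77.5 W delivered to the devices
input_watts = output_watts / EFFICIENCY # what the bulk supply must provide
print(round(input_watts, 1))  # 91.2
```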
What is your product spec on the input side and the output side? No need for a lot of detail, just the input and output specs.
Joined February 12, 2018 · 696 · Monday at 02:11 PM