Battery Monitor Design for Weather Station

General project help for Adafruit customers

Moderators: adafruit_support_bill, adafruit

Please be positive and constructive with your questions and comments.
Locked
nealn
 
Posts: 10
Joined: Sat Oct 08, 2011 7:53 pm

Battery Monitor Design for Weather Station

Post by nealn »

Hello,

And, ugh, yes I know another battery monitor post. And, yes, I am new to the world of microcontrollers and have dabbled quite a bit in electronics before, but it has been years.

I am designing a weather station and, having sorted out all of the basic criteria with respect to sensors, commands, and telemetry, I am left with power considerations. I have gone back and forth between 7.4 V LiPo packs and 6 V gel cell lead-acid batteries. My uC will be running at 5 V. It's an ATmega328 @ 16 MHz, so basically your standard Arduino, without the form factor. So I suppose I could run on a single-cell LiPo @ 3.7 V nominal, but I may be overclocking the chip some (unless I use a variable voltage regulator to bring the uC into spec... hmm, I will think about this later). Anyway, I digress.

At this point, I am sticking with 6V gel cell.

I am considering the following to provide periodic voltage telemetry for the main power battery:

1. I will use the internal Vref of 1.1 V
2. I will be using a voltage divider composed of 113 kohm as "R1" and 20 kohm as "R2" (this should get me a scaling factor of about 0.15, making the full scale about 7.3 V on the Vin side)
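A quick sanity check of that arithmetic (plain C++, not Arduino code — just the divider numbers from the post, assuming the ATmega328's 10-bit converter and internal 1.1 V reference):

```cpp
#include <cassert>
#include <cmath>

const double R1 = 113e3;   // top resistor, battery side (ohms)
const double R2 = 20e3;    // bottom resistor, ADC side (ohms)
const double VREF = 1.1;   // internal bandgap reference (volts)
const int ADC_MAX = 1023;  // 10-bit ADC full scale

// Divider ratio: Vadc = Vbat * R2 / (R1 + R2)
double ratio() { return R2 / (R1 + R2); }

// Largest battery voltage that still fits inside the reference
double full_scale() { return VREF / ratio(); }

// Convert a raw ADC count back to battery volts
double counts_to_volts(int counts) {
    return (counts * VREF / ADC_MAX) / ratio();
}
```

With these values the ratio comes out to 20/133 ≈ 0.1504, so the full scale is 1.1 / 0.1504 ≈ 7.3 V, matching the figure in the post.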

Now the questions.

I remember reading/hearing that resistor values above 10 kohm in a voltage divider can be detrimental to the performance of your circuit. This makes sense to me if I were actually trying to power a load, but in reality I just need to measure the potential, right? So I wonder how the ATmega will handle such a small current. Can anyone confirm the notion that I should confine my resistor selection to a much lower value? Or is that all bunk?

Secondly, I do not necessarily want to measure the battery voltage continuously. So I was thinking I could place a cheap 2N3904 transistor in line with the battery's V+ line on the way to the voltage divider: drive the base from a digital output pin via a 1 k or 10 k resistor and tie the emitter to the "top" of the divider. In this configuration I could switch on the battery measurement circuit once an hour, or whatever, and not pay any power penalty for continuously driving the divider. I know there is some current flow from base to emitter when the transistor is on. I cannot visualize its impact on the battery voltage reading, but if it is there, I suppose I could calibrate it out after gathering some empirical data. Anyone see any issues with this configuration?
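For scale, a rough duty-cycle calculation (assumed numbers: the ~45 uA a 133 k divider draws at 6 V, and one 100 ms measurement per hour) shows what the switched divider would draw on average:

```cpp
#include <cassert>

const double I_DIVIDER = 6.0 / 133e3;  // continuous divider current, ~45 uA
const double T_ON = 0.1;               // seconds the divider is powered per reading
const double T_PERIOD = 3600.0;        // one reading per hour

// Average current when the divider is only switched on for T_ON out of T_PERIOD
double average_current() { return I_DIVIDER * (T_ON / T_PERIOD); }
```

The average works out to roughly a nanoamp. Even the unswitched 45 uA is tiny against a gel cell's capacity, which is why switching the divider may not be worth the added parts.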

Thanks in advance,
Neal

lyndon
 
Posts: 281
Joined: Tue Mar 29, 2011 5:28 pm

Re: Battery Monitor Design for Weather Station

Post by lyndon »

I think you're fine; forget about the transistor. At such a low power consumption, you're probably below the internal self-discharge of the battery. If you're using a high resistance divider, then the input impedance of the A/D circuitry can affect the output of the divider, but that's the only problem I can see. Even that is probably OK, since you just need fairly gross battery measurements to tell if it's getting low.

I'd spend more time on the battery monitor circuit itself. Instead of just measuring the voltage, maybe use that transistor to switch in a 100 mA or so load once a day. By measuring the battery voltage loaded and unloaded, you should be able to see the loss of capacity much sooner. E.g., apply the load for 1 second, remove it, wait one second, then read the voltage. As the battery loses capacity, that reading will drop much faster than the unloaded voltage alone would.
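The loaded/unloaded comparison amounts to estimating the battery's internal resistance, which rises as a lead-acid cell ages. A sketch of that arithmetic (the voltages and load current here are illustrative, not from the thread):

```cpp
#include <cassert>
#include <cmath>

// Internal resistance from the voltage sag under a known load:
// R_int = (V_unloaded - V_loaded) / I_load
double internal_resistance(double v_unloaded, double v_loaded, double i_load) {
    return (v_unloaded - v_loaded) / i_load;
}
```

For example, a battery reading 6.3 V open-circuit that sags to 6.1 V under a 100 mA load has roughly (6.3 − 6.1) / 0.1 ≈ 2 ohms of internal resistance; watching that number climb over months is a more sensitive health indicator than the open-circuit voltage.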

nealn
 
Posts: 10
Joined: Sat Oct 08, 2011 7:53 pm

Re: Battery Monitor Design for Weather Station

Post by nealn »

Thanks for the response. I went back and did a little bit of math and some measurements; it looks like I will be losing about 60 uA to ground with this divider. I'll live with that.

Now, when we talk about input impedance for the A/D circuit, how could this impact the readings from a high-resistance divider? I am pretty rusty here. Is this something that would be expected to stay constant? (i.e., could I just characterize an offset?)

Thanks,
Neal

lyndon
 
Posts: 281
Joined: Tue Mar 29, 2011 5:28 pm

Re: Battery Monitor Design for Weather Station

Post by lyndon »

The rule of thumb I was taught is that the input should have at least 10x the impedance of the output it's connected to. Say you have a 2:1 divider composed of two 1M resistors. Its output impedance is 500k, so you'd want an A/D with at least 5 Mohm input Z. The input Z of the ADC appears as a resistance in parallel with the bottom leg of your divider.
If you were to use an ADC with an input Z of 1M, the bottom leg would become 1M || 1M = 500k, so the divider ratio would be 0.5M/1.5M = 0.333 instead of 0.5 - a 33% error.

Here's a quick diagram:
[diagram image missing from the original post]

But if you change to an ADC with a 5 Mohm input, the bottom leg becomes 1M || 5M ≈ 833k and the ratio is about 0.455 instead of 0.5 - only about a 9% error. The error keeps dropping as the ADC input impedance gets higher.
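That loading effect is easy to check numerically. A small sketch of the same example (1M/1M divider, the ADC's input Z across the bottom resistor):

```cpp
#include <cassert>
#include <cmath>

// Equivalent resistance of two resistors in parallel
double parallel(double a, double b) { return a * b / (a + b); }

// Divider ratio with the ADC's input impedance loading the bottom leg
double loaded_ratio(double r1, double r2, double z_in) {
    double bottom = parallel(r2, z_in);
    return bottom / (r1 + bottom);
}
```

loaded_ratio(1e6, 1e6, 1e6) gives 1/3 (a 33% error against the nominal 0.5), while loaded_ratio(1e6, 1e6, 5e6) gives 5/11 ≈ 0.455 (about 9% error).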

Hope that helps
Last edited by lyndon on Tue Oct 11, 2011 7:03 pm, edited 1 time in total.

nealn
 
Posts: 10
Joined: Sat Oct 08, 2011 7:53 pm

Re: Battery Monitor Design for Weather Station

Post by nealn »

Thank you very much for that. So, to restate it, the input impedance of the ADC is actually from the input to ground; until now I had always imagined it as a series resistance at the input. Is that specific to ADCs? That being said, if it is possible to characterize the input impedance of the ADC, then it should be possible to adjust the value of R2 such that:

1/(desired R2) = 1/(Z_input) + 1/(R2), then just solve for R2.

Right?
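That compensation formula can be rearranged and checked directly. A sketch (the 1M desired value and 5M input Z are just example numbers, not from the thread):

```cpp
#include <cassert>
#include <cmath>

// Pick the physical R2 so that R2 || Z_in equals the desired divider value.
// From 1/R2_desired = 1/Z_in + 1/R2_physical:
double compensated_r2(double r2_desired, double z_in) {
    // Only meaningful while z_in > r2_desired; otherwise no physical R2 works
    return 1.0 / (1.0 / r2_desired - 1.0 / z_in);
}
```

For a desired 1 Mohm bottom leg and a 5 Mohm ADC input Z, the physical resistor would be 1.25 Mohm, since 1.25M || 5M = 1M. The catch, as noted later in the thread, is that a microcontroller ADC's input behavior is dynamic (a sampling capacitor, not a fixed resistor), so this only works if Z_in really is constant.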

lyndon
 
Posts: 281
Joined: Tue Mar 29, 2011 5:28 pm

Re: Battery Monitor Design for Weather Station

Post by lyndon »

Input impedance is in parallel with the input terminals; a series resistance may be only part of it. In a simple ADC like the one on a microcontroller there may be no input buffer, so you also have to be concerned with the dynamic behavior, i.e., how the input behaves while it samples your output signal.

Honestly, if the impedance of the A/D causes a problem, I'd add an opamp buffer rather than attempt to characterize it somehow.

nealn
 
Posts: 10
Joined: Sat Oct 08, 2011 7:53 pm

Re: Battery Monitor Design for Weather Station

Post by nealn »

I see what you are saying. I seem to remember that now from a previous career: we had unity-gain op amps in front of the ADCs on a board, and I remember wondering why.

For my case, I'm not going to bother with the buffer; based on your description of the error, it sounds like it won't vary much from reading to reading, so it may be something I can calibrate out easily. But it does not really matter much, since I will just be using this as a "recharge me" indicator.

Thank you very much for the information and time,
Neal
