
Power usage

SX1276

Voltage: 3.3 V

  • Transmit at max power: 120 mA
  • Transmit at lowest power: 20 mA
  • Receive mode: 11.5 to 12.1 mA depending on band

source: RFM95 datasheet

ESP8266

Voltage: 3.3 V

  • WiFi transmit at max power: 170 mA
  • WiFi transmit at lowest power: 120 mA
  • WiFi receive for max TX power mode: 50 mA
  • WiFi receive for lowest TX power mode: 56 mA (no idea why it's higher)
  • Modem sleep (WiFi power save): 15 mA
  • Light sleep: 0.5 mA

source: Manufacturer statement on official forum

Let's assume we're always using the max power mode for transmission.

Chat usage pattern

The web app is loaded once per user, taking e.g. 3 seconds, and is then cached (3 seconds is likely a large over-estimate). After this, only a few bytes are sent over WiFi, e.g. every 10 seconds per client for keep-alive, and the chat data received over LoRa is relayed to each user.

For the ESP8266 the initial load will cost: 3 seconds * 170 mA * 3.3 V = ~1.7 joule

For the ESP8266 the incoming messages and keep-alives will cost a vanishingly small amount. Even at a tenth of the maximum 802.11b WiFi speed of 11 Mbit/s (which has a max TCP throughput of 5.9 Mbit/s) a 256 byte packet (a single DisasterRadio message) will take less than 4 ms to transmit and so will cost: 0.004 s * 170 mA * 3.3 V = ~2.25 mJ
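
As a sanity check, here is that per-packet estimate in a few lines, using the same assumed tenth of 802.11b's 5.9 Mbit/s max TCP throughput and max-power transmission:

```python
# Energy for the ESP8266 to relay one 256-byte message over WiFi,
# assuming a tenth of 802.11b's ~5.9 Mbit/s max TCP throughput
# and transmission at max power (170 mA at 3.3 V).
throughput_bps = 5.9e6 / 10
tx_time_s = 256 * 8 / throughput_bps                       # ~3.5 ms
energy_mj = tx_time_s * 0.170 * 3.3 * 1000                 # t * I * V
print(f"{tx_time_s * 1000:.1f} ms, ~{energy_mj:.1f} mJ")   # 3.5 ms, ~1.9 mJ
```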

When no data is being transmitted over WiFi the ESP8266 power usage will be: 15 mA * 3.3 V = ~50 mW

When no data is transmitted over LoRa the SX1276 power usage is up to: 12.1 mA * 3.3 V = ~40 mW

Transmitting a packet over LoRa takes a different amount of time on air (airtime) depending on the packet size and spreading factor. Let's assume we're using the maximum bandwidth supported by the SX1276, 500 kHz, with LowDataRateOptimize on for SF11 and SF12 and off for lower spreading factors. Here are some calculated airtime values and the energy used to transmit one packet (a short sketch reproducing these numbers follows the source link below):

  • SF12, 256 bytes: 2254.848 ms, ~893 mJ
  • SF12, 128 bytes: 1189.888 ms, ~471 mJ
  • SF11, 256 bytes: 1250.304 ms, ~495 mJ
  • SF10, 256 bytes: 563.712 ms, ~223 mJ
  • SF9, 256 bytes: 312.576 ms, ~124 mJ
  • SF8, 256 bytes: 174.208 ms, ~69 mJ

Calculated using this [LoRa airtime calculator](https://docs.google.com/spreadsheets/d/1voGAtQAjC1qBmaVuP1ApNKs1ekgUjavHuVQIXyYSvNc/edit#gid=0) created by a The Things Network contributor.
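
These numbers can be reproduced with the time-on-air formula from the SX1276 datasheet. The radio settings below (coding rate 4/5, CRC on, implicit header, 8-symbol preamble) are the ones that match the calculator's output, not necessarily what we'll end up using:

```python
# LoRa time-on-air per the SX1276 datasheet formula, plus the energy
# to transmit each packet at 120 mA and 3.3 V.
from math import ceil

def lora_airtime_ms(payload_bytes, sf, bw_hz=500_000, cr=1, crc=1,
                    implicit_header=1, preamble_syms=8, ldro=None):
    if ldro is None:
        ldro = 1 if sf >= 11 else 0   # LowDataRateOptimize for SF11/SF12
    t_sym_ms = (2 ** sf) / bw_hz * 1000
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * crc - 20 * implicit_header
    n_payload = 8 + max(ceil(num / (4 * (sf - 2 * ldro))) * (cr + 4), 0)
    return (preamble_syms + 4.25 + n_payload) * t_sym_ms

for sf, size in [(12, 256), (12, 128), (11, 256), (10, 256), (9, 256), (8, 256)]:
    ms = lora_airtime_ms(size, sf)
    mj = ms * 0.120 * 3.3             # mJ = ms * A * V
    print(f"SF{sf}, {size} bytes: {ms:.3f} ms, ~{mj:.0f} mJ")
```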

Another way to look at this is to assume that no node can transmit on LoRa more than 10% of the time. In a given area a node interferes with any node within its range, and if there are enough nodes within a node's range that 10% airtime causes issues, then it should probably lower its transmit power. 10% is probably a high number here, but we're trying to work with the highest realistic power usage to find the limits of the system.

Based on 10% airtime:

0.1 * 120 mA * 3.3 V = ~40 mW

So total usage when no WiFi transmissions occur and the node is relaying LoRa at the maximum allowed throughput:

(56 mA * 3.3 V) + 40 mW = ~225 mW

Using the modem sleep feature we might be able to lower this to something closer to:

(15 mA * 3.3 V) + 40 mW = ~90 mW

Let's add a new client loading the app every 10 minutes:

1.7 J / 600 s = ~2.9 mW

It is obvious that the additional WiFi power usage for keep-alives and for relaying incoming LoRa messages to clients will be too small to count.

Conclusion:

Total power usage for chat: ~228 mW
Total power usage for chat: ~93 mW (modem sleep)

That means that every 24 hours the node uses:

0.228 W * (60 * 60 * 24) s = ~19699 J = ~5.472 Wh
0.093 W * (60 * 60 * 24) s = ~8035 J = ~2.232 Wh (modem sleep)

Translated to Li-Ion battery capacity:

5.472 Wh / 3.7 V = ~1479 mAh
2.232 Wh / 3.7 V = ~604 mAh (modem sleep)
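
Pulling the whole chat estimate into one script, so the assumptions are easy to tweak. The duty cycles and currents are the stated assumptions, not measurements, and the results differ from the rounded figures above by about a percent because of rounding order:

```python
# Steady-state chat power budget from the figures above.
V = 3.3                            # supply voltage, volts

esp_awake_w = 0.056 * V            # WiFi receive, lowest-TX-power mode
esp_sleep_w = 0.015 * V            # modem sleep
lora_tx_w = 0.10 * 0.120 * V       # SX1276 TX at 10% airtime
app_load_w = 1.7 / 600             # one ~1.7 J app load per 10 minutes

for label, esp_w in (("awake", esp_awake_w), ("modem sleep", esp_sleep_w)):
    total_w = esp_w + lora_tx_w + app_load_w
    day_j = total_w * 24 * 3600
    wh = day_j / 3600
    mah = wh / 3.7 * 1000          # at a nominal 3.7 V li-ion voltage
    print(f"{label}: {total_w * 1000:.0f} mW, {day_j:.0f} J/day, "
          f"{wh:.2f} Wh, {mah:.0f} mAh")
```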

Mapping usage pattern

TODO figure out how many bytes an average vector map tile takes up and how many are loaded during normal operation.

Solar input

Taking Oakland, California as an example, the shortest day of the year gets 572 minutes of sunlight. This is in December, which also averages 14 overcast days (second only to January's 15). Performance of solar cells on overcast days is cited at between 10% and 20% depending on the source. We haven't tested our panels, so let's assume 10%. (TODO It might be even lower, since less sunlight will be caught due to the lower angle of the sun in winter, especially in an urban environment). This gives us an optimistic minimum energy collection per 24 hours of:

572 minutes * 10% * 60 seconds/minute * 3 W = 10296 J

Let's say our system for storing and retrieving the power is 80% efficient on average (TODO no idea what the real number is. 80% is probably overly optimistic). Then we actually only get:

10296 J * 0.8 = 8236 J

This is just barely more than the 8035 J that is our most optimistic estimate.
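
In code, the margin works out to only a couple hundred joules (572 minutes, the 10% overcast factor, and the 80% storage efficiency are the assumptions stated above):

```python
# Worst-case December day: 572 minutes of overcast sun on a 3 W panel
# at 10% output, through an ~80%-efficient storage path, compared to
# the modem-sleep chat budget.
harvest_j = 572 * 60 * 0.10 * 3      # 10296 J collected
stored_j = harvest_j * 0.8           # ~8237 J usable
consumed_j = 0.093 * 24 * 3600       # ~8035 J/day (modem sleep)
print(f"headroom: {stored_j - consumed_j:.0f} J")   # ~200 J
```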

Solar li-ion charging

Some solutions we've thought about:

The Adafruit charger is non-MPPT, but the design notes give some decent reasoning for that. The CN3791 does not seem to be available in bulk from western sources. Is it still in production? Where would we buy it in bulk? Maybe check with the company that makes it.

Using a fully open design that uses no specialized ICs would be pretty cool since it gives us more control, but it might take more effort, or constantly running another microcontroller might take too much power, or it might be more expensive. Or these might all be non-issues: We haven't looked deeply enough to really know.

TODO research and test solutions

Overheating issue

Li-ion batteries cannot be safely charged above 45 °C. This will likely prevent charging during the middle of the day in the summer.

Let's say we get 12 hours of useful sun in the summer and we can't use the middle 6 hours. That's still:

6 hours * 3600 seconds/hour * 3 W = 64800 J

So overheating probably won't be an issue as long as we turn off charging when it gets too hot. However, this could become a real issue in some climates where ambient temperatures are already in the mid to high 30s in the shade on some days.

There are a few options to consider for hot climates:

  • Cooling using heat sink + fan
  • Using a flat LiPo battery (larger surface area, easier to mount a heat sink)
  • Using a gel/AGM lead-acid battery (TODO can these be slow-charged above 45 °C?)
  • Using a lithium-titanate battery, which can be charged at up to 55 °C
  • Using a super capacitor as intermediate storage

TODO look at Lithium-titanate battery price per capacity.

A quick search on DigiKey showed that you can get a 1 F 5.5 V super capacitor for $2.75. This is fairly large for a capacitor, but it only equates to about 15 J, so it doesn't seem very useful.
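
That 15 J figure follows from the standard capacitor energy formula:

```python
# Energy stored in a capacitor: E = 1/2 * C * V^2
C_farads, V_volts = 1.0, 5.5
print(0.5 * C_farads * V_volts ** 2)   # ~15.1 J (usable energy is even
                                       # less, since voltage sags as it
                                       # discharges)
```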

Voltages

  • The SX1276 can handle 1.8 to 3.6 V
  • The ESP8266 has a max of 3.6 V and no official minimum. One forum post said 2.3 V min, another 1.8 V.

The voltage from the solar panel will likely be 6 V, since those seem to be the cheapest ~3 W panels available in a package that suits our needs. We will need some sort of solar li-ion battery charging system, possibly MPPT.

The voltage from the battery will be between 3 V (maybe 3.3 V) and 4.2 V.

Stepping down voltage from battery

While there are many solar li-ion charging solutions out there, we also need to figure out how to efficiently step the voltage down from up to 4.2 V at the battery to at most 3.6 V.

An ultra-low-dropout voltage regulator like the ST LDK130 could work. These can operate with a minimum dropout of 0.1 V, but it will probably be a bit higher at high load, so let's say 0.2 V.

A voltage regulator simply burns off the voltage difference as heat, which is obviously inefficient. If we set the output to 3.4 V (the 3.6 V target minus the 0.2 V dropout, so the regulator keeps regulating down to a 3.6 V battery voltage), then at 4.2 V it burns off:

(4.2 - 3.4) / 4.2 = ~0.19

Or 19% of the power drawn from the battery.

We can switch the regulator out of the circuit at battery voltages of 3.6 V and below. A li-ion discharge curve will show that 3.6 V and below only occurs when the battery is discharged to around 20% remaining capacity, so for 80% of the battery's capacity it will run through the regulator.

If we estimate an average voltage of 3.8 V during the discharge between 4.2 V and 3.6 V (voltage drops rapidly from 4.2 V), then the average loss is closer to 12% during the first 80% of the discharge, so a total loss of around 9.5%. If we assume we're never charging the battery to more than 90% capacity (to increase battery lifetime), then the average voltage might be closer to 3.7 V, changing the total loss to around 7%.

Of course, in some situations the battery will stay close to fully charged, which might mean an average voltage closer to 4.0 V and a total loss closer to 14%. In other situations the voltage might never climb above 3.6 V in a 24-hour period, and the loss would then be zero.
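
A small sketch of the loss model used in the last few paragraphs (output fixed at 3.4 V, regulator bypassed at 3.6 V and below). These are point losses at a given battery voltage; the averages quoted above also depend on how the discharge curve is weighted:

```python
# Fraction of battery power an LDO burns off: (Vin - Vout) / Vin.
# Vout = 3.4 V: the 3.6 V target minus the ~0.2 V dropout margin,
# so the regulator still regulates at a 3.6 V battery voltage.
V_OUT = 3.4

def ldo_loss(v_batt):
    # Below 3.6 V the regulator is switched out of the circuit.
    return (v_batt - V_OUT) / v_batt if v_batt > V_OUT + 0.2 else 0.0

for v in (4.2, 4.0, 3.8, 3.7, 3.6):
    print(f"{v:.1f} V: {ldo_loss(v):.1%} lost")
# 4.2 V: 19.0%, 4.0 V: 15.0%, 3.8 V: 10.5%, 3.7 V: 8.1%, 3.6 V: 0% (bypassed)
```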

Another option is to use a buck converter. Small high-efficiency 3.3 V buck converters exist, such as the LM3671, which has an efficiency of 90-95%. This option might be better in some scenarios but might also be more expensive given the need for a few support components. A pin-compatible breakout is available from Adafruit.

TODO calculate price difference and efficiency difference between various solutions