I am a little concerned having read reports and seen video of total loss of power events caused by the BMS. These have been described by others as “hard cuts.” What appears to occur in these cases is that the voltage of a cell (or cells) sags when power is requested, reaching the low voltage limit, and the BMS abruptly cuts all current. When cells are well balanced, this will occur at a low state of charge. When cells are out of balance, like in the case of a single failing cell with high internal resistance, this could occur at a high state of charge under high load.
I see this exact sort of situation in my 10-year-old electric car. The pack appears well balanced at full charge, but is widely out of balance when empty and under load.
Individual cell voltages must be reported to the throttle controller. The throttle controller should account for cell voltages and limit current to keep cell voltages above a configurable threshold that is set with a safety factor above the BMS low voltage limit. When the throttle controller is operating in this limited current mode, some feedback should be given to the pilot (perhaps a vibration) to let them know that full power is not available and that they should land soon. My car does this by displaying a turtle shaped warning light on the dash.
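To make the idea concrete, here is a minimal firmware-style sketch of such a limited-current mode. Everything here is an assumption for illustration: the 0..1000 throttle scale, the function name, and the thresholds (which assume a BMS hard-cut at 2800 mV); real values would have to match the pack's actual BMS configuration.

```cpp
#include <cstdint>

// Hypothetical thresholds in millivolts, chosen with a safety margin
// above an assumed BMS hard-cut limit of 2800 mV.
constexpr uint16_t kDerateStartMv = 3100;  // begin limiting here
constexpr uint16_t kDerateEndMv   = 2900;  // zero power at/below this

// Scale the pilot's throttle command (assumed 0..1000 scale) by the
// lowest reported cell voltage. Sets *limited so the handle can give
// feedback (e.g. vibration) whenever it is clamping.
uint16_t limitThrottle(uint16_t requested, uint16_t lowestCellMv, bool* limited) {
    if (lowestCellMv >= kDerateStartMv) {
        *limited = false;
        return requested;
    }
    *limited = true;
    if (lowestCellMv <= kDerateEndMv) {
        return 0;
    }
    // Linear ramp between the two thresholds (200 mV span here).
    uint32_t span = kDerateStartMv - kDerateEndMv;
    uint32_t frac = lowestCellMv - kDerateEndMv;
    return static_cast<uint16_t>(requested * frac / span);
}
```

The linear ramp means power fades gradually as the weakest cell sags, rather than vanishing all at once the way a BMS hard cut does.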
This is an issue that should be addressed in the design as soon as possible. Otherwise, I see potential for someone to get hurt by one of these hard-cut events at low altitude. As people's packs age, this will present itself at higher and higher states of charge.
I’m not a programmer, so can’t make the changes myself, but happy to consult further with anyone who can modify the code.
For the time being, the best way to protect yourself is to keep a good eye on the health of your battery. The best way to do this is to check cell voltages after flying. They should remain well balanced, even through a rapid discharge. If they appear out of balance at low states of charge, this is the first indication that the lowest-voltage cells are starting to fail. An alternative, though less practical, method is to look at cell voltages under load. The voltage of the weakest cells will sag the most.
I would like to show you an ESC (pictured) that I have been using since 2012. You have the option (in my case up to 15S, since higher voltages are not allowed in most of Europe) to connect the cell taps to the ESC (K4/K5). You can then configure the software so that, for example, as soon as a cell group drops below 3 volts, the power is throttled or a warning is issued. The designers of this ESC have been building in such features for decades; however, these are often not known to DIY builders or small startup companies.
Picture from the MGM manual, HBC series
I agree, the power should be limited if any of the cell voltages trip the low threshold. This is safer than the BMS just cutting all power (engine out). One way to do this is shown in the diagram I made below: the BMS does not pass the battery power; it just handles charging cutoff, balancing, and monitoring the cell voltages/temperatures, then relays this information to the hand-throttle MCU, for example. If a cell voltage drops below the threshold, the throttle controller limits the power command to the ESC.
This is very similar to what EGO in the power tool industry does with their batteries/tools: Understanding and Using EGO Power+ Batteries - Endless Sphere
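A minimal sketch of that split, assuming a hypothetical telemetry record the BMS side could relay over the data line to the hand-throttle MCU each cycle. The struct fields, alarm bit layout, threshold, and 0..1000 command scale are all assumptions for illustration, not an existing OpenPPG or Daly format.

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical telemetry the BMS side relays to the throttle MCU.
struct BmsTelemetry {
    uint16_t lowestCellMv;   // lowest individual cell voltage
    uint16_t highestCellMv;  // highest cell, useful for balance checks
    int8_t   packTempC;      // pack temperature
    uint8_t  alarmFlags;     // bit 0: low-cell alarm raised by the BMS
};

// Clamp the power command sent to the ESC whenever the relayed data
// trips the threshold, instead of letting the BMS hard-cut later.
uint16_t commandEsc(uint16_t pilotRequest, const BmsTelemetry& t, uint16_t thresholdMv) {
    if (t.lowestCellMv < thresholdMv || (t.alarmFlags & 0x01)) {
        return std::min<uint16_t>(pilotRequest, 300);  // cap near 30% power
    }
    return pilotRequest;
}
```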
I am not sure what model of BMS the SP140 uses, however it is pretty common for the BMS to have a data output (wired and Bluetooth). Example: https://www.aliexpress.us/item/2255801015713325.html?spm=a2g0o.order_list.0.0.42b61802va7U9k&gatewayAdapt=glo2usa4itemAdapt&_randl_shipto=US
Just as an explanation, for everyone who already has a lot of experience with batteries and battery construction:
If you want maximum performance from battery cells (e.g. 18650 or 21700), the BMS should not be mounted inside the battery case in the usual way. The BMS generates heat, which reaches the neighboring cells and creates a temperature gradient in the battery housing. This changes the internal resistance of those cells when discharging at high power. Many battery designs therefore have the problem that cell voltages drift, because the temperature difference produces differing internal resistances and output power.
One possible solution is to manufacture the housing from stainless steel or titanium sheet and to design the BMS so that its cooling surface reaches the outside through the largest side surface. This has proven itself in practice for a long time.
From here on, solutions for DIY and custom builds only!
Another solution is to house the BMS in an external enclosure and connect it to the cell block with a secured plug connection. But this is only possible for DIY builders; commercial series production is basically ruled out, because cell blocks without a BMS may not be sold to end customers.
In ultralight aviation, this method is often used in self-built aircraft.
Another option is to do without a BMS entirely: balance completely and leave the management to the balancing charging equipment.
The ESC can then take over this function, monitoring the individual cell groups in flight and, if necessary, throttling or switching off if a cell group falls below the target value, e.g. 3 V under load.
Although this basically never happens, because having no BMS in the housing avoids the cell drift caused by heat influences.
(If cells ever drift significantly, you will see it automatically during the next balancing charge.)
The cells deliver energy and stay practically identical in voltage, provided the housing is not partially air-permeable or designed incorrectly.
Hi, I have the same intention and started to look into doing this, but have been busy since I got my machine.
The BMS is a “Daly” one, it has quite a comprehensive manual and there is already serial data being sent out for the Bluetooth dongle.
This could be picked up with an extra line to the handle, it would need a second UART RX pin to accept the data, parse it and check the cell voltages. There is also an “alarm” bit in the BMS data so the actual threshold could still be adjusted using that interface.
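For the parsing step, here is a sketch of reading a Daly UART frame, based on the frame layout in the "Part 4" RS485/UART PDF attached below: 13 bytes total, with a cumulative-sum checksum and three big-endian cell voltages per 0x95 response frame. Verify this against your BMS firmware version before relying on it.

```cpp
#include <cstddef>
#include <cstdint>

// Validate a 13-byte Daly UART frame: 0xA5 start byte, address,
// command ID, length (always 0x08), 8 data bytes, then a checksum
// that is the low byte of the sum of the first 12 bytes.
bool dalyFrameValid(const uint8_t* f, size_t len) {
    if (len != 13 || f[0] != 0xA5 || f[3] != 0x08) return false;
    uint8_t sum = 0;
    for (size_t i = 0; i < 12; ++i) sum += f[i];
    return sum == f[12];
}

// Command 0x95 responses carry three cell voltages per frame,
// big-endian in mV, with the frame number in the first data byte
// (f[4]). Extract cell voltage `idx` (0..2) from one frame.
uint16_t dalyCellMv(const uint8_t* f, int idx) {
    return static_cast<uint16_t>(f[5 + 2 * idx]) << 8 | f[6 + 2 * idx];
}
```

The handle firmware would call `dalyFrameValid` on each received frame, then feed the extracted voltages into whatever low-cell check it runs.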
Anyway, the handle should then warn the pilot early enough (it should be able to do this based on Ah discharged), then buzz the handle, and finally start to reduce motor power as the lowest cell hits the limit.
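The staged escalation could look like the sketch below: an early warning from Ah discharged, a buzz, then power reduction. The stage names and all thresholds (80% of rated capacity, 3200 mV, 3000 mV) are placeholder assumptions, not values from the SP140 firmware.

```cpp
#include <cstdint>

enum class AlertStage { Normal, EarlyWarning, Buzz, Derate };

// Pick the most severe applicable stage each control cycle.
// Thresholds here are illustrative placeholders only.
AlertStage alertStage(float ahDischarged, float ahRated, uint16_t lowestCellMv) {
    if (lowestCellMv < 3000) return AlertStage::Derate;       // reduce power
    if (lowestCellMv < 3200) return AlertStage::Buzz;         // buzz handle
    if (ahDischarged > 0.8f * ahRated) return AlertStage::EarlyWarning;
    return AlertStage::Normal;
}
```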
Once that is done I wanted to reconnect the ESC directly to the pack (with a fuse), instead of through the BMS FETs.
What I need to find next is the schematic for the current handle version - I didn’t see it in the GitHub.
Can you provide a link to the Daly BMS publications?
Part 4_ Daly RS485+UART Protocol (1).pdf (269.2 KB)
Part 3_ Daly CAN Protocol.pdf (329.4 KB)
Part 2 - Daily Smart BMS PC App - DL-R32S模块用户手册 (1).pdf (1022.6 KB)
Part 1 - Daily Smart BMS Manual (1).pdf (2.2 MB)
Can anyone provide the handle schematic? I think it is using the RP2040 chip - https://datasheets.raspberrypi.com/rp2040/rp2040-datasheet.pdf
The RP2040 does have two UARTs, but they share GPIO pins with other functions, so I need to find out the existing mapping and whether the extra RX can be brought out. I guess a soft UART is also possible, though.
By the way, there are a few existing threads with background discussion (SP140 - Battery low power cut off mechanism & thresold - #12 by evan), and many others where people experienced a “hard cut”, but let's try to keep solving it in this one.
Thanks for sharing the data on the BMS, Evan. The RP2040 is indeed used (RP2-B2 marking).
The ESC uses UART and the display looks to connect over SPI. There are two UART instances in total on the RP2040, so one should remain if enough suitable pins are left. If these are not routed out, that could be a problem, as the RP2040 IC is rather small (hard to blue-wire to). A full schematic would still be helpful.
Another possibility is to use a separate microcontroller that collects the BMS data and acts as a bridge between the hand throttle and the ESC.
Also I found some documentation on the DALY BMS UART protocol: