
Downgraded phone (again) for battery life

UCFBS

It seems I do this every few years, especially the more I travel. I get tired of carrying batteries.

Instead of going with a Snapdragon 845 or similar, I just 'downgraded' from my Snapdragon 835 (Kryo 280 CPU + Adreno 540 GPU) smartphone to a Snapdragon 632 (Kryo 250 CPU + Adreno 506 GPU), and I'm massively impressed. I'll be damned if I can tell the difference in performance. Benchmarks and games are the only places I can see it, and even there, the former 'flagships' usually have high-resolution displays that kill framerate (let alone battery) even further, while the latter chips are usually paired with lower-resolution displays and work 'good enough.'

At the core, both the Kryo 280 and 250 use quad Cortex-A73-based power clusters (the big in big.LITTLE), the former at 2.4-2.45GHz with 2MiB of L2, the latter at only 1.8GHz with 1MiB. But they are still A73s, and much better than an A57, let alone the A53 that usually makes up the efficiency cluster (the LITTLE in big.LITTLE), often around 1.8GHz. This is a far cry from the earlier Snapdragon 630 and other A57- or A53-based 'big' designs, which deliver only about 60% of the performance at the same clock.
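For a rough back-of-envelope of why the 632 holds up, here's the perf ~ IPC x clock math. The IPC ratios are my own ballpark assumptions (A53 as the 1.0 baseline), not Qualcomm's or ARM's numbers:

```python
# Rough single-thread performance: perf ~ IPC x clock.
# IPC ratios are ballpark assumptions (A53 = 1.0 baseline), not vendor figures.
cores = {
    "Kryo 280 (A73-based, SD835)": {"ipc": 1.6, "ghz": 2.45},
    "Kryo 250 (A73-based, SD632)": {"ipc": 1.6, "ghz": 1.80},
    "Cortex-A57 'big'":            {"ipc": 1.3, "ghz": 1.90},
    "Cortex-A53 'LITTLE'":         {"ipc": 1.0, "ghz": 1.80},
}

baseline = cores["Cortex-A53 'LITTLE'"]
base_score = baseline["ipc"] * baseline["ghz"]

for name, c in cores.items():
    score = c["ipc"] * c["ghz"]
    print(f"{name:30s} ~{score / base_score:.2f}x an A53 @ 1.8GHz")
```

Even at only 1.8GHz, the Kryo 250 comes out around 1.6x an A53 at the same clock, which lines up with the ~60% figure above.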

The phone? The Motorola Moto G7 Power. Man, I'm absolutely impressed. It's everything I've ever wanted in a phone, especially since I travel 100% of the time.

I thought going from a few years (since 2016) at 1920-2880 wide by 1080-1320 deep resolutions down to 1520x720 (19:9 -- figure really 1440x720 at 18:9, taking the 'notch' out of it) would be noticeable. Only in Kindle or other reading apps can I really tell a difference. Most of the videos I watch are 720p30 or 720p60 anyway, and if not, I'm on my Fire HD 10.

SIDE NOTE: I've also run those higher-resolution phones downscaled to 720 deep for power efficiency. But a downscaled panel doesn't look as good as an actual, native 720 display. So one might as well run at the higher resolution ... or, as I have, go with a native 720 display.
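For scale, here's the raw pixel math. The 2560x1440 and 2880x1440 figures are just typical flagship panels of that era, as examples:

```python
# Pixels per frame: how much less work the GPU does at 720 deep.
panels = {
    "flagship 2560x1440": (2560, 1440),
    "flagship 2880x1440": (2880, 1440),
    "G7 Power 1520x720":  (1520, 720),
}

base_w, base_h = panels["G7 Power 1520x720"]
base_px = base_w * base_h

for name, (w, h) in panels.items():
    px = w * h
    print(f"{name:20s} {px/1e6:.2f} MPix/frame ({px/base_px:.2f}x the G7 Power)")
```

A 1440-deep flagship pushes roughly 3.4-3.8x the pixels per frame, which is exactly why the 'adequate' Adreno 506 gets away with it at 720p.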

And then there's the battery ... a massive 5,000mAh. That's larger than the 3,900mAh I had back in my Huawei Ascend Mate 2 circa 2014 (which had a 'reversible charging' cable, and I even 'charged up' some 'dead' iPhone users' phones to 30-40% on a plane with it). I stopped charging it at 11pm last night, and I'm at 83% right now, over 18 hours later. And I've used it heavily for about 6 hours total: 2 for video, 2 for mail/surfing, etc., 2 for music. And my company's IronMobile has been hammering it for 6 hours too ... and it likes to suck battery.

My prior phone would have been dead after 18 hours of that usage; even just sitting off the charger without the 6 hours of continuous use, it'd be under 40% by now, if not close to 30%.
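For the curious, the math on that 83%-after-18-hours figure (my old phone's ~3,000mAh capacity is a guess, just for comparison):

```python
# Average current draw implied by the numbers above.
capacity_mah = 5000          # G7 Power pack
used_fraction = 1.00 - 0.83  # 83% left after 18 hours
hours = 18

avg_draw_ma = capacity_mah * used_fraction / hours
runtime_h = capacity_mah / avg_draw_ma

print(f"Average draw:      {avg_draw_ma:.0f} mA")
print(f"Projected runtime: {runtime_h:.0f} hours at this usage")

# Prior phone, assuming (my guess) a ~3,000mAh pack dead in the same 18 hours:
old_draw_ma = 3000 / 18
print(f"Old phone's draw:  {old_draw_ma:.0f} mA to die in 18 hours")
```

That's roughly 47mA average versus about 167mA on the old phone -- a 100+ hour projected runtime at this usage level.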

Between the low-clocked 1.8GHz quad-A73 power cores in the Kryo 250, the low resolution, and the 'adequate enough' for 720p Adreno 506, I'm really impressed. It works. It does the job. I've used cheap Snapdragon 400- and 600-series units before -- usually not my own, but my wife's, a friend's, or a backup -- but the 632 really gives the G7 Power spunk with that Kryo 250 and its actual, real A73-based cores. So it's not sluggish at all. Only if I looked really closely at the screen would I be able to tell I'm not pushing as many pixels, and that the clock and GPU aren't as beefy as a result.

And it does all the new 700MHz and even 600MHz bands, like T-Mobile Band 71 -- as long as you get the US-warrantied version (not the international version). And yes, there is an unlocked but US-retail version; you don't need to get it from your carrier, and you don't need to go international (which is often even cheaper). It does almost all the North American LTE bands outside the US, and a number of European bands. All for $249 retail -- or $59-149 if you go for a carrier-locked version from that carrier -- if not multiple free units (like an entire family) for those who switch.

Sure, the core+radio is going to be limited, more like 300/150Mbps down/up, whereas some of the latest ASICs are pushing 1Gbps -- although many others, including recent flagships, are limited to similar speeds under 500Mbps too. In any case, even 4G LTE Advanced Pro (what AT&T markets as '5G Evolution') is not capable of pushing a single device remotely close to that speed -- well under 300Mbps in practice.

Anyone else in my boat, tired of carrying an external battery? If so, and you want newer band support, especially on T-Mobile -- though it supports all the US CDMA and GSM carriers -- consider the G7 Power.

About the only 2 negatives are ...
  • The auto-dimming is a bit annoying, but it seems to be 'learning' my brightness level after 18 hours
  • The camera setup is behind the times, but Moto has other devices in the G7 line with better options (and the Nokia 7.1, along with Huawei's youth-oriented Honor brand, if you're not against Huawei, looks good too) ... the Power is about, well, battery life
One note ...
  • The US retail version has only 3GiB RAM (and 32GiB storage -- with microSD up to 1TB, not a problem), not 4GiB RAM (and 64GiB storage). That's fine for Pie, but in another 2 Android versions (about the time 5G hits), it will be a bit of a limit
 
I downgraded about two years ago to a Moto G5 plus. It's been a great phone--great battery life, so-so camera.

How's the camera quality on the G7?
 
I downgraded about two years ago to a Moto G5 plus. It's been a great phone--great battery life,
The Moto G7 Power takes it to a whole new level beyond the Moto G7 (US/International) or G7 Plus (International-only) -- ignoring the G7 Play (which everyone should skip). I cannot emphasize this enough ...

After 18 hours, being at 83% with various usage (including nearly 3GiB of data), I then downloaded nearly 20GiB in a building with moderate signal, and over those 4 hours of 'heavy' usage I only dropped 20%, from 83% to 63%. This included lots of video, and nearly all 4 hours were on-screen.
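Putting numbers to that download (straight from the figures above):

```python
# What that 20GiB download cost in battery, and the implied throughput.
capacity_mah = 5000
drop_fraction = 0.83 - 0.63    # 83% -> 63%
gib_downloaded = 20
hours = 4

mah_used = capacity_mah * drop_fraction
print(f"Battery used:  {mah_used:.0f} mAh ({mah_used / gib_downloaded:.0f} mAh per GiB)")

# GiB -> Gibit (x8) -> Mibit (x1024), spread over the 4 hours:
avg_mbps = gib_downloaded * 8 * 1024 / (hours * 3600)
print(f"Average rate:  ~{avg_mbps:.0f} Mbps sustained over {hours} hours")
```

Roughly 50mAh per GiB, at an ~11Mbps sustained average -- modest rates, but with the screen on the whole time, that's the impressive part.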

That ... and ... one of the reasons I refused to get a Moto G-series before, like most mid-ranges ...

Is that their ARM designs usually used a Cortex-A5x-series core, especially the in-order, non-speculative A53. There's little reason for an octa-core with a separate LITTLE cluster in big.LITTLE when the big is another A5x-series part, especially if it's not even an out-of-order, speculative, superscalar A57. But the new G7 units use an A73-based design, which is a massive bump -- nearly 50% -- over even the A57, let alone the A53.

No, it's not a 2.4-2.5GHz A73, and there's not a lot of L2. But at 1.8GHz, the quad-A73 kicks butt. I cannot tell I'm using a mid-range phone. The Adreno 506 GPU is about the only tell, but because it's only pushing 720p, it works well enough.

so-so camera. How's the camera quality on the G7?
I've been carrying dual-camera units for some time, but I really haven't been bothered by the quality. It's great for portraits, and the front camera works well for selfies. But low-light is supposed to suck. The LED flash is subpar, and I noticed that in flashlight mode.

I was carrying an Essential PH-1 before, which had a killer camera but the worst camera software. I tried the OnePlus 6 last year and returned it, as I had lots of issues on T-Mobile.

And I refuse to go Google, as they mark everything up 33%. I also had to factory reset almost every Nexus device I had after every major update. I know the Pixels should be better, but when the $299-399 (WiFi-only v. WiFi+LTE) nVidia Shield Tablet 2 became the $629 Pixel C in all but name (no redesign), it just irked me.

nVidia has since realized that breaking into retail is difficult and that they have to partner. Sad, because their $299-399 'gaming' tablet in 2014 was the best tablet for any use that year, period, and had better specs and features than Google's more costly Nexus 9.
 
Yeah, but does it fold?
And break -- that's also important. Over 90% of devices are designed for consumers who don't know anything about good design and are more than willing to pay again when things break.

This 'super thin' and other nonsense is part of the problem. Same for notebook designs. I'd rather have 'less deep' so I can use it on an airplane than 'thin.'

And 'super thin' is why devices overheat and have loud fans. That's gotta end.

One project I'm mid-prototype on (it's running/working with mock-ups of other components) is a 2-in-1 notebook that is ...
- Under 150mm (sub-6") deep -- no issue for use on any airplane
- Ultra-wide, allowing a 21:9 (2.33:1) display -- detachable, with rotate ('page view')
- A full 104-key keyboard and dual optical trackballs (optical trackballs work great) on both the left and right (fully ambidextrous)
And most of all ...
- Thick, probably 35-40mm (~1.5"), so it can have 20-30mm-thick fans running at low RPM, moving 3-5x as much air for 1/3-1/5th the noise (see the sketch below)
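The fan math behind that last point is just the standard fan affinity laws -- airflow scales with swept area x RPM, while sound power falls off roughly with the 5th power of RPM (so the dB change is ~50*log10 of the speed ratio). The fan sizes here are illustrative, not my final design:

```python
import math

# Fan affinity-law sketch: a bigger fan moves the same air at lower RPM,
# and noise falls off steeply with RPM. Dimensions below are illustrative.
def rpm_for_same_airflow(rpm_small, area_small_mm2, area_big_mm2):
    # Airflow ~ swept area x RPM, so a bigger fan needs proportionally less RPM.
    return rpm_small * area_small_mm2 / area_big_mm2

def noise_delta_db(rpm_new, rpm_old):
    # Sound power ~ RPM^5  ->  delta dB ~ 50 * log10(speed ratio).
    return 50 * math.log10(rpm_new / rpm_old)

small = math.pi * (40 / 2) ** 2   # e.g., a 40mm blower in a thin notebook
big   = math.pi * (80 / 2) ** 2   # an 80mm fan that 35-40mm of height allows

rpm_small = 6000
rpm_big = rpm_for_same_airflow(rpm_small, small, big)
print(f"Big fan RPM for same airflow: {rpm_big:.0f}")
print(f"Noise change: {noise_delta_db(rpm_big, rpm_small):+.0f} dB")
```

Same airflow at a quarter of the RPM works out to roughly -30dB on paper, which is why thick, slow fans are the quiet option.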

In the prototype, I'm actually using 9:4 (2.25:1), via 3x 6" 4:3 LCDs rotated to portrait, broken out from a single DP port into 3x eDP. For those that don't know, DisplayPort (thank you AMD; curse you Intel for taking forever on DP) is designed to drive embedded panels directly, unlike earlier standards (don't get me started).
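The arithmetic, if it's not obvious: three 4:3 panels rotated to portrait (3:4) and tiled side-by-side give 9:4. Assuming garden-variety 1024x768 panels (the exact resolution doesn't change the ratio):

```python
# Three 4:3 panels, rotated to portrait (3:4), tiled horizontally.
panel_w, panel_h = 768, 1024   # a 1024x768 panel rotated 90 degrees
count = 3

total_w = panel_w * count      # 2304
total_h = panel_h              # 1024
print(f"Combined: {total_w}x{total_h} -> {total_w / total_h:.2f}:1 (9:4 = 2.25:1)")
```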

The prototype is using an off-the-shelf SFF Nano-ITX (120x120mm) board, but I'd probably have something closer to a 3.5"-type SBC (101x152mm / 4"x6") custom made. I haven't kept up on PCB commonality to know what size is best. AMD came up with DTX, including Mini-DTX (203x170mm / 8"x6.7"), years ago precisely so they could maximize panel area and get six (6) boards out of one panel (406x510mm / 16"x20", IIRC?) in lithography.
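The panelization math works out cleanly (panel size per my IIRC above, ignoring kerf and edge rails):

```python
# Boards per lithography panel: how Mini-DTX tiles a 406x510mm panel.
def boards_per_panel(panel_w, panel_h, board_w, board_h):
    # Simple grid tiling, no kerf or edge-rail allowance.
    return (panel_w // board_w) * (panel_h // board_h)

print(boards_per_panel(406, 510, 203, 170))   # 2 across x 3 down = 6
```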

Oh, one bonus of the design and added height is that it allows for a 'standardized' ATX/DTX/ITX-compatible 'backplate' of ports at 158.75mm (6.25") wide, although only up to 25mm (1") high instead of the 44.45mm (1.75") of the ATX/DTX/ITX backplate. At least it would make it easier to swap out boards that could be universally compatible. The thickness also allows standardizing around Mobile PCI Express Module (MXM) GPU cards as an option -- with better cooling too.

The one, big idea I have is a set of modular bays on both ends for ...
- AHCI/SATA or SAS legacy drives
- NVMe/U.2 drives -- U.2 is the modular-bay equivalent of M.2, carrying up to 4 PCIe lanes, and is positionally/pin-out compatible (but not electrically compatible) with SATA
- Li-Ion battery packs charged at +7.2V (in lieu of the usual +19.2V brick scheme), using the same SATA/U.2 power rails (read on)

The idea is to have 3x 2.5"x7mm bays on each end, where one can insert either 3x 5-7mm batteries/drives or 1x 12-15mm battery/drive + 1x 5-7mm battery/drive, for up to six (6) modular bays total. I'd either have to have a proprietary +20.3V stand-off connector or ...

Design the bays' pin-out so the +5V and +12V rails could handle the current of running off a single battery. I'm not sure that's possible at that pin gauge, and I haven't investigated it.
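Here's the quick feasibility check I'd start from, using the commonly cited ~1.5A-per-pin rating for SATA power contacts (3 pins per rail) -- treat these as assumptions to verify against the actual connector spec:

```python
# Can the bay's +12V rail carry a notebook's draw? SATA power contacts are
# commonly rated ~1.5A per pin, with 3 pins per voltage rail (assumption to
# verify against the connector datasheet).
pins_per_rail = 3
amps_per_pin = 1.5
rail_v = 12

max_amps = pins_per_rail * amps_per_pin    # 4.5A
max_watts = max_amps * rail_v              # 54W
print(f"+12V rail ceiling: {max_amps:.1f}A -> {max_watts:.0f}W per bay")
# A ~45W notebook load fits under one bay's ceiling but leaves little margin;
# drawing from two battery bays in parallel would halve per-connector current.
```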

Most notebooks use +19-20V DC because it allows +12V discharge (powering the notebook) alongside +7.2V charge (charging the battery). My idea is to use the bay's +12V rail for discharge (powering the notebook), while the +5V rail in SATA -- which is often not used at all by SATA or U.2 devices (just the +3.3V and +12V) -- gets over-volted to +7.2V for charging. It would only carry +5V until the battery tells the notebook over +5V TTL signaling, "Hey, I'm a Li-Ion battery, send me +7.2V."

The bonus is that the +3.3V rail is still there, and can be used for standby power or other CMOS logic too.
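A minimal sketch of that handshake logic in Python-flavored pseudocode -- the device ID string and the rail behavior are my own design notes, not any existing SATA/U.2 standard:

```python
# Bay-controller sketch: keep the '+5V' rail at a safe default at insertion,
# and only over-volt it to +7.2V after the device identifies itself over the
# TTL line. Voltages and the ID string are my design notes, not a real spec.

class BayController:
    SAFE_V = 5.0       # default on the SATA/U.2 '+5V' pins at insertion
    CHARGE_V = 7.2     # Li-Ion charge voltage, only after the handshake
    DISCHARGE_V = 12.0 # the +12V rail powers the notebook (discharge path)

    def on_insert(self, device_id: str):
        self.rail_5v = self.SAFE_V           # never exceed +5V before handshake
        if device_id == "LI_ION_PACK":       # "Hey, I'm a Li-Ion battery"
            self.rail_5v = self.CHARGE_V     # over-volt the +5V rail to charge
            self.discharging_on_12v = True
        else:                                # ordinary SATA/U.2 drive
            self.discharging_on_12v = False
        # The +3.3V rail stays untouched for standby/CMOS logic either way.

bay = BayController()
bay.on_insert("LI_ION_PACK")
print(f"+5V rail now at {bay.rail_5v}V, discharging on +12V: {bay.discharging_on_12v}")
```

The key design point: an ordinary drive plugged into the same bay never sees anything above the stock +5V, so the battery bays stay backward-compatible.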

Everyone likes the trackballs on the side corners in my hack-up, instead of that damn front-center trackpad. It's cheap to have 2 small trackballs and buttons for both left- and right-handed users, and it's a fall-back in case one breaks. I'm surprised no one has done similarly to date. I was thinking I could do a 'modular' design where one could even put a mini-joystick on both sides, but that would mean the panel would have to sit 'inside' of it (reducing panel width -- or increasing size), or it couldn't close.

But I haven't had the time to do much custom development on the hardware side of things.
 