In the world of flagship smartphones, there seems to be one clear trend: bigger is better.
Manufacturers are trying to strip away anything that might stand in the way of the largest possible slab of screen. There is also growing demand for thinner phones with diminishing bezels (the area surrounding a screen).
This trend has now culminated in the latest innovation in smartphone design, the foldable screen phone. These devices sport thin OLED self-illuminating screens that can be folded in half.
The newest release is the Samsung Galaxy Z Fold 2 – a device that is almost three-quarters screen and has extravagant overtones rivaled only by a hefty A$2,999 price tag.
But to prevent the phones from growing to an unwieldy size, manufacturers are having to find ways to balance size with usability and durability. This presents some interesting engineering challenges, as well as some innovative solutions.
Internal design complexities of folding phones
Modern phones still typically use a thin LCD or plastic OLED display covered by an outer glass panel.
Folding displays are a new category that exploits the flexibility of OLED display panels. Instead of simply fixing these panels to a rigid sheet of glass, manufacturers carefully engineer the panel so that it bends – but never quite tightly enough to snap or crack.
Internal structural support is needed to make sure the panel doesn’t crease and isn’t stressed to the point of damage, discoloration, or visible surface ripples.
Since this is a mechanical, moving system, reliability issues need to be considered. For instance, how long will the hinge last? How many times can it be folded and unfolded before it malfunctions? Will dirt or dust make its way into the assembly during daily use and affect the screen?
Such devices need an added layer of reliability over traditional slab-like phones, which have no moving parts.
Large screen, thin phone: a recipe for disaster?
Each generation of smartphones becomes thinner, with smaller bezels, which improves the viewing experience but can make the phone harder to handle.
In such designs, the area of the device you can grip without touching the display screen is small. This leads to a higher chance of dropping the device – a blunder even the best of us have made.
There’s an ongoing tussle between consumers and manufacturers. Consumers want a large, viewable surface as well as an easily portable and rugged device. But from an engineering point of view, these are usually competing requirements.
You’ll often see people in smartphone ads holding the device with two hands. In real life, however, most people use their phone with one hand.
Thus, the shift towards larger, thinner phones has also given rise to a boom in demand for assistive tools attached to the back, such as pop-out grips and phone rings.
In trying to maximize screen size, smartphone developers also have to account for interruptions in the display, such as the placement of cameras, laser scanners (for face or object identification), proximity sensors, and speakers. All are placed to minimize visual intrusion.
Now you see it, now you don’t
In the engineering world, measuring the physical world requires cameras or sensors, such as a fingerprint scanner.
In the race to maximize screen real estate, these cameras and scanners are typically placed somewhere around the screen – where they take up valuable space.
This is why we’ve recently seen tricks to carve out more space for them, such as pop-up cameras and punch-hole cameras, in which the camera sits in a cutout hole, allowing the display to extend to the corners.
But another fantastic place for sensors is right in front of us: the screen. Or more specifically, under the screen.
Samsung is one company that has suggested placing selfie-cameras and fingerprint readers behind the screen. But how do you capture a photo or a face image through a layer of screen?
Up until recently, this has been put in the “too hard basket.” But that is changing: Xiaomi, Huawei, and Samsung all have patents for under-display cameras.
There is a range of ways to do this, from allowing a camera to see through the screen, to using microlenses and camera pixels distributed throughout the display itself – similar to an insect’s compound eye.
Whichever approach is used, the general engineering challenge is to implement the feature without degrading screen image quality or significantly affecting camera resolution or color accuracy.
Laptops in our pockets
With up to 3.8 billion smartphone users expected by 2021, mobile computing is a primary consumer technology area seeing significant growth and investment.
One driver for this is the professional market, where larger mobile devices allow more efficient on-the-go business transactions. The second market is individuals who only have a mobile device and no laptop or desktop computer.
It’s all about choice, but also functionality. Whatever you choose has to get the job done, support a positive user experience, and survive the rigors of the real world.
This article is republished from The Conversation by Andrew Maxwell, Senior Lecturer, University of Southern Queensland, under a Creative Commons license. Read the original article.