Re: Android |OT| Beep-Boop
By brainchild Go To PostColor shifting when tilting the device and burn-in (like all phones with OLED screens). Considering how a phone is normally held, the first is a non-issue and the second has always been the downside to using OLED in any phone or TV.Isn't there an issue with bad color reproduction? Like the settings are bad? Also, just a single back camera is pretty disappointing considering most flagships are on dual cameras already. That bokeh/DOF/DSLR-like effect on the Note 8 is pretty amazing; the software solution from Google can't really compete.
By Deadbeat Go To PostWhat's a good small phone (around the size of a Samsung S5 Mini)? I was looking at an Xperia Compact. Even that isn't available in Canada, so I'll have to order it.Not many options in the 4.5-inch range. If you go up to 5 inches, there are plenty of good phones out there.
By Kill Your Masters Go To Post(snip)
No. The screen calibration isn't as vibrant as some people would like, but that's just a matter of opinion, and more importantly, something that Google could change in an update if there was enough demand for it.
As for the camera, it is considered the overall best smartphone camera by nearly every reputable source that has reviewed it, and that's even ignoring the DXO analysis. Bokeh in smartphone cameras is a joke compared to DSLR cameras, and it's all fake. Every last bit of it is done through software.
Two cameras used in concert do not create bokeh DOF. They're used to create a depth map for the subject in focus (by way of stereoscopy) and distinguish it from the background. Google does the same thing with the Pixel, but instead of creating the depth map from dual lenses, it creates the depth map from dual-pixel sensors. Admittedly, the dual-pixel sensors have a far smaller separation between their two views, but the concept is exactly the same. Where the technologies differ is that with dual cameras, it's much easier to calculate the depth map for the focal point because of the physical distance between the two cameras, so you don't really need a complex algorithm to sort it out. With dual-pixel sensors, the difference in perspective between the left and right sensors is so small that you have to do some mathematical extrapolation, in combination with neural networks, to figure out the depth of a scene that is several orders of magnitude larger in scale than that tiny separation.
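To make the stereo version concrete, here's a minimal sketch of turning a left/right image pair into a rough depth map with OpenCV's block matcher. The file names, baseline, and focal length are made-up placeholders, and real pipelines do far more cleanup than this:

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, per pixel, how far features shifted between the views
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

baseline_m = 0.012  # assumed ~12 mm spacing between the two cameras
focal_px = 700.0    # assumed focal length expressed in pixels

# Classic stereo relation: depth = baseline * focal length / disparity
depth_m = baseline_m * focal_px / np.maximum(disparity, 0.1)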
After the depth map is created, the bokeh is applied through software. This is not how cameras like DSLRs create bokeh DOF, because they don't need to; bokeh happens naturally on those cameras due to how the physics of light work in relation to the focal length and f-stop of the lens.
Light rays coming from the camera's subject converge right at the image sensor. Rays coming from farther away than the subject converge in front of the sensor and strike it as a spread-out disc, and that blur grows the greater the distance. This is pretty much how our eyes work as well. The difference with digital cameras is that their apertures are not always perfectly circular, so out-of-focus light sources will sometimes take on angular shapes, depending on the shape of the aperture.
Frankly, it is physically impossible for smartphone cameras to create natural DOF of that kind. Their apertures, focal lengths, and image sensors are too small by inherent design.
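To put rough numbers on that, here's the thin-lens blur-circle formula worked through for an assumed full-frame DSLR (50 mm f/1.8) and an assumed Pixel-class phone camera (~4.4 mm f/1.8), both shooting a subject at 2 m with a background at 10 m:

def blur_circle_mm(f_mm, f_stop, subject_mm, background_mm):
    """Thin-lens blur-circle diameter on the sensor, in mm."""
    aperture_mm = f_mm / f_stop
    return (aperture_mm * f_mm * abs(background_mm - subject_mm)
            / (background_mm * (subject_mm - f_mm)))

dslr = blur_circle_mm(50.0, 1.8, 2000.0, 10000.0)   # ~0.57 mm
phone = blur_circle_mm(4.4, 1.8, 2000.0, 10000.0)   # ~0.004 mm

# Compare blur relative to each sensor's height (24 mm vs ~4.3 mm assumed)
print(dslr / 24.0)   # ~2.4% of the frame height: obvious background blur
print(phone / 4.3)   # ~0.1% of the frame height: essentially all in focus

That roughly 24x gap in frame-relative blur is the physics that software bokeh has to fake.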
So now that we've established that smartphone cameras create DOF through software, we can get to the real heart of the issue with DOF on the Pixel. As I alluded to before, Google has to rely on its algorithmic experience to teach the software when and where to apply the DOF, and that's a lot harder to do with just one camera, even if it does have dual-pixel sensors. However, this is completely solvable over time through machine learning, and it's already doing a good job with human subjects. It just needs more data, which will come in time.
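For illustration only, a toy version of that final "apply the blur" step could look like the sketch below. The names are made up and this is nothing like Google's actual renderer (which varies the blur disc per pixel rather than blending in one blurred copy), but it shows the depth-map-driven idea:

import cv2
import numpy as np

def fake_bokeh(image, depth_m, subject_depth_m, kernel=31):
    """Blend a blurred copy back in wherever a pixel sits far from the subject."""
    blurred = cv2.GaussianBlur(image, (kernel, kernel), 0)
    # Weight is 0 at the subject's depth, rising toward 1 for distant background
    weight = np.clip(np.abs(depth_m - subject_depth_m) / depth_m.max(), 0.0, 1.0)
    weight = weight[..., None]  # broadcast across the color channels
    out = image.astype(np.float32) * (1 - weight) + blurred.astype(np.float32) * weight
    return out.astype(np.uint8)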
By brainchild Go To Post(snip)This is one of the most informative posts on the subject I've ever read. Perfectly explained as well.
By Gattsu25 Go To PostThis is one of the most informative posts on the subject I've ever read. Perfectly explained as well.
Thanks.
Light physics (and science in general) is one of the things I'm really passionate about, so I always try to encourage literacy on the subject whenever I can. Glad you were able to appreciate it.
When people understand the science behind the tech, they can parse through PR buzzwords and truly evaluate the pros and cons of the products on the market.
By brainchild Go To Post(snip)Thanks for the write-up. I was aware of most of these things, especially that due to physical limitations, smartphone cameras can't be like DSLRs and can only emulate them with a lot of software processing. I was just thinking a dual-lens setup makes these things better/easier. The HDR, and generally what Google does with its photo-processing algorithms, is stunning (I remember an MKBHD video where Google's software removed a person standing in front of a fence and managed to pretty much reproduce the background, wow).
It's all about bang for buck, I guess. I just don't see the Pixel being that much better than, let's say, an S8 or Note 8, and we know those phones will be dirt cheap compared to a Pixel in six months or so.
Also, I think they once again cheaped out by reusing hardware parts from HTC. I called it that the Pixel would be squeezable as soon as I read that HTC would be making the hardware. It just feels as if they cheaped out on the hardware side by reusing stuff from other manufacturers. I also admit I'm somewhat mad they abandoned the Nexus line of affordable phones, so I'm biased in that regard.
Regarding the screen, it seems they will recalibrate it in one of the next updates.
http://www.androidpolice.com/2017/10/26/google-will-update-pixel-2-2-xl-saturated-color-mode-says-xl-burn-not-issue/
By Kill Your Masters Go To Post(snip)
Well, the Pixel also contains a pretty beastly SoC called the Pixel Visual Core, and it's just sitting dormant in every Pixel 2 right now. A chip capable of over 3 trillion [specialized] operations per second (an image processor with 512 ALUs per core, plus a dedicated ARM core and memory), designed by Google from scratch, can hardly be considered cheap when you factor in the cost of two SoCs in one phone. I'm pretty sure this is the first phone to do this, btw. But just because it isn't being used right now doesn't mean it isn't adding cost to the device.
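As a sanity check on that throughput figure, here's the back-of-envelope, taking the widely reported 8 IPU cores and ~426 MHz clock as assumptions and counting a multiply-accumulate as two operations:

cores = 8            # IPU cores reported for the Pixel Visual Core
alus_per_core = 512  # ALUs per core, per Google's description
ops_per_cycle = 2    # one multiply-accumulate counted as two operations
clock_hz = 426e6     # reported clock speed (assumed here)

tops = cores * alus_per_core * ops_per_cycle * clock_hz / 1e12
print(f"{tops:.2f} trillion ops/sec")  # ~3.49, matching the "3+" figure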
I understand that it doesn't necessarily LOOK like a high-end phone, but it definitely has the receipts to prove it.
And finally, I agree with you about Google's departure from mid-range phones like the Nexus line (I used to own one). If you ask me, all high-end phones cost way too much fucking money (they're all priced way above their bill of materials), but considering that people are perfectly happy to pay those prices, I don't see the point in complaining about it anymore. Perception of value is a hell of a drug.
Oh, and glad Google's providing more calibration options for those who want it.