Apple, unlike Google, or Facebook, or even Microsoft, is not a services company (as long-suffering iCloud/MobileMe/.Mac/iTools customers can attest), and so, to ascribe any sort of goodness to their decision to not retain user data is much less useful than an examination of what actually matters to their bottom line. And, as a hardware company, that means the supply chain. And that means people like Bibek Dhong.
Ben Thompson makes a very good point that it is not enough to judge the morality of a company like Apple based solely on its privacy practices, which are primarily a byproduct of its business model rather than a moral choice. It is much more appropriate to use metrics that reflect its actual business, such as evaluating the supply chain.
Wikipedia is a fantastic resource, but some of the ways that it is implemented leave a lot to be desired.
The main source of those problems is not mysterious. The loose collective running the site today, estimated to be 90 percent male, operates a crushing bureaucracy with an often abrasive atmosphere that deters newcomers who might increase participation in Wikipedia and broaden its coverage.
The National Security Agency has secretly circumvented or cracked much of the digital scrambling that protects global commerce, e-mails, phone calls, medical records and Web searches.
“Many dear friends wish I would shut the fuck up regarding the Gotham filter. It’s not happening because obsessing about the details not only continues to educate me, it also provides me the opportunity to form a well-constructed opinion. In a world where we mindlessly repeat the loudest and most compelling tweets as fact, a well-constructed opinion is rare. It’s rare because a well-constructed opinion can defend itself. Through a combination of experience, facts, and, occasionally, passion, a well-constructed opinion is a refreshing signal among a sea of unstructured, unattributed noise.”
– Rands in RIP Gotham
This is my favorite part of the proposal because it’s complete bullshit.
Dr. Drang takes a critical look, from an engineer’s perspective, at some portions of Musk’s Hyperloop proposal. The dismissal of thermal expansion of the tubes is especially hilarious.
“Simplicity is not the absence of clutter, that’s a consequence of simplicity. Simplicity is somehow essentially describing the purpose and place of an object and product. The absence of clutter is just a clutter-free product. That’s not simple.”
– Jony Ive in Jonathan Ive interview: simplicity isn’t simple
“I look forward to mornings because I love the way light hits the surface. Do I have issues? Probably.”
– Andrew Kim in Nokia Lumia 920 extended review
Smart critique of products like Glass and Home.
The same point, I think, applies to Google Glass. If you spend all day in the Googleplex, thinking googly thoughts about data ingestion and Now and the interest graph, then having ‘Google’ hovering in front of your eyes instead of rubbing on a phone seems like a really obvious progression. If everyone you know owns a Tesla and is deeply engrossed in new technology, then the idea that there might be social problems with Glass doesn’t come up - everyone’s too busy saying ‘AWESOME!’. In much the same way, no-one on the Facebook Home team seems to have realised that most people’s news feed isn’t full of perfectly composed photos of attractive friends on the beach.
Craig Hockenberry describes how he and Twitterrific helped create the word “tweet”, which is now included in the Oxford English Dictionary.
The most “intuitive” thing we know how to do–move our own bodies–reduced to an awkward, device-mediated pantomime: this is “getting technology out of the way”?
One of my biggest qualms about Google Glass is that it is prematurely hyped as "the future of computing" by much of the media coverage. As part of a tactile species that carries out the majority of its actions with hands and touch, I really don't see how a computer attached to the head makes computer interactions more logical and intuitive. A head-mounted display of information can often be useful, but calling it the future of computing is a bit of a stretch.