Good news: Apple and Google are partnering on system-level APIs for privacy-friendly contact tracing via Bluetooth (which, as we were discussing the other day, seems the most sensible approach):
Apple and Google will work to enable a broader Bluetooth-based contact tracing platform by building this functionality into the underlying platforms. This is a more robust solution than an API and would allow more individuals to participate, if they choose to opt in, as well as enable interaction with a broader ecosystem of apps and government health authorities. Privacy, transparency, and consent are of utmost importance in this effort, and we look forward to building this functionality in consultation with interested stakeholders.
The Contact Tracing Bluetooth Specification does not require the user’s location; any use of location is completely optional to the schema. In any case, the user must provide their explicit consent in order for their location to be optionally used.
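The privacy properties come from broadcasting only short-lived identifiers derived from a secret key, rather than anything tied to the user. Here's a minimal sketch of that rotating-identifier idea; the function names and derivation details are simplified illustrations, not the actual Apple/Google specification (which uses a standardized key-derivation scheme):

```python
import hashlib
import hmac
import os

# Sketch of the rotating-identifier idea behind Bluetooth contact
# tracing (simplified; NOT the exact Apple/Google spec). A device keeps
# one secret key and broadcasts only short-lived, unlinkable
# identifiers derived from it.

def daily_key(tracing_key: bytes, day_number: int) -> bytes:
    """Derive a per-day key from the device's secret tracing key."""
    return hmac.new(tracing_key, b"day" + day_number.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

def rolling_identifier(day_key: bytes, interval: int) -> bytes:
    """Derive the identifier broadcast during one ~10-minute interval."""
    return hmac.new(day_key, b"rpi" + interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

# A phone broadcasts rolling_identifier(...) over BLE. If its owner
# later tests positive, only the daily keys are published, letting
# other phones re-derive the identifiers and compare them against the
# ones they overheard -- no location, no identity.
tk = os.urandom(32)
dk = daily_key(tk, day_number=18355)
beacons = [rolling_identifier(dk, i) for i in range(144)]  # 144 intervals/day
assert len(set(beacons)) == 144  # identifiers are distinct across intervals
```

The key point: an observer who records the beacons cannot link them to each other or to a person without the daily key, which is only revealed on a positive diagnosis.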
The global spread of COVID-19 is affecting every one of us. At Apple, we are people first, and we do what we do with the belief that technology can change lives and the hope that it can be a valuable tool in a moment like this. Teachers are innovating to make remote lessons come alive. Companies are experimenting with new ways to stay productive. And medical experts can diagnose illnesses and reach millions with critical updates in the blink of an eye. We are all adapting and responding in our own way, and Apple wants to continue to play a role in helping individuals and communities emerge stronger.
But this global effort — to protect the most vulnerable, to study this virus, and to care for the sick — requires all of our care, and all of our participation. And I want to update you about the ways in which we are doing our part.
Twenty years ago (Jan 5th, 2000), Steve Jobs demoed Internet Explorer 5 for Mac. Jobs chose the app for its bold UI, which had been developed in complete secrecy within Microsoft but bore an uncanny resemblance to the yet-to-be-unveiled Aqua interface of Mac OS X.
Maf Vosburgh, one of the developers who worked on the project, writes:
Coming from the artist-influenced multimedia world, the visual style Microsoft had in progress for Mac IE 5 looked ancient to me. Everything was the MacOS platinum style, shades of gray like cement, with a horde of tiny 16 by 16 pixel toolbar icons (in 4-bit color with a 1 bit mask) most of which had obviously been designed by engineers in a pixel editor like ResEdit.
I had the idea of making our browser chrome match the actual hardware you were on. If your Mac’s bezel was Bondi blue, we’d make our UI Bondi blue. That way our “frame” around the web page would match the bezel and so would be seen as part of the background and be distinct from the content. By being more vivid we would paradoxically blend into the background, and look more at home. […]
I put my idea to the rest of the Mac IE team, and they loved it. […] It rapidly came together and in Summer 1999 we demoed the secret New Look build of Mac IE5 to Steve Jobs, the first person to see it outside Nykris and a few people on the Mac IE team. Steve gave it his enthusiastic approval. Yeah!
So eventually MacWorld January 2000 came along, the venue for unveiling the Mac IE 5 beta. Steve Jobs insisted on doing the Mac IE 5 demo himself. This is where things got a little surprising. Steve first showed a new build of Mac OS X which had a new user interface called “Aqua”. This looked, well, just like the Nykris design we’d been using for half a year at that point.
I never used OpenDoc, but I find the idea behind it interesting:
The core idea of OpenDoc is to create small, reusable components, responsible for a specific task, such as text editing, bitmap editing, or browsing an FTP server. OpenDoc provides a framework in which these components can run together, and a document format for storing the data created by each component. These documents can then be opened on other machines, where the OpenDoc frameworks substitute suitable components for each part, even if they are from different vendors.
Instead of recreating the same set of features within each app (forcing the user to learn different ways of doing the same thing), the idea was to abstract the core functionality of each piece of software and make it available across the OS.
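The architecture described above can be sketched in a few lines. This is a toy illustration of the compound-document idea, with hypothetical names (`Part`, `open_document`, and the components are my inventions, not the real OpenDoc API): a document stores typed parts, and each machine binds whatever locally installed component can handle a part's type.

```python
from dataclasses import dataclass

@dataclass
class Part:
    kind: str    # e.g. "text", "bitmap"
    data: bytes  # the part's stored payload

class TextEditor:
    handles = "text"
    def render(self, part): return part.data.decode()

class HexViewer:  # a different vendor could ship a different bitmap component
    handles = "bitmap"
    def render(self, part): return part.data.hex()

# The locally installed components; another machine may register others.
REGISTRY = [TextEditor(), HexViewer()]

def open_document(parts):
    """Bind each part to the first installed component that handles it."""
    for part in parts:
        component = next(c for c in REGISTRY if c.handles == part.kind)
        yield component.render(part)

doc = [Part("text", b"hello"), Part("bitmap", b"\x00\xff")]
print(list(open_document(doc)))  # → ['hello', '00ff']
```

The document only carries data plus a type tag per part; which editor opens each part is decided at load time, which is what let OpenDoc substitute components from different vendors.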
I think iOS gets closer to that, but it's probably the internet, more than anything else, that helped unbundle the data from the underlying software, turning the latter into a service.
Face recognition algorithms demand computing power that today's smartphones, however capable, don't have. Many services, Google among them, upload photos and analyze them remotely, with deep learning algorithms running in the cloud.
On the Machine Learning Journal, Apple's Computer Vision team recounts the obstacles it had to overcome to run face analysis on the device. iCloud encrypts photos locally before uploading them to its servers, so they can't be analyzed anywhere but on the device:
We faced several challenges. The deep-learning models need to be shipped as part of the operating system, taking up valuable NAND storage space. They also need to be loaded into RAM and require significant computational time on the GPU and/or CPU. Unlike cloud-based services, whose resources can be dedicated solely to a vision problem, on-device computation must take place while sharing these system resources with other running applications. Finally, the computation must be efficient enough to process a large Photos library in a reasonably short amount of time, but without significant power usage or thermal increase.
I could sing the same song, but about the Caps Lock key. At times other keys, too, seem on the verge of jamming, but with a little insistence they go back to working. My MacBook Pro is also just under a year old.
Max Rudberg proposes several tricks for handling the iPhone X's notch gracefully, and for making peace with it. Because, after all, Apple doesn't want it hidden:
Apple writes in the HIG: “Don’t attempt to hide the device’s rounded corners, sensor housing, or indicator for accessing the Home screen by placing black bars at the top and bottom of the screen.”
In the Designing for iPhone X video, posted by Apple after the X’s announcement, Mike Stern says: “Your app or game should always fill the display that it runs on. Placing black bars at the top or bottom of the screen makes your app feel small, cramped, and inconsistent with other apps on iPhone X”.
At this point, though, it seems obvious that the App Store is a problem: it lets the governments of less democratic countries exert and enforce control over the OS. As John Gruber points out:
The App Store was envisioned as a means for Apple to maintain strict control over the software running on iOS devices. But in a totalitarian state like China (or perhaps Russia, next), it becomes a source of control for the totalitarian regime.
Andrew Hart, an iOS developer, has been experimenting with ARKit and CoreLocation; the result is a new way of getting around the city, looking at it through your smartphone (if nothing else an improvement, however relative, on staring at a map on your phone).
Tyler Cowen points out that various innovations made of atoms also played a significant role in making the iPhone revolutionary; from the materials used to the touch screen, they were as important as the apps and the software:
The iPhone is behind the scenes a triumph of mining science, with a wide variety of raw materials and about 34 billion kilograms (75 billion pounds) of mined rock as an input to date, as discussed by Brian Merchant in his new and excellent book “The One Device: The Secret History of the iPhone.” A single iPhone has behind it the production of 34 kilos of gold ore, with 20.5 grams (0.72 ounces) of cyanide used to extract the most valuable parts of the gold.
Especially impressive as a material is the smooth touch-screen, and the user’s ability to make things happen by sliding, swiping, zooming and pinching it — the “multitouch” function. That advance relied upon particular materials, as the screen is chemically strengthened, made scrape-resistant and embedded with sensitive sensors. Multitouch wasn’t new, but Apple understood how to build it into a highly useful product.