
Facebook And Internet.org Detail “1000X” Technologies They Hope Will Bring Earth Online


Air Traffic Control, HipHop, WebP, and Supplemental Downlink are some of the futuristic technologies that Facebook and its Internet.org partners will deploy to bring the Internet to the five billion people still not connected. A 70-page whitepaper released today by Internet.org partners Facebook, Qualcomm, and Ericsson details how spectrum must change to accommodate 1000 times more web traffic, and Facebook Home’s role as a data efficiency experiment.

Last month, Facebook, Qualcomm, and Ericsson, along with Samsung, MediaTek, Nokia, and Opera, launched Internet.org, a partnership aimed at making the Internet accessible and affordable for everyone on the planet. At the time, the partnership briefly discussed how network, data compression, and app efficiency technologies would all need to come together to make the web cheap enough to connect the whole world.

Today, Facebook and Qualcomm went a step further with this detailed whitepaper that outlines specific accessibility technologies they’re already testing, and those they plan to build. Ericsson then provides some tips to the mobile industry for understanding what its customers really want.

You can read the “A Focus On Efficiency” PDF here, or check out our embed below.

Facebook’s Obsession With Efficiency

Facebook was getting serious about accessibility initiatives long before the launch of Internet.org.

As the company has grown from the $85 server it was first launched on, it’s searched for ways to make its service more efficient. It launched the HipHop translator for PHP years ago so employees could code in an easy language but have their work transformed into the much more server-efficient C++ language. This let it run 50% more traffic per server, but that wasn’t enough. It then built and open-sourced the HipHop Virtual Machine execution engine and achieved a 500% increase in server throughput.

Meanwhile, it launched the Open Compute Project to help everyone build greener servers and data centers, with a focus on cooling, power transformations, and a “vanity free” DIY server design. Open Compute technologies have made Facebook’s Luleå, Sweden data center one of the most efficient in the world. Facebook now houses 250 billion photos (that’s a quarter of a trillion, a new statistic), stores more than 250 petabytes of data, and takes in over a half petabyte of new data each day without stumbling.

But what’s most newsworthy and fascinating about the whitepaper is the deep look into how Facebook is currently experimenting with the future of data efficiency.

To more easily test data efficiency and stability across the wide range of connection types found around the world, Facebook created Air Traffic Control. It’s a system designed “to help engineers simulate different network conditions right inside Facebook’s offices. Aspects that can be controlled include bandwidth, latency, packet loss, corrupted packets, and packet ordering.”

Facebook explains that Air Traffic Control lets it test different mobile radio technologies like 2G, EDGE, 3G, and 4G over wifi; simulate what it’s like using Facebook’s apps in the network conditions of countries like India; and see how different levels of network capacity and congestion, such as at peak usage, impact the user experience and data connections.
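Facebook hasn’t published Air Traffic Control’s internals, but the same kind of network shaping can be approximated on a Linux test box with the standard tc/netem traffic shaper. The sketch below is a rough illustration of the technique; the EDGE/3G/4G numbers are invented for the example, not figures from Facebook’s tool.

```python
# Rough approximation of an Air Traffic Control-style network profile using
# Linux's tc/netem traffic shaper. The profile numbers are illustrative
# guesses, not Facebook's internal settings. Requires root privileges.
import subprocess

PROFILES = {
    # name: (rate, delay, loss)
    "edge": ("200kbit", "400ms", "1.5%"),
    "3g":   ("1mbit",   "150ms", "0.5%"),
    "4g":   ("10mbit",  "50ms",  "0.1%"),
}

def apply_profile(interface: str, name: str) -> None:
    """Throttle an interface to emulate a given mobile connection."""
    rate, delay, loss = PROFILES[name]
    # Clear any existing qdisc, then add netem with rate, delay, and loss.
    subprocess.run(["tc", "qdisc", "del", "dev", interface, "root"],
                   check=False)  # ignore the error if nothing is configured yet
    subprocess.run(["tc", "qdisc", "add", "dev", interface, "root", "netem",
                    "rate", rate, "delay", delay, "loss", loss], check=True)

def clear(interface: str) -> None:
    """Remove the emulated network conditions."""
    subprocess.run(["tc", "qdisc", "del", "dev", interface, "root"], check=True)

if __name__ == "__main__":
    apply_profile("eth0", "3g")  # run app tests under emulated 3G, then...
    clear("eth0")
```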

Facebook is also transitioning in part to Google’s efficient WebP image format. Photos are the number one source of data usage on Facebook, and cutting this down could make the service more profitable while still letting it deliver photos to people on a data budget. Facebook says, “At this point, most of our images are converted into WebP for the Android application, and our goal is to roll out WebP to other platforms as well. When the images are converted to WebP, this will save over 20% of total network traffic, without loss of quality.”
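As a rough illustration of the kind of conversion described (not Facebook’s actual pipeline), the widely used Pillow library can re-encode a JPEG as WebP and report the size difference:

```python
# Minimal sketch of a JPEG-to-WebP re-encode using the Pillow library.
# This illustrates the general technique, not Facebook's conversion pipeline;
# actual savings depend on the source image and the quality setting chosen.
import os
from PIL import Image

def to_webp(src_path: str, quality: int = 80) -> str:
    dst_path = os.path.splitext(src_path)[0] + ".webp"
    with Image.open(src_path) as img:
        img.save(dst_path, "WEBP", quality=quality)
    saved = 1 - os.path.getsize(dst_path) / os.path.getsize(src_path)
    print(f"{src_path}: {saved:.0%} smaller as WebP")
    return dst_path

# to_webp("photo.jpg")  # e.g. "photo.jpg: 25% smaller as WebP"
```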

Though we’re scheduled to see more smartphones than feature phones in the world in 2015, it will still take a long time to phase out everyone’s dumber devices. So Facebook has been focusing some of its accessibility development on Facebook For Every Phone, the stripped-down app that runs on devices currently in the hands of many of the five billion people it hopes to connect to the web.

Since feature phones have severely limited processing power, Facebook has worked to handle as much computation as possible on the server side. It has also built Facebook For Every Phone to minimize transfers from the server and reuse as much cached content as possible.
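Facebook hasn’t detailed the protocol Facebook For Every Phone uses, but a standard way to reuse cached content and skip redundant transfers is HTTP conditional requests: the client keeps what it downloaded and asks the server only whether it has changed. A minimal sketch of that general technique, with an invented in-memory cache:

```python
# Illustrative sketch of reusing cached content via HTTP conditional requests
# (ETag / 304 Not Modified). This shows the general technique, not the actual
# mechanism Facebook For Every Phone uses.
import requests

_cache = {}  # url -> (etag, body)

def fetch(url: str) -> bytes:
    headers = {}
    if url in _cache:
        headers["If-None-Match"] = _cache[url][0]  # ask "has this changed?"
    resp = requests.get(url, headers=headers)
    if resp.status_code == 304:
        return _cache[url][1]                      # unchanged: reuse cached body
    _cache[url] = (resp.headers.get("ETag", ""), resp.content)
    return resp.content
```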

Facebook Home, Accessibility Lab In Disguise

Facebook has also been using the Home ‘apperating system’ it launched on Android in April as a guinea pig for Internet.org-related technologies. While Home has failed to gain much traction, and CEO Mark Zuckerberg has said he was disappointed by that, at least it’s aiding Facebook with research and development of accessibility technology.

For example, since Home tries to always have fresh photo content waiting in the lockscreen Cover Feed, Facebook worked to avoid silently running up a huge data bill for users.

So Home detects whether a user is on a wifi or mobile connection and determines whether to pull down higher or lower resolution photos. When it discovers the device is connected to wifi, “Facebook begins aggressively prefetching and caching images. This means that a device builds up an inventory of photos that it can rely on when data is no longer plentiful.” Facebook also adjusts how frequently it fetches text-based data depending on a user’s connection.
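Facebook hasn’t published Home’s client code, but the behavior it describes boils down to a simple policy: check the connection type, pick a photo resolution, and decide whether to prefetch. A hypothetical sketch, with all names invented for illustration:

```python
# Hypothetical sketch of the connection-aware fetching policy described above.
# Class and function names are invented; this is not Home's actual code.
from dataclasses import dataclass

@dataclass
class FetchPlan:
    resolution: str  # which rendition of the photo to request
    prefetch: bool   # whether to aggressively fill the local cache

def plan_fetch(on_wifi: bool, cache_has_room: bool) -> FetchPlan:
    if on_wifi:
        # Data is effectively free: grab full-size images and build up an
        # inventory of cached photos to fall back on when data is scarce.
        return FetchPlan(resolution="full", prefetch=cache_has_room)
    # On a metered mobile connection: request smaller renditions and
    # fetch only what is needed right now.
    return FetchPlan(resolution="low", prefetch=False)

print(plan_fetch(on_wifi=True, cache_has_room=True))   # full-res, prefetch
print(plan_fetch(on_wifi=False, cache_has_room=True))  # low-res, on demand
```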

These improvements are already active in Home, and Facebook says they “are expected to be brought to other Facebook applications soon.” One day these technologies could let people in the developing world have a richer experience by only sucking in tons of data when it’s free over wifi.

Facebook Home uses intelligent caching to avoid redundant image downloads, and supports exporting the cache to a removable SD card to free up space in a device’s internal memory. That could be a win for accessibility initiatives because many phones in the developing world come with very little internal memory.

Home also puts a virtual cap on total data usage so users don’t suddenly rack up a huge bill if they don’t realize they’re doing something data-intensive. A similar cap could ensure developing-world users still have data left for the most critical services, like communication, and don’t blow it all downloading a photo.
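The cap Facebook describes amounts to a running counter checked before each download. A hypothetical version might look like the following; the 100 MB budget is an invented example, not Facebook’s number:

```python
# Hypothetical data-budget guard in the spirit of Home's virtual cap.
# The 100 MB monthly budget is illustrative only.
class DataBudget:
    def __init__(self, monthly_limit_bytes: int = 100 * 1024 * 1024):
        self.limit = monthly_limit_bytes
        self.used = 0

    def allow(self, download_bytes: int) -> bool:
        """Return True if the download fits in the remaining budget."""
        return self.used + download_bytes <= self.limit

    def record(self, download_bytes: int) -> None:
        self.used += download_bytes

budget = DataBudget()
if budget.allow(2_000_000):   # a roughly 2 MB photo
    budget.record(2_000_000)
else:
    pass                      # skip it, or queue it until wifi is available
```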

Battery life could also be an accessibility issue, as some parts of the world don’t have easy, cheap, reliable access to electricity for charging. Since wifi uses less power than a mobile connection, it can help people around the world keep their phones from going dead.

For now, avoiding background power over-use was a big priority for the launch of Facebook Home. It tries to fetch News Feed stories over wifi whenever possible, and minimizes “radio wakeups” by batching data pull-downs. Facebook explains that waking up a device’s network connection radio can burn 0.02% to 0.1% of a device’s total battery, even on devices running Ice Cream Sandwich or newer operating systems. That’s why, when it worked with HTC to build the HTC First “Facebook Phone” that comes pre-installed with Home, it implemented shorter network timeouts so the radio would go into power-saving standby mode more quickly.
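Batching is the key idea here: instead of waking the radio for every small request, queue requests and flush them together. A hypothetical sketch follows; the thresholds are invented, since Facebook hasn’t published Home’s actual values.

```python
# Hypothetical sketch of batching network pull-downs to minimize radio wakeups.
# Thresholds are illustrative; this is not Home's actual implementation.
import time

class BatchedFetcher:
    def __init__(self, flush_after_seconds: float = 60.0, max_batch: int = 20):
        self.queue = []
        self.flush_after = flush_after_seconds
        self.max_batch = max_batch
        self.last_flush = time.monotonic()

    def request(self, url: str) -> None:
        """Queue a fetch instead of waking the radio immediately."""
        self.queue.append(url)
        now = time.monotonic()
        if len(self.queue) >= self.max_batch or now - self.last_flush >= self.flush_after:
            self.flush()

    def flush(self) -> None:
        """Wake the radio once and issue all queued fetches together."""
        for url in self.queue:
            pass  # perform the actual HTTP fetch for each queued URL here
        self.queue.clear()
        self.last_flush = time.monotonic()
```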

In another bit of hardware innovation, Facebook worked with GPU vendors “to tune the workload so that the power draw kept the Application Processor in optimal power mode while not compromising the experience on the device.” For example, rather than having the GPU compose Chat Heads so the chat feature can run overlaid on other apps, the back-end hardware handles the composition more efficiently.

Facebook also has a power measurement testing lab to experiment with power consumption in different situations. This lets it catch and eliminate regressions that would cause a device to fail to go to sleep properly.

By working to make Home more data and battery efficient, Facebook is laying the groundwork for Internet.org to create a mobile ecosystem where everyone has a device with cheap data and plenty of power, no matter where they live or how much money they have.

Qualcomm Demands Spectrum Reallocation

Later in the whitepaper, Qualcomm outlines what it calls the “1000X Challenge.” If data usage doubles every year, in a decade we’ll need the capacity to support roughly 1,000 times more traffic than today (2^10 = 1,024). Qualcomm is challenging itself to build the technologies necessary to achieve that capacity.

Some of the innovations it’s quoted as working on include:

  • Carrier Aggregation and Supplemental Downlink to bond together separate bands for more capacity and faster data speeds for consumers
  • LTE-Broadcast for multi-casting of video and data in places where many people want to see the same content
  • LTE-Direct to allow first responders and others to communicate device-to-device even if the cell network is down
  • 802.11ac and ad for faster Wi-Fi and other unlicensed applications
  • DSRC, which enables cars to communicate with one another to avoid collisions
  • A next-generation system to provide broadband for airplane passengers

Another crucial technology will be licensed “small cells,” or low-powered radio access nodes with a range of about 10 meters. These can be integrated into wireless networks and placed indoors to create a “hetnet,” or heterogeneous network of cells of different sizes.

But the most important building block for succeeding in the 1000X Challenge will be a redistribution of wireless spectrum. The industry will need a ton more spectrum to accommodate the world. Qualcomm proposes that new bands be cleared and auctioned off for industry use. One band specifically that could be repurposed is the 3.5GHz band allocated to the U.S. government. It could support small cells, but still have part of it reserved for use by the government when needed.

Ericsson concludes the whitepaper by citing a large survey regarding what people want out of wireless connectivity. It champions a consistent Internet connection, fast speeds, and few crashes, rolled into a service where the user doesn’t need to know anything about the cloud.

While it could be years before the technologies described by Facebook and Qualcomm trickle down to the unconnected corners of the earth, it’s important that they’re cranking on the development process now. If they run out of users to sign up because no one else can afford a data connection, their businesses will falter. But while Internet.org does admit it’s trying to create more profitable mobile companies, that’s not what defines the project.

Internet.org is powered by the belief that connectivity is a human right. With the knowledge brought by the Internet comes empowerment, compassion, and financial stability. Internet.org’s partners have yet to go into detail about exactly how those factors will boost the world economy and not just their own bottom lines. But the idea is that Internet access brings productivity that increases everyone’s output.

If tech giants can focus on accessibility for everyone now, one day we might achieve a thriving, connected global society previous generations couldn’t dream of.
