Google I/O 2017 – Highlights for the Android Developer

This year’s Google I/O 2017 was an exciting few days for developers. After last year’s complaints of long queues and no shade, Google listened and delivered a pre-booking system that ensured attendees weren’t queuing for hours. They also provided big bottles of sunscreen and more shade. 

I was fortunate to attend this year’s Google I/O in Mountain View at the Shoreline Amphitheatre. The theme of this year’s conference shifted from mobile-first to AI-first. There were talks on Machine Learning and TensorFlow, but just as many talks on developer tools and platform improvements. Being a Google conference, there were oodles of things to see and interact with: demos of a “mocktail mixer”, a smile-powered candy dispenser, VR drone control and much more. The After Hours parties, the Women Techmakers event and the concert were heaps of fun too!

Being an Android Developer, these are a few of my top highlights from I/O 2017 and I’ve listed some of the talks that I think are worth watching. 


Kotlin

The Android team announced first-class support for the Kotlin programming language. Kotlin has gained popularity in the Android community over the last few years, and it is now an official language for developing Android apps.

The great part about this is that Kotlin works side by side with Java and C++, meaning you can write new code in Kotlin while keeping existing code in Java. Kotlin is a much more concise language, so it reduces the amount of code you need to write. It can also help you avoid null pointer exceptions, as it introduces nullable types. It is already supported in IntelliJ and Android Studio. There are many more features of Kotlin that I won’t dive into here, but in my opinion this is definitely the biggest announcement from Google I/O.
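As a quick taste of the conciseness and null safety mentioned above, here is a small standalone sketch (the `Session` class and its values are made up for illustration):

```kotlin
// A data class replaces the Java boilerplate of a constructor,
// getters, equals(), hashCode() and toString().
data class Session(val title: String, val speaker: String?)

fun main() {
    val session = Session("What's New in Android", null)

    // `speaker` has the nullable type String?, so the compiler forces
    // us to handle the null case instead of risking an NPE at runtime.
    val speakerName = session.speaker ?: "TBA"
    println("$speakerName presents '${session.title}'")
}
```

The Elvis operator (`?:`) supplies a fallback value, and the compiler refuses to let a `String?` be used where a non-null `String` is required.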

The talks worth watching about Kotlin can be found here:

Check out the official Kotlin documentation.

Android Things, Smart Home and Cloud IoT Core

In case you missed this previously, Google announced a new platform called Android Things, which is Google’s branch into the Internet of Things (IoT). It allows you to write Android apps and use the Android ecosystem for IoT devices. This was a big focus at Google I/O, with many talks on IoT and Android Things. Codelabs for Android Things were available, and a “Pro Maker Kit” was given away. This kit included a 5-inch touch screen, a Pico i.MX7Dual development board and other accessories.

Smart Home was announced, which allows any developer to integrate with the Google Assistant platform. Smart Home lets the Google Assistant control IoT devices you have built (for example a camera or a light bulb). Google builds up a Home Graph of your home, which stores and provides contextual data about your home and devices. As an end user, you can perform actions with the devices connected to the Home Graph: you can say things such as “Turn up the lights a little bit” and Google handles all the natural language processing and sends commands to your IoT device.
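The flow described above — the Assistant does the natural language work and your device code only receives a structured command — can be sketched in plain Kotlin. Everything here (the `DeviceCommand` type, the `LightBulb` class) is hypothetical and just illustrates the shape of the interaction, not the actual Smart Home API:

```kotlin
// Hypothetical structured command, standing in for the payload your
// device code would receive after the Assistant parses the user's speech.
sealed class DeviceCommand {
    data class AdjustBrightness(val delta: Int) : DeviceCommand()
    data class SetPower(val on: Boolean) : DeviceCommand()
}

// Hypothetical smart light bulb: by the time a command arrives, all the
// language processing is done; the device only applies state changes.
class LightBulb(var brightness: Int = 50) {
    fun handle(command: DeviceCommand) {
        when (command) {
            is DeviceCommand.AdjustBrightness ->
                brightness = (brightness + command.delta).coerceIn(0, 100)
            is DeviceCommand.SetPower ->
                brightness = if (command.on) brightness else 0
        }
    }
}

fun main() {
    val bulb = LightBulb()
    // "Turn up the lights a little bit" -> a small positive delta.
    bulb.handle(DeviceCommand.AdjustBrightness(delta = 10))
    println("Brightness is now ${bulb.brightness}")
}
```

The point of the sketch is the separation of concerns: Google owns the speech-to-intent step, your code only has to react to well-defined commands.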

Cloud IoT Core was also announced at Google I/O 2017. It is a fully managed service provided by the Google Cloud Platform. Cloud IoT Core allows you to securely connect millions of devices to Google Cloud. It scales automatically and works everywhere in the world.

Talks to watch about IoT:


TensorFlow Lite

Another announcement from Google I/O was TensorFlow Lite, a lightweight version of the TensorFlow machine learning library designed for mobile devices. It enables you to run neural networks offline, on the device itself. One of the codelabs shows you how to create an Android app that performs “Artistic Style Transfer” using TensorFlow.

TensorFlow talk at I/O:

Android Architecture Components

One of the other big announcements was the Android Architecture Components libraries. Previously the Android team refrained from giving advice on how you should structure your Android applications. For the most part this meant that anyone learning Android for the first time would just end up placing all their code in Activity files, occasionally moving things into an AsyncTask if the app crashed with a NetworkOnMainThreadException. Only when trying to add unit tests and instrumentation tests would you really understand that the code you had spent so long developing was not easy to read, change or test.

Now the Android Team has given guidelines for writing new applications on Android, these can be found here.

The new libraries include the following:

  • Room – A SQLite object mapper, very similar to libraries such as ORMLite or greenDAO. It lets you write plain SQL while still giving compile-time guarantees on your queries.
  • LiveData – A lifecycle-aware, observable data holder.
  • ViewModel – The communication point between Activities / Fragments and the rest of the application. ViewModels are free of UI code and outlive the Activity or Fragment that uses them.
  • Lifecycle – A core part of the Architecture Components, it contains information about the lifecycle state of a component (for instance an Activity).
  • LifecycleOwner – Core interface for components that have a Lifecycle (Activities, Fragments, Process, your custom component).
  • LifecycleObserver – Specifies what should happen when certain Lifecycle methods are triggered. Creating a LifecycleObserver allows components to be self-contained.
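To make the division of responsibilities concrete, here is a deliberately simplified, plain-Kotlin sketch of the ViewModel/LiveData pattern. `ObservableData` is a stand-in for the real LiveData class (which is additionally lifecycle-aware and Android-specific), and `UserViewModel` is a made-up example; only the flow of data is meant to match the real components:

```kotlin
// Simplified stand-in for LiveData: holds a value and notifies observers.
// The real class also respects each observer's lifecycle state.
class ObservableData<T>(initial: T) {
    private val observers = mutableListOf<(T) -> Unit>()
    var value: T = initial
        set(newValue) {
            field = newValue
            observers.forEach { it(newValue) }
        }
    fun observe(observer: (T) -> Unit) {
        observers += observer
        observer(value)  // deliver the current value immediately
    }
}

// Plays the role of a ViewModel: it owns the data, contains no UI code,
// and would outlive an Activity across configuration changes.
class UserViewModel {
    val userName = ObservableData("Loading...")
    fun onUserLoaded(name: String) { userName.value = name }
}

fun main() {
    val viewModel = UserViewModel()
    // An Activity or Fragment would observe here and update its views.
    viewModel.userName.observe { name -> println("UI shows: $name") }
    viewModel.onUserLoaded("Ada")
}
```

Because the UI only observes, a rotated Activity can simply re-subscribe to the surviving ViewModel and immediately receive the latest value.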

The talks that you should watch:

I’ve written a blog post about Room and LiveData with sample code.

Support Library Improvements

There are many improvements that have been added to the Android Support library in the latest release 26.0.0-beta1.

The support library was never previously available on a Maven repository, so getting new versions meant downloading them manually (on every build server too). Now Google has its own Maven repository! Simply add the following to your build.gradle to use it:

maven {
  url "https://maven.google.com"
}

My top changes in the support library that I am most excited about include:

  • Emoji Compatibility library – You no longer need to see “tofu” (blank box) characters when certain emoji aren’t available on the device. Yay for emoji!
  • Downloadable Fonts and Font Support – The support library now lets you use custom fonts in your application without third-party libraries. It also means you don’t have to package common fonts in your APK file.

The talk you should watch:

Firebase New Announcements

The Firebase team has been hard at work and announced some new features to the Firebase platform. These features include:

  • The ability to authenticate with a mobile phone number. This feature is built on the same technology as Google’s own phone authentication system, and it handles one-time passwords (OTPs) and the resending of tokens for you. This feature is coming in June 2017.
  • Performance Monitoring – New tools are now available for you to monitor your app’s performance and network requests such as the response latency and the success rate. 
  • Android Test Lab now allows you to emulate different network connection speeds.

A good talk to watch about Firebase:

These are some of my highlights from this year’s Google I/O. Thanks Google for the great conference! I look forward to next year.