We work on computer science problems that define the technology of today and tomorrow.
Advancing the state of the art
- International Symposium on Experimental Robotics (2017)
- Proceedings of Robotics: Science and Systems (RSS 2017)
- Deep Learning for Action and Interaction workshop at NIPS (2016)
- International Conference on Learning Representations (ICLR 2017) Workshop
Our teams aspire to make discoveries that impact everyone, and core to our approach is sharing our research and tools to fuel progress in the field.
Our researchers publish regularly in academic journals, release projects as open source, and apply research to Google products.
Explore our work
Researchers across Google are innovating in many domains. We challenge conventions and reimagine technology so that everyone can benefit.
Today, we’re excited to publish “The Building Blocks of Interpretability,” a new Distill article exploring how feature visualization can combine with other interpretability techniques to understand aspects of how networks make decisions.
Heart attacks, strokes, and other cardiovascular (CV) diseases remain among the top public health issues. Assessing this risk is a critical first step toward reducing the likelihood that a patient suffers a CV event in the future.
Learn more about PAIR, an initiative using human-centered research and design to make AI partnerships productive, enjoyable, and fair.
The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems.
We generate human-like speech from text using neural networks trained using only speech examples and corresponding text transcripts.
We’re excited to release the TensorFlow model for processing Kepler Space Telescope data, training our neural network, and making predictions about new exoplanet candidate signals.
With motion photos, a new camera feature available on the Pixel 2 and Pixel 2 XL phones, you no longer have to choose between a photo and a video: every photo you take captures more of the moment.
Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud.
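Federated Learning's core loop can be sketched as a simplified federated-averaging round over a toy one-parameter model. The function names and the simulation setup below are illustrative assumptions for exposition, not Google's actual implementation: the key property shown is that only model weights, never raw training data, leave each client.

```python
# Illustrative sketch of a federated-averaging round (names are hypothetical).
# Each "client" takes a local gradient step on its own private data; the
# server only ever sees the resulting weights, weighted-averaged by data size.

def local_update(weights, data, lr=0.1):
    """One gradient step of least-squares y = w*x on a client's private data."""
    w = weights
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w, client_datasets):
    """Server averages locally updated weights, weighted by dataset size."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two clients whose data both follow y = 2x; the raw pairs stay on-device.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward 2.0
```

In production the averaging is done over model parameter vectors from many phones, with secure aggregation so the server cannot inspect any individual update; the toy scalar model above only illustrates the communication pattern.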
TensorFlow Lattice is a set of prebuilt, easy-to-use TensorFlow Estimators, along with TensorFlow operators for building your own lattice models.
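The idea behind a lattice model can be sketched in a few lines: learned parameters live on the vertices of a grid, and a prediction is a multilinear interpolation between them. The function below is a conceptual illustration with made-up names, not the TensorFlow Lattice API.

```python
# Minimal sketch of a 2x2 lattice model (hypothetical, for illustration):
# parameters sit at grid vertices; prediction is bilinear interpolation.

def lattice_2d(x, y, params):
    """Bilinearly interpolate vertex params [[v00, v01], [v10, v11]]
    at a point (x, y) in the unit square [0, 1]^2."""
    (v00, v01), (v10, v11) = params
    bottom = v00 * (1 - x) + v10 * x   # interpolate along x at y = 0
    top = v01 * (1 - x) + v11 * x      # interpolate along x at y = 1
    return bottom * (1 - y) + top * y

# Vertex values chosen so the surface increases in both inputs, the kind of
# monotonic shape constraint lattice models are designed to enforce.
params = [[0.0, 0.5], [0.5, 1.0]]
print(lattice_2d(0.5, 0.5, params))  # center of the lattice -> 0.5
```

Because the output at any point is a convex combination of vertex values, constraining the vertex parameters (e.g. to be nondecreasing along an axis) constrains the whole learned function, which is what makes lattice models interpretable.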
Get to know Magenta, a research project exploring the role of machine learning in the process of creating art and music.