We work on computer science problems that define the technology of today and tomorrow.
Advancing the state of the art
Nature, vol. 574 (2019), 505–510
Neural Information Processing Systems (2018)
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (2018)
Our teams aspire to make discoveries that impact everyone, and core to our approach is sharing our research and tools to fuel progress in the field.
Our researchers publish regularly in academic journals, release projects as open source, and apply research to Google products.
Explore our work
Researchers across Google are innovating across many domains. We challenge conventions and reimagine technology so that everyone can benefit.
An open-source quantum framework for building and experimenting with noisy intermediate-scale quantum (NISQ) algorithms on near-term quantum processors.
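To give a feel for the kind of circuit experiment such a framework enables, here is a toy single-qubit state-vector simulation written from scratch. It is not the framework's API (which targets real near-term hardware and supports noise models); it only illustrates a Hadamard gate producing an equal superposition.

```python
import math

# Hadamard gate as a 2x2 matrix.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate into a single-qubit state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

state = apply(H, [1.0, 0.0])            # start in |0>, apply H
probs = [abs(a) ** 2 for a in state]    # measurement probabilities
print(probs)                            # ~[0.5, 0.5]
```

Measuring this state yields 0 or 1 with equal probability, the basic building block of many NISQ experiments.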
Today, we’re excited to publish “The Building Blocks of Interpretability,” a new Distill article exploring how feature visualization can be combined with other interpretability techniques to understand aspects of how networks make decisions.
Heart attacks, strokes, and other cardiovascular (CV) diseases continue to be among the top public health issues. Assessing this risk is a critical first step toward reducing the likelihood that a patient suffers a CV event in the future.
Learn more about PAIR, an initiative using human-centered research and design to make AI partnerships productive, enjoyable, and fair.
We generate human-like speech from text using neural networks trained only on speech examples and their corresponding text transcripts.
We’re excited to release the TensorFlow model for processing Kepler Space Telescope data, training our neural network, and making predictions about new exoplanet candidate signals.
With motion photos, a new camera feature available on the Pixel 2 and Pixel 2 XL phones, you no longer have to choose between a photo and a video; every photo you take captures more of the moment.
Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud.
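The core aggregation idea can be sketched as federated averaging: each client fits its local model on private data and shares only parameters, which the server averages weighted by data size. The tiny linear-regression "training" below is an illustrative stand-in, not the production algorithm.

```python
def local_update(weights, data, lr=0.1, steps=50):
    """One client: gradient steps for y = w*x on its own data only."""
    w = weights
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_average(global_w, client_datasets):
    """Server round: aggregate client updates without seeing raw data."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Each phone holds its own examples of y = 3x; the raw pairs never
# leave the device, only the fitted weight does.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)], [(0.5, 1.5), (4.0, 12.0)]]
w = 0.0
for _ in range(5):          # five communication rounds
    w = federated_average(w, clients)
print(round(w, 2))          # → 3.0
```

The server only ever sees model weights, which is the decoupling the blurb describes; production systems add secure aggregation and many more clients per round.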
TensorFlow Lattice is a set of prebuilt, easy-to-use TensorFlow Estimators, along with TensorFlow operators for building your own lattice models.
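The idea behind a lattice model can be illustrated in a few lines: the function is represented by learned values at grid vertices and evaluated by multilinear interpolation. The vertex values below are made up for illustration; this is not the TensorFlow Lattice API.

```python
def lattice_2d(params, x, y):
    """Bilinear interpolation on a unit-square lattice.

    params[i][j] is the learned output at vertex (i, j), i, j in {0, 1};
    x and y in [0, 1] are the (already normalized) inputs.
    """
    return ((1 - x) * (1 - y) * params[0][0]
            + (1 - x) * y * params[0][1]
            + x * (1 - y) * params[1][0]
            + x * y * params[1][1])

# Vertex values chosen to increase in both inputs, mirroring the kind
# of monotonicity constraint lattice models can enforce.
params = [[0.0, 1.0],
          [2.0, 4.0]]
print(lattice_2d(params, 0.5, 0.5))   # → 1.75
```

Because the surface interpolates the vertex values, constraining those values (e.g. to be monotonic along an axis) constrains the whole model, which is a key selling point of lattice-based regression.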
Get to know Magenta, a research project exploring the role of machine learning in the process of creating art and music.