
Julia Now Available on Google Colab

Anshul Singhvi

The Julia programming language has officially joined the list of supported runtimes in Google Colab, marking a significant milestone for both the Julia and Jupyter communities. Combining Julia's high-performance capabilities with Colab's ease of use and free access to powerful GPUs and TPUs, the language is now more accessible than ever, letting users take full advantage of Colab's cloud-based environment. Keep an eye on the Julia Discourse announcement thread for tips and tricks!


As announced by Google engineer Eric Johnson, “Julia is now available as a language in runtimes. In addition to providing easy access for the world to quickly begin programming in the awesome Julia language, since the first part of the Jupyter name itself is from the Julia language as the original triad of core supported languages (Julia (ju), Python (pyt) and R (r)), this is a bit of an homage to Jupyter itself, and completes this triad of supported languages available in Colab.”

What’s New on Google Colab?
Google Colab recently introduced several updates, one of them being native support for Julia. Previously, running Julia in Colab required workarounds like installing Julia manually each session. With this update, you can start coding in Julia immediately, without any hacks.

Now, you can use free T4 GPUs (compatible with CUDA.jl) to run Julia workloads, making this an excellent environment for high-performance computing, data science, and AI research. This is potentially the most exciting aspect of using Julia on Colab: anyone can tap the power of a T4, L4, or A100 GPU directly from their laptop. Converting a computation to run on the GPU can be as simple as using CUDA; gpu_data_array = cu(cpu_data_array).
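
For instance, a minimal sketch of that pattern (assuming a CUDA-capable runtime and that CUDA.jl is already installed) looks like this:

using CUDA

cpu_data_array = rand(Float32, 10_000)   # an ordinary CPU array
gpu_data_array = cu(cpu_data_array)      # copy it to GPU memory as a CuArray

# Broadcasted operations on a CuArray are compiled into GPU kernels automatically:
gpu_result = sin.(gpu_data_array) .^ 2 .+ 1f0

cpu_result = Array(gpu_result)           # copy the result back to the CPU when needed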

Packages like CUDA, Reactant, and DiffEqGPU now work seamlessly in the new Julia runtime, enabling advanced machine learning and scientific computing applications. 

GPU computing can speed up parallel processing by orders of magnitude. Here's an example (see this notebook) of how quickly we can compute an image of the Mandelbrot set on the CPU versus the GPU.
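
The notebook has the full benchmark; as a rough sketch of the idea (not the notebook's exact code), the same escape-time function can be broadcast over a CPU matrix and over a CuArray:

using CUDA

# Escape-time count for a single point c, capped at maxiter iterations.
function escape_time(c, maxiter)
    z = zero(c)
    for i in 1:maxiter
        z = z * z + c
        abs2(z) > 4 && return i
    end
    return maxiter
end

# Grid of complex points covering the classic Mandelbrot view.
xs = range(-2.0f0, 0.5f0; length = 2000)
ys = range(-1.25f0, 1.25f0; length = 2000)
c_cpu = [complex(x, y) for y in ys, x in xs]

maxiter = 200
@time img_cpu = escape_time.(c_cpu, maxiter)          # runs on the CPU

c_gpu = cu(c_cpu)                                     # same grid, now on the GPU
CUDA.@time img_gpu = escape_time.(c_gpu, maxiter)     # same broadcast, compiled into a GPU kernel

# Note: the first call of each includes compilation time; run twice for a fair comparison.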

What’s Included in the Julia Runtime?
To provide a smooth experience, Colab's Julia runtime is based on Julia 1.10, the long-term support (LTS) release. It comes pre-installed with essential packages (a short usage example follows the list), including:

  • IJulia: Required for Jupyter/Colab compatibility.
  • CSV & DataFrames: Essential tools for data wrangling.
  • Makie & Plots: Powerful visualization libraries.
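
As a small illustration with toy data (not code from the post), a first cell can use these packages directly, with no extra installation:

using CSV, DataFrames, Plots

df = DataFrame(x = 1:10, y = rand(10))   # build a toy table
CSV.write("toy.csv", df)                 # write it out ...
df2 = CSV.read("toy.csv", DataFrame)     # ... and read it back

plot(df2.x, df2.y; marker = :circle, label = "y", xlabel = "x")   # quick line plot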

In addition, the team is exploring the possibility of pre-installing CUDA to enable GPU acceleration for Julia users in Colab. Follow the discussion on the Julia Discourse.

Julia + Colab + Gemini: A Great Moment for Technical Computing

Julia’s availability in Google Colab aligns well with Google’s broader vision of making AI-powered data science more accessible. This is complemented by Gemini 2.0, Google’s latest AI assistant, which automates data analysis and is now available to users aged 18+ in select countries. By leveraging Julia’s speed and Colab’s cloud-powered environment, users can seamlessly integrate Gemini’s AI-driven insights into their workflows.

Julia just works with Gemini in Colab. End your prompts with “in Julia” to make sure the generated code is valid Julia.


And here’s another example, computing Pi on CPUs and GPUs, with the code generated by Gemini.
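
In the same spirit (a sketch, not Gemini’s actual output), a Monte Carlo estimate of Pi can be written once and run on both CPU and GPU arrays:

using CUDA

# The fraction of random points inside the quarter circle, times 4, approximates Pi.
estimate_pi(xs, ys) = 4 * count(xs .^ 2 .+ ys .^ 2 .<= 1) / length(xs)

n = 10_000_000

x_cpu, y_cpu = rand(Float32, n), rand(Float32, n)            # CPU arrays
@time pi_cpu = estimate_pi(x_cpu, y_cpu)

x_gpu, y_gpu = CUDA.rand(Float32, n), CUDA.rand(Float32, n)  # arrays generated on the GPU
CUDA.@time pi_gpu = estimate_pi(x_gpu, y_gpu)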

How to Get Started with Julia in Colab

  1. Open Google Colab
  2. Select “Runtime” > “Change runtime type”
  3. Choose “Julia” as your runtime and, optionally, a GPU such as a T4 under “Hardware accelerator” (a quick sanity-check cell is sketched below)
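
Once the runtime restarts, a quick sanity check confirms everything is wired up (a minimal sketch; add CUDA.jl first if it isn’t already installed):

versioninfo()        # prints the Julia version Colab is running

# using Pkg; Pkg.add("CUDA")   # only needed if CUDA.jl is not already installed
using CUDA
CUDA.functional()    # true if a GPU is attached and usable
CUDA.versioninfo()   # details about the driver and device, if any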

Prof. James Balamuta has a nice Getting started notebook that gives you a tour of Julia on Colab.
