
Julia Computing in Action this Week at JuliaCon 2022

Discover key content authored by JuliaHub experts. Stay updated on innovative practices in scientific and technical computing.

Attending JuliaCon? Julia Computing staff and users will be presenting at the virtual conference. Scroll down for a full list of Julia Computing-related presentations. Registration is free: https://juliacon.org/2022/

JuliaCon MeetUp - Bangalore

We’re hosting an in-person JuliaCon meetup at our Bangalore office on Friday July 29 starting at 5 pm IST. We will livestream all JuliaCon talks, chat over food and drinks, and meet other Julia enthusiasts from around Bangalore. Click here to RSVP

Wednesday July 27

oneAPI.jl: Programming Intel GPUs (and more) in Julia

Tim Besard

07/27/2022, 9:30 AM — 9:40 AM EDT

oneAPI.jl is a Julia package that makes it possible to use the oneAPI framework to program accelerators like Intel GPUs. In this talk, Tim will explain the oneAPI framework, which accelerators it supports, and demonstrate how oneAPI.jl makes it possible to work with these accelerators from the Julia programming language.
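As a flavor of the array-style programming the talk describes, here is a minimal sketch assuming oneAPI.jl is installed and an Intel GPU is available (illustrative only, not from the talk itself):

```julia
using oneAPI

# Move data to the Intel GPU; broadcasts compile to device kernels.
a = oneArray(rand(Float32, 1024))
b = a .+ 1f0                 # runs on the accelerator
c = Array(b)                 # copy the result back to the host
```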

Multivariate Polynomials in Julia

Chris Elrod, Benoît Legat

07/27/2022, 10:40 AM — 10:50 AM EDT

Depending on the application, a multivariate polynomial library may need efficient computation of products, division, substitution, evaluation, GCDs, or even Gröbner bases. It is well understood that the best concrete representation for these polynomials depends on whether they are sparse. In this talk, we show that in Julia, the choice of representation also depends on whether to specialize compilation on the variables.
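For context, these operations look like the following with DynamicPolynomials.jl from the JuliaAlgebra ecosystem (a plausible setup, not necessarily the speakers' exact package):

```julia
using DynamicPolynomials

@polyvar x y                       # declare polynomial variables
p = 2x^2 + 3x*y + y
q = x + 1

r = p * q                          # product
v = subs(p, x => 1, y => 2)        # substitution / evaluation at a point
```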

State of JuliaGeo

Josh Day, Rafael Schouten, Maarten Pronk

07/27/2022, 11:10 AM — 11:20 AM EDT

JuliaGeo is a community that contains several related Julia packages for manipulating, querying, and processing geospatial geometry data. We aim to provide a common interface between geospatial packages. In 2022 there has been a big push to have parity with the Python geospatial packages, such as rasterio and geopandas. In this 10 minute talk, we'd like to show these improvements, both in code and documentation, during a tour of the geospatial ecosystem.

Julia Computing Sponsored Talk

07/27/2022, 1:25 PM — 1:40 PM EDT

Learn about Julia Computing’s latest product developments that bring the power of Julia to commercial organizations.

Modeling a Crash Simulation System with ModelingToolkit.jl

Bradley Carman

07/27/2022, 4:00 PM — 4:30 PM EDT

Previously, traditional modeling tools were used to provide the acausal modeling framework, which could be statically compiled and integrated with distributed software. But with this comes the two-language problem and friction in model research and development. With ModelingToolkit.jl, the tools needed to transition from traditional modeling frameworks are now available. This talk will cover our approach and success in rewriting our Hydraulic Crash Simulation system model in pure Julia.
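To illustrate the acausal, equation-based style ModelingToolkit.jl enables, here is a generic sketch (a toy decay system, not the crash-simulation model from the talk):

```julia
using ModelingToolkit, DifferentialEquations

@variables t x(t)
D = Differential(t)

# Equations are stated symbolically; MTK compiles them to fast numeric code.
@named decay = ODESystem([D(x) ~ -x], t)
sys = structural_simplify(decay)

prob = ODEProblem(sys, [x => 1.0], (0.0, 1.0))
sol = solve(prob)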

Thursday, July 28

Julia Computing Sponsored Forum

07/28/2022, 1:25 PM — 1:40 PM EDT

Julia in VS Code - What's New

David Anthoff, Sebastian Pfitzner

07/28/2022, 4:00 PM — 4:30 PM EDT

We will highlight new features in the Julia VS Code extension that shipped in the last year and give a preview of some new work. The new features from last year that we will highlight are: 1) a new profiler UI, 2) a new table viewer UI, 3) a revamped plot gallery, 4) cloud indexing infrastructure, and 5) integration of JuliaFormatter. We will also present some brand-new features, in particular an entirely new test explorer UI integration.

Adaptive Radial Basis Function Surrogates in Julia

Ranjan Anantharaman

07/28/2022, 8:30 AM — 9:00 AM EDT

This talk focuses on an iterative algorithm, called active learning, to update radial basis function surrogates by adaptively choosing points across its input space. This work extensively uses the SciML ecosystem, and in particular, Surrogates.jl.
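A minimal, non-adaptive radial basis surrogate with Surrogates.jl looks like the sketch below; the active-learning algorithm in the talk would iterate this, adding new sample points where the surrogate is least trustworthy (this is an illustrative setup, not the talk's code):

```julia
using Surrogates

f(x) = sin(x) + x / 10
lb, ub = 0.0, 10.0

xs = sample(20, lb, ub, SobolSample())   # initial space-filling design
ys = f.(xs)

rbf = RadialBasis(xs, ys, lb, ub)        # fit the radial basis surrogate
rbf(2.5)                                 # cheap approximate evaluation
```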

Lux.jl: Explicit Parameterization of Neural Networks in Julia

Avik Pal

07/28/2022, 9:00 AM — 9:20 AM EDT

Julia already has quite a few well-established neural network frameworks, including Flux & Knet. However, certain design elements associated with these frameworks, namely coupled models and parameters, and internal mutation, make them less compiler- and user-friendly. Making changes to address these problems in the respective frameworks would be too disruptive for users. To address these challenges, we designed Lux, a NN framework.
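Lux's explicit-parameterization style can be sketched as follows: parameters and state live outside the model and are passed into every call, rather than being stored and mutated inside the layers (illustrative example, assuming Lux.jl is installed):

```julia
using Lux, Random

rng = Random.default_rng()
model = Chain(Dense(2 => 8, tanh), Dense(8 => 1))

# Parameters and state are created separately from the model definition.
ps, st = Lux.setup(rng, model)

x = rand(rng, Float32, 2, 4)
y, st = model(x, ps, st)     # purely functional forward pass
```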

Pumas Sponsored Talk

07/28/2022, 9:55 AM — 10:00 AM EDT

Metal.jl - A GPU Backend for Apple Hardware

Tim Besard, Max Hawkins

07/28/2022, 12:40 PM — 12:50 PM EDT

In this talk, updates on the development of a GPU backend for Apple hardware (specifically the M-series chipset) will be presented along with a brief showcase of current capabilities and interface. The novel compilation flow will be explained and compared to the other GPU backends as well as the benefits and limitations of both a unified memory model and Apple's Metal capabilities. A brief overview of Apple's non-GPU hardware accelerators and their potential will also be discussed.
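The interface mirrors Julia's other GPU backends; a minimal sketch, assuming Metal.jl is installed and an M-series Mac is available:

```julia
using Metal

# Arrays live in unified memory shared by the CPU and the M-series GPU.
a = MtlArray(rand(Float32, 1024))
b = a .* 2f0                 # broadcast compiles to a Metal compute kernel
c = Array(b)                 # copy back to a host Array
```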

LinearSolve.jl: Because A\b Is Not Good Enough

Chris Rackauckas

07/28/2022, 12:50 PM — 1:00 PM EDT

Need to solve Ax=b for x? Then use A\b! Or wait, no. Don't. If you use that method, how do you swap that out for a method that performs GPU offloading? How do you switch between UMFPACK and KLU for sparse matrices? Krylov subspace methods? What does all of this mean and why is A\b not good enough? Find out all of this and more. P.S. LinearSolve.jl is the answer.
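The unified interface the talk advertises looks roughly like this; swapping the underlying method is a one-argument change rather than a rewrite (illustrative sketch):

```julia
using LinearSolve

A = [2.0 1.0; 1.0 3.0]
b = [1.0, 2.0]

prob = LinearProblem(A, b)
sol = solve(prob)                       # default algorithm choice
sol2 = solve(prob, LUFactorization())   # explicitly pick a factorization
sol.u                                   # the solution vector x
```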

Build, Test, Sleep, Repeat: Modernizing Julia's CI Pipeline

Elliot Saba, Dilum Aluthge

07/28/2022, 10:10 AM — 10:20 AM EDT

Julia's Continuous Integration pipeline has struggled for many years now as the needs of the community have significantly outgrown the old Buildbot system. In this talk we will detail the efforts of the CI dev team to provide reliability, reproducibility, security, and greater introspective ability in our CI builds. These CI improvements aren't just helping the Julia project itself, but also other related open-source projects, as we continue to generate self-contained, useful building blocks.

Simple Chains: Fast CPU Neural Networks

Chris Elrod

07/28/2022, 1:40 PM — 1:50 PM EDT

SimpleChains is an open source pure-Julia machine learning library developed by PumasAI and Julia Computing in collaboration with Roche and the University of Maryland, Baltimore. It is specialized for relatively small models and NeuralODEs, attaining best-in-class performance for these problems. The performance advantage remains significant when scaling to tens of thousands of parameters, where it is still more than 5x faster than Flux or PyTorch on CPUs, even outperforming GPUs.
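A small model in SimpleChains might be sketched as below; the statically known input size is what lets the library specialize code and avoid allocations (illustrative example, assuming SimpleChains.jl is installed):

```julia
using SimpleChains

# A tiny MLP with a compile-time-known input dimension of 2.
model = SimpleChain(
    static(2),
    TurboDense(tanh, 8),
    TurboDense(identity, 1),
)

p = SimpleChains.init_params(model)   # flat parameter vector
x = rand(Float32, 2, 16)              # batch of 16 inputs
y = model(x, p)
```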

Julia Computing Sponsored Forum

07/28/2022, 3:00 PM — 3:45 PM EDT

Optimizing Floating Point Math in Julia

Oscar Smith

07/28/2022, 3:30 PM — 4:00 PM EDT

Why did exp10 get 2x faster in Julia 1.6? One reason is, unlike most other languages, Julia doesn't use the operating system-provided implementations for math (Libm). This talk will be an overview of improvements in Julia's math library since version 1.5, and areas for future improvements. We will cover optimal polynomial computation, table based implementations, and bit-hacking for peak performance.
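One of those techniques, optimal polynomial evaluation, is visible from plain Julia: `evalpoly` uses Horner's scheme, the kind of kernel these special-function implementations are built from (the coefficients below are arbitrary, not from Julia's math library):

```julia
# Horner evaluation: c0 + x*(c1 + x*(c2 + x*c3)), one fused multiply-add
# per coefficient instead of computing explicit powers of x.
coeffs = (1.0, 0.5, 0.25, 0.125)
y = evalpoly(2.0, coeffs)       # → 4.0

# Naive equivalent for comparison:
y_ref = 1.0 + 0.5 * 2.0 + 0.25 * 2.0^2 + 0.125 * 2.0^3
```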

Using Optimization.jl to Seek the Optimal Optimizer in SciML

Vaibhav Dixit

07/28/2022, 3:40 PM — 4:10 PM EDT

Optimization.jl seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means that when you learn one package, you learn them all! Optimization.jl (formerly GalacticOptim.jl) adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most cases, while allowing all of the options in a single unified interface.
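The unified interface can be sketched with the classic Rosenbrock problem; swapping the optimizer is a one-argument change (illustrative example, assuming the OptimizationOptimJL wrapper package for Optim.jl solvers):

```julia
using Optimization, OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# AD backend is declared once; gradients are derived automatically.
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, [0.0, 0.0], [1.0, 100.0])

sol = solve(prob, LBFGS())    # swap LBFGS() for any supported optimizer
```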

Friday, July 29

JuliaSyntax.jl: A New Julia Compiler Frontend in Julia

Chris Foster

07/29/2022, 4:00 PM — 4:30 PM EDT

JuliaSyntax.jl is a new Julia language frontend designed for precise error reporting, speed and flexibility. In this talk we'll tour the JuliaSyntax parser implementation and tree data structures, highlighting benefits for users and tool builders. We'll discuss how to losslessly map Julia source text for character-precise error reporting and how a "parse stream" abstraction cleanly separates the parser from syntax tree creation while being 10x faster than Julia's reference parser.

The State of Julia in 2022

Viral B Shah

07/29/2022, 11:30 AM — 12:00 PM EDT

An update on Julia from the core development team.

Control-Systems Analysis and Design with JuliaControl

Fredrik Bagge Carlson

07/29/2022, 1:00 PM — 1:30 PM EDT

The Julia language is uniquely suitable for control-systems analysis and design. Features like a mathematical syntax, powerful method overloading, strong and generic linear algebra, and arbitrary-precision arithmetic, all while allowing high performance, create a compelling platform to build a control ecosystem upon. We will present the JuliaControl packages and illustrate how they make use of Julia to enable novel and sophisticated features while keeping implementations readable and maintainable.
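A taste of that mathematical syntax, sketched with ControlSystems.jl (a toy plant and controller, not from the talk):

```julia
using ControlSystems

P = tf(1, [1, 1])             # plant: P(s) = 1 / (s + 1)
C = tf([2, 1], [1, 0])        # PI controller: C(s) = (2s + 1) / s

T = feedback(P * C)           # closed-loop transfer function PC / (1 + PC)
```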

How to Debug Julia Simulation Codes (ODEs, optimization, etc.!)

Chris Rackauckas

07/29/2022, 3:30 PM — 4:00 PM EDT

The ODE solver spit out dt

Scaling up Training of Any Flux.jl Model Made Easy

Dhairya Gandhi

07/29/2022, 4:00 PM — 4:30 PM EDT

In this talk, we will be discussing some of the state of the art techniques to scale training of ML models beyond a single GPU, why they work and how to scale your own ML pipelines. We will be demonstrating how we have scaled up training of Flux models both by means of data parallelism and by model parallelism. We will be showcasing ResNetImageNet.jl and DaggerFlux.jl to accelerate training of deep learning and scientific ML models such as PINNs and the scaling it achieves.
