
URL: https://github.com/Burla-Cloud

Burla · GitHub



πŸ‘ Image


πŸ‘ Image
πŸ‘ Image
πŸ‘ Image
πŸ‘ Image
πŸ‘ Image

Scale Python across 1,000 computers in 1 second.

Burla is an open-source cloud platform for Python developers. It only has one function:

from burla import remote_parallel_map

my_inputs = list(range(1000))

def my_function(x):
    print(f"[#{x}] running on separate computer")

remote_parallel_map(my_function, my_inputs)

This runs my_function on 1,000 VMs in the cloud, in < 1 second:

πŸ‘ Burla terminal demo showing remote_parallel_map running on 1,000 computers

The simplest way to build scalable data pipelines.

Burla scales up to 10,000 CPUs in a single function call, and supports GPUs and custom containers.
Load data in parallel from cloud storage, then write results in parallel from thousands of VMs at once.

remote_parallel_map(process, [...])
remote_parallel_map(aggregate, [...], func_cpu=64)
remote_parallel_map(predict, [...], func_gpu="A100")
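Because remote_parallel_map returns each function's return values, stages chain by feeding one stage's outputs into the next. A minimal local sketch of that chaining, where plain list comprehensions stand in for the parallel calls and the process/aggregate/predict bodies are invented purely for illustration:

```python
# Hypothetical stage bodies, for illustration only.
def process(record):
    return record * 2       # stage 1: cheap per-record transform

def aggregate(value):
    return value + 1        # stage 2: heavier work (would get func_cpu=64)

def predict(features):
    return features % 3     # stage 3: inference (would get func_gpu="A100")

raw = list(range(6))
processed   = [process(r) for r in raw]          # remote_parallel_map(process, raw)
aggregated  = [aggregate(v) for v in processed]  # remote_parallel_map(aggregate, ...)
predictions = [predict(f) for f in aggregated]   # remote_parallel_map(predict, ...)
print(predictions)  # → [1, 0, 2, 1, 0, 2]
```

Each line maps one function over the full list of outputs from the previous stage; swapping the comprehensions for remote_parallel_map calls fans the same work out across VMs.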

This creates a pipeline like:

πŸ‘ Burla data pipeline animation

Monitor progress in the dashboard:

Cancel bad runs, filter logs to watch individual inputs, or monitor output files in the UI.

πŸ‘ Burla dashboard demo

How it works:

With Burla, running code in the cloud feels the same as coding on your laptop:

return_values = remote_parallel_map(my_function, my_inputs)

When functions are run with remote_parallel_map:

  • Anything they print appears locally (and inside Burla's dashboard).
  • Any exceptions are raised locally.
  • Any packages or local modules they use are (very quickly) cloned on remote machines.
  • Code starts running in under one second, even with millions of inputs or thousands of machines.
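These semantics (results returned in input order, prints and exceptions surfacing at the call site) can be mimicked locally. A minimal sketch using Python's standard concurrent.futures as a stand-in for Burla's cluster, with a thread pool instead of remote VMs and an illustrative my_function:

```python
from concurrent.futures import ThreadPoolExecutor

def my_function(x):
    if x < 0:
        raise ValueError(f"bad input: {x}")  # would surface at the call site
    return x * x                             # return values are collected in order

my_inputs = list(range(1000))

# Executor.map preserves input order and re-raises worker exceptions locally,
# mirroring remote_parallel_map's behavior (but on local threads, not VMs).
with ThreadPoolExecutor(max_workers=32) as pool:
    return_values = list(pool.map(my_function, my_inputs))

print(return_values[:5])  # → [0, 1, 4, 9, 16]
```

The difference in Burla's case is scale: the same call shape dispatches to thousands of machines rather than 32 local threads.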

Features:

  • πŸ“¦ Automatic Package Sync
    Burla automatically (and very quickly) clones your Python packages on every remote machine where code is executed.

  • πŸ‹ Custom Containers
    Easily run code in any Docker container. Public or private, just paste an image URI in the settings, then hit start.

  • πŸ“‚ Network Filesystem
    Need to get big data into or out of the cluster? Burla automatically mounts a cloud storage bucket to a folder in every container.

  • βš™οΈ Variable Hardware Per-Function
    The func_cpu and func_ram args let you assign more hardware to some functions and less to others.

Try Burla for free with 1,000 CPUs!

  1. Sign in using your Google or Microsoft account.
  2. Run the quickstart in this Google Colab notebook (takes less than 2 minutes):
πŸ‘ Open Burla quickstart in Google Colab

Examples

Learn more at docs.burla.dev.

Popular repositories

  1. burla Public

    The simplest way to scale Python.

    Python · 239 stars · 12 forks

  2. .github Public

Repositories

Showing 3 of 3 repositories

People

This organization has no public members.
