

6. Ray

  • 6.1. Ray Overview
  • 6.2. Ray Remote Functions
    • Start a Ray Cluster
    • Example 1: Fibonacci Sequence
    • Differences Between Native Python Functions and Ray
    • Example 2: Monte Carlo Estimation of π
    • Example 3: Distributed Image Processing
  • 6.3. Distributed Object Storage
    • ray.put() and ray.get()
    • Example 1: Transforming Data
    • Passing Parameters
      • Automatic De-referencing
      • Complex Data Structures
    • Implementation
  • 6.4. Ray Remote Classes
    • Example 1: Distributed Counter
    • Actor Programming Model
    • Example 2: Leaderboard Ranking
    • Example 3: Actor Pool
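
The subsections listed above cover Ray's three core primitives: remote functions (tasks), the distributed object store (ray.put() and ray.get()), and remote classes (actors). As a rough preview, here is a minimal sketch that exercises all three on a local cluster. It assumes Ray is installed; the names square, total, and Counter are illustrative and are not taken from the chapter's own examples.

```python
import ray

ray.init()  # start a throwaway local Ray cluster

@ray.remote
def square(x):
    # Remote function: .remote() schedules a task and returns an ObjectRef.
    return x * x

@ray.remote
class Counter:
    # Remote class (actor): keeps mutable state across method calls.
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1
        return self.value

# Remote functions: launch tasks in parallel, then fetch the results.
refs = [square.remote(i) for i in range(4)]
print(ray.get(refs))  # [0, 1, 4, 9]

# Object store: ray.put() stores data once; tasks receive it by reference
# and Ray de-references the ObjectRef automatically at the task boundary.
data_ref = ray.put(list(range(1000)))

@ray.remote
def total(data):
    return sum(data)

print(ray.get(total.remote(data_ref)))  # 499500

# Actor: state persists between calls on the same actor instance.
counter = Counter.remote()
print(ray.get([counter.increment.remote() for _ in range(3)]))  # [1, 2, 3]

ray.shutdown()
```

Each subsection develops these primitives in more depth, with larger worked examples such as Monte Carlo estimation of π, distributed image processing, and an actor pool.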


By Weizheng Lu

© Copyright 2023-2025.