

The examples in this repository support Spark in Action, 2nd edition, by Jean-Georges Perrin, published by Manning. Find out more about the book on Manning's website.

Spark in Action, 2nd edition - chapter 5

Welcome to Spark in Action, 2nd edition, chapter 5. This chapter is about deployment of your application on a cluster.

The example used here is the approximation of Pi.

This code is designed to work with Apache Spark v3.1.2.

Lab

Each chapter has one or more labs. Labs are examples used for teaching in the book. You are encouraged to take ownership of the code, modify it, and experiment with it; hence the term lab.

Lab #100

The PiComputeApp application does the following (a minimal sketch is shown after this list):

  1. It acquires a session (a SparkSession).
  2. It asks Spark to create a dataset from a given list of numbers.
  3. It demonstrates how Spark performs map and reduce operations.
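
Below is a minimal PySpark sketch of these three steps. It is illustrative only, not the book's exact lab code, and the variable names are my own.

from random import random
from operator import add
from pyspark.sql import SparkSession

# Step 1: acquire a session
spark = SparkSession.builder.appName("PiComputeApp sketch").getOrCreate()

# Step 2: create a distributed dataset from a list of numbers
slices = 10
throws = 100000 * slices
darts = spark.sparkContext.parallelize(range(throws), slices)

# Step 3: map each throw to 1 if the dart lands inside the unit circle, then reduce by summing
def throw_dart(_):
    x, y = random(), random()
    return 1 if x * x + y * y <= 1 else 0

inside = darts.map(throw_dart).reduce(add)
print("Pi is roughly", 4.0 * inside / throws)

spark.stop()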

Running the lab in Java

For information on running the Java lab, see chapter 4 in Spark in Action, 2nd edition.

Running the lab using PySpark

Prerequisites:

You will need:

  • git.
  • Apache Spark (please refer to Appendix P - "Spark in production: installation and a few tips").

  1. Clone this project:
git clone https://github.com/jgperrin/net.jgp.books.spark.ch05
  2. Go to the lab in the Python directory:
cd net.jgp.books.spark.ch05/src/main/python/lab100_pi_compute/
  3. Execute the following spark-submit command to run the application:
spark-submit piComputeApp.py
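
If you prefer to choose the master explicitly rather than rely on your Spark configuration, you can pass it on the command line; for example, to run locally on all available cores (an illustrative option, not specific to this lab):

spark-submit --master local[*] piComputeApp.py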

Running the lab in Scala

Prerequisites:

You will need:

  • git.
  • Apache Spark (please refer to Appendix P - "Spark in production: installation and a few tips").

  1. Clone this project:
git clone https://github.com/jgperrin/net.jgp.books.spark.ch05
  2. Go to the lab directory:
cd net.jgp.books.spark.ch05
  3. Package the application using the sbt command:
sbt clean assembly
  4. Run the Spark/Scala application using the spark-submit command as shown below:
spark-submit --class net.jgp.books.spark.ch05.lab100_pi_compute.PiComputeScalaApp target/scala-2.12/SparkInAction2-Chapter05-assembly-1.0.0.jar
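
Since this chapter is about deploying on a cluster, you may also want to submit the same jar to a standalone master. In the command below, <master-host> is a placeholder for your own master's hostname:

spark-submit --master spark://<master-host>:7077 --class net.jgp.books.spark.ch05.lab100_pi_compute.PiComputeScalaApp target/scala-2.12/SparkInAction2-Chapter05-assembly-1.0.0.jar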

News

  1. [2020-06-07] Updated the pom.xml to support Apache Spark v3.1.2.
  2. [2020-06-07] As we celebrate the first anniversary of Spark in Action, 2nd edition, it is the best-rated Apache Spark book on Amazon.

Notes

  1. [Java] Due to renaming the packages to match Java standards more closely, this project is not in sync with the book's MEAP prior to v10 (published in April 2019).
  2. [Scala, Python] As of MEAP v14, we have introduced Scala and Python examples (published in October 2019).
  3. The master branch contains the latest version of the code, running against the latest supported version of Apache Spark. Look in specific branches for specific versions.

Follow me on Twitter to get updates about the book and Apache Spark: @jgperrin. Join the book's community on Facebook or on Manning's community site.


Issues

Cluster called un in piComputeClusterApp.py

I wondered why I got an error on piComputeClusterApp.py. There's a cluster name, spark://un:7077, in it instead of local[*]. I think you forgot to change it to local? I've tried the code with local[*] and that works.

spark = SparkSession.builder.appName("PySparkPi on a cluster") \
    .master("spark://un:7077").getOrCreate()
    # .master("local[*]").getOrCreate()
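
For a local run, the same builder with the local master would look like this (a sketch of the suggested fix, not the repository's final code):

spark = SparkSession.builder.appName("PySparkPi on a cluster") \
    .master("local[*]").getOrCreate()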

missing: square root?

First, thank you for the book! It's so great!

Second, and maybe I'm misunderstanding, but aren't we trying to take the square root below, not just the sum of the squares? As written, it seems closer to a variance.

Shouldn't it be this?

return (sqrt(x * x + y * y) <= 1) ? 1 : 0;

After importing java.lang.Math, that change compiles but it doesn't seem to get the answer any closer to or further away from 3.14159..., but maybe it's the randomly created data set...
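
A short note on why the result does not change: on the unit square, both tests are equivalent, because for any non-negative value v, sqrt(v) <= 1 exactly when v <= 1; the square root only adds work. A quick, purely illustrative Python check:

from math import sqrt
from random import random

for _ in range(1000):
    x, y = random(), random()
    # both dart tests classify the point identically
    assert (sqrt(x * x + y * y) <= 1) == (x * x + y * y <= 1)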
