Neo4j Connector for Apache Spark

This repository contains the Neo4j Connector for Apache Spark.

License

The neo4j-spark-connector is licensed under the Apache License 2.0.

Generating Documentation from Source

cd doc
# Install NodeJS dependencies
npm install
# Generate HTML/CSS from asciidoc
./node_modules/.bin/antora docs.yml
# Start local server to browse docs
npm run start

This will start a local server at http://localhost:8000/ serving the development docs.

Building

Build target/neo4j-spark-connector-4.0.0.jar for Scala 2.11:

mvn clean package

Integration with Apache Spark Applications

spark-shell, pyspark, or spark-submit

Either pass the connector JAR directly with --jars, or let Spark resolve it as a package with --packages:

$SPARK_HOME/bin/spark-shell --jars neo4j-spark-connector-4.0.0.jar

$SPARK_HOME/bin/spark-shell --packages neo4j-contrib:neo4j-spark-connector:4.0.0
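
Once the shell is running with the connector on the classpath, data can be read from Neo4j into a DataFrame. The snippet below is a minimal sketch assuming the 4.x DataSource read API; the connection URL, credentials, and the Person label are placeholders, not values from this repository:

import org.apache.spark.sql.SparkSession

// In spark-shell an existing session named `spark` is already available;
// in a standalone application it is built as shown here.
val spark = SparkSession.builder()
  .appName("neo4j-read-example")
  .getOrCreate()

// Read all nodes with the :Person label into a DataFrame
val people = spark.read
  .format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")
  .option("authentication.basic.username", "neo4j")
  .option("authentication.basic.password", "password")
  .option("labels", "Person")
  .load()

people.show()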

sbt

If you use the sbt-spark-package plugin, in your sbt build file, add:

spDependencies += "neo4j-contrib/neo4j-spark-connector:4.0.0"

Otherwise, add the Spark Packages resolver and the dependency directly:

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
libraryDependencies += "neo4j-contrib" % "neo4j-spark-connector" % "4.0.0"

maven
In your pom.xml, add:

<dependencies>
  <!-- list of dependencies -->
  <dependency>
    <groupId>neo4j-contrib</groupId>
    <artifactId>neo4j-spark-connector</artifactId>
    <version>4.0.0</version>
  </dependency>
</dependencies>

<repositories>
  <!-- list of other repositories -->
  <repository>
    <id>SparkPackagesRepo</id>
    <url>http://dl.bintray.com/spark-packages/maven</url>
  </repository>
</repositories>

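Whichever build tool is used, a DataFrame can also be written back to Neo4j once the connector is on the classpath. The snippet below is a minimal sketch assuming the 4.x DataSource write API; the connection details, label, and key property are placeholders for illustration:

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("neo4j-write-example")
  .getOrCreate()
import spark.implicits._

// Example data to persist as :Person nodes
val people = Seq(("Alice", 42), ("Bob", 23)).toDF("name", "age")

people.write
  .format("org.neo4j.spark.DataSource")
  .mode(SaveMode.Overwrite)
  .option("url", "bolt://localhost:7687")
  .option("authentication.basic.username", "neo4j")
  .option("authentication.basic.password", "password")
  .option("labels", ":Person")
  .option("node.keys", "name")   // match existing nodes on the name property
  .save()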