
Python-Arango

Python driver for ArangoDB, a scalable multi-model database natively supporting documents, graphs and search.

Requirements

  • ArangoDB version 3.7+
  • Python version 3.6+

Installation

```shell
pip install python-arango
```

Getting Started

Here is a simple usage example:

```python
from arango import ArangoClient

# Initialize the ArangoDB client.
client = ArangoClient(hosts="http://localhost:8529")

# Connect to "_system" database as root user.
sys_db = client.db("_system", username="root", password="passwd")

# Create a new database named "test".
sys_db.create_database("test")

# Connect to "test" database as root user.
db = client.db("test", username="root", password="passwd")

# Create a new collection named "students".
students = db.create_collection("students")

# Add a hash index to the collection.
students.add_hash_index(fields=["name"], unique=True)

# Insert new documents into the collection.
students.insert({"name": "jane", "age": 39})
students.insert({"name": "josh", "age": 18})
students.insert({"name": "judy", "age": 21})

# Execute an AQL query and iterate through the result cursor.
cursor = db.aql.execute("FOR doc IN students RETURN doc")
student_names = [document["name"] for document in cursor]
```
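AQL queries can also take bind variables, which keep user-supplied values out of the query string. A minimal sketch, assuming the `db` handle and `students` collection from the example above; the `@min_age` parameter name is illustrative:

```python
# Filter students by age using an AQL bind variable instead of
# interpolating the value into the query string.
query = "FOR doc IN students FILTER doc.age >= @min_age RETURN doc.name"
bind_vars = {"min_age": 21}

# With a live connection (as set up above), this would run as:
# cursor = db.aql.execute(query, bind_vars=bind_vars)
# adult_names = list(cursor)
```

Passing values through `bind_vars` also lets ArangoDB cache the query plan across executions with different parameter values.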

Another example with graphs:

```python
from arango import ArangoClient

# Initialize the ArangoDB client.
client = ArangoClient(hosts="http://localhost:8529")

# Connect to "test" database as root user.
db = client.db("test", username="root", password="passwd")

# Create a new graph named "school".
graph = db.create_graph("school")

# Create vertex collections for the graph.
students = graph.create_vertex_collection("students")
lectures = graph.create_vertex_collection("lectures")

# Create an edge definition (relation) for the graph.
edges = graph.create_edge_definition(
    edge_collection="register",
    from_vertex_collections=["students"],
    to_vertex_collections=["lectures"]
)

# Insert vertex documents into the "students" (from) vertex collection.
students.insert({"_key": "01", "full_name": "Anna Smith"})
students.insert({"_key": "02", "full_name": "Jake Clark"})
students.insert({"_key": "03", "full_name": "Lisa Jones"})

# Insert vertex documents into the "lectures" (to) vertex collection.
lectures.insert({"_key": "MAT101", "title": "Calculus"})
lectures.insert({"_key": "STA101", "title": "Statistics"})
lectures.insert({"_key": "CSC101", "title": "Algorithms"})

# Insert edge documents into the "register" edge collection.
edges.insert({"_from": "students/01", "_to": "lectures/MAT101"})
edges.insert({"_from": "students/01", "_to": "lectures/STA101"})
edges.insert({"_from": "students/01", "_to": "lectures/CSC101"})
edges.insert({"_from": "students/02", "_to": "lectures/MAT101"})
edges.insert({"_from": "students/02", "_to": "lectures/STA101"})
edges.insert({"_from": "students/03", "_to": "lectures/CSC101"})

# Traverse the graph in the outbound direction, breadth-first.
result = graph.traverse(
    start_vertex="students/01",
    direction="outbound",
    strategy="breadthfirst"
)
```
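Each edge document links two vertices by their fully qualified IDs in `"collection/key"` form. Rather than writing every edge by hand, the documents can be built from a plain registration table. A small sketch in pure Python, using the collection and key names from the example above:

```python
# Registrations as (student key, lecture key) pairs, mirroring the
# six edge inserts in the example above.
registrations = [
    ("01", "MAT101"), ("01", "STA101"), ("01", "CSC101"),
    ("02", "MAT101"), ("02", "STA101"),
    ("03", "CSC101"),
]

# Build edge documents with fully qualified vertex IDs ("collection/key").
edge_docs = [
    {"_from": f"students/{student}", "_to": f"lectures/{lecture}"}
    for student, lecture in registrations
]

# With a live connection, each document would then be inserted via:
# for doc in edge_docs:
#     edges.insert(doc)
```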

Please see the documentation for more details.
