Andrea Santurbano
The `neo4j-admin import` tool produces an odd import when we try to import data into a non-default database.

- Neo4j version: 4.0.0 (via Neo4j Desktop)
- Operating system: macOS...
Draft...
Enable driver logging
With [SPARK-34952](https://issues.apache.org/jira/browse/SPARK-34952) released in version 3.2, Spark added aggregate pushdown; we can plan to support it in the connector.
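As a rough sketch of what this could enable (the `labels` read option is the connector's existing API; the `country` and `age` properties are placeholders used only for this illustration):

```scala
import org.apache.spark.sql.functions.{avg, count}

val people = spark.read.format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")
  .option("labels", ":Person")
  .load()

// Today this aggregation runs on the Spark side after all :Person nodes are
// fetched; with SPARK-34952 aggregate pushdown the connector could instead
// translate it into a single aggregating Cypher query executed by Neo4j.
people.groupBy("country")
  .agg(count("*").as("people"), avg("age").as("avgAge"))
  .show()
```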
For a use case like this:

```scala
sparkSession.read.format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")
  .option("authentication.type", "basic")
  .option("authentication.basic.username", "xxxxx")
  .option("authentication.basic.password", "xxxxx")
  .option("query",
    """
      |match (coach:Customer{id:'1'})
      |call my.custom.procedure('1') yield fieldA, fieldB, fieldC
      |value return fieldA, fieldB,...
```
Add step-by-step guide for Databricks cloud
It would be cool to have a Docker container at: https://hub.docker.com/r/neo4j/neo4j-experimental
In order to highlight how we export and import the data, we can add a roundtrip example like this:

Export

```
val originalDf = spark.read.format("org.neo4j.spark.DataSource")
  .option("relationship", "ACTED_IN")
  .option("relationship.nodes.map", "false")
  .option("relationship.source.labels", ...
```
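For reference, a minimal sketch of how the full roundtrip could look, assuming the connector's documented relationship read/write options (`relationship.save.strategy`, `relationship.source.node.keys`, etc.); the URLs, labels, node keys, and save modes below are placeholders and may need adjusting:

```scala
import org.apache.spark.sql.SaveMode

// Export: read the ACTED_IN relationships with flattened source/target columns.
val originalDf = spark.read.format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")
  .option("relationship", "ACTED_IN")
  .option("relationship.nodes.map", "false")
  .option("relationship.source.labels", ":Person")
  .option("relationship.target.labels", ":Movie")
  .load()

// Import: write the same DataFrame into another database, matching/creating
// the source and target nodes by key before creating the relationship.
originalDf.write.format("org.neo4j.spark.DataSource")
  .option("url", "bolt://target-host:7687")
  .option("relationship", "ACTED_IN")
  .option("relationship.save.strategy", "keys")
  .option("relationship.source.labels", ":Person")
  .option("relationship.source.save.mode", "Overwrite")
  .option("relationship.source.node.keys", "source.name:name")
  .option("relationship.target.labels", ":Movie")
  .option("relationship.target.save.mode", "Overwrite")
  .option("relationship.target.node.keys", "target.title:title")
  .mode(SaveMode.Append)
  .save()
```

Reading back from the target with the same relationship options and comparing against `originalDf` would close the roundtrip.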
I get this error when I try to import data. It says the mapping is successful, and I can see nodes and some relationships.

```
COMMAND: java -cp "C:\Users\Dell.Neo4jDesktop\graphApps_global\neo4j-etl-ui/dist/neo4j-etl.jar" org.neo4j.etl.NeoIntegrationCli export --mapping-file...
```
In this issue we collect the failed auto cherry-pick operations for branch `4.4`. Please don't edit or close it.