spark-redis

A connector for Spark that allows reading from and writing to a Redis cluster

Results: 139 spark-redis issues

I am using the example provided in the Java docs and running it on a local Spark cluster.

```java
public void run() throws Exception {
    SparkSession sparkSession = SparkSession.builder()
        .master("local")...
```

## What did I do

```py
df = (
    spark
    .sql("SELECT 'test' AS key, 123 AS col_a, 223 AS col_b")
)
```

```py
(
    df
    .write
    .format("org.apache.spark.sql.redis")
    .option("host", redis_host)
    .option("port", ...
```
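For context on what a write like the one above produces: spark-redis persists each DataFrame row as a Redis hash whose key combines the table name and the key column's value. A minimal pure-Python sketch of that mapping, using a hypothetical table name and the row from the `SELECT` above (no Spark or Redis required):

```python
def row_to_hash(table, key_column, row):
    """Sketch of the spark-redis row-to-hash mapping: the Redis key is
    '<table>:<key value>' and every other column becomes a field/value
    pair in the hash (values stored as strings)."""
    redis_key = f"{table}:{row[key_column]}"
    fields = {k: str(v) for k, v in row.items() if k != key_column}
    return redis_key, fields

# Hypothetical table name "mytable"; row matches the SELECT snippet.
key, fields = row_to_hash("mytable", "key",
                          {"key": "test", "col_a": 123, "col_b": 223})
# key    → "mytable:test"
# fields → {"col_a": "123", "col_b": "223"}
```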

I am storing key-value pairs in Redis without a table or column names, and I am trying to build a DataFrame out of my data using the following code:...

When writing a DataFrame with an array column, the value is saved as a string of the form "WrappedArray(x, y, z)". Since Redis supports nested data structures through...
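Until the connector serializes arrays natively, one workaround on the read side is to parse the "WrappedArray(x, y, z)" string back into a list. A minimal sketch; the function name is mine, and it assumes elements themselves contain no commas:

```python
def parse_wrapped_array(s):
    """Parse a 'WrappedArray(x, y, z)' string, as written by spark-redis
    for array columns, back into a list of strings. Assumes the elements
    themselves contain no commas."""
    prefix, suffix = "WrappedArray(", ")"
    if not (s.startswith(prefix) and s.endswith(suffix)):
        raise ValueError(f"not a WrappedArray string: {s!r}")
    inner = s[len(prefix):-len(suffix)]
    return [item.strip() for item in inner.split(",")] if inner else []

parse_wrapped_array("WrappedArray(x, y, z)")  # → ['x', 'y', 'z']
```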

Well, I'm a rookie. I want to access Redis in my Spark project. Here is the Redis process running on my server:

```
$ ps -ef | grep redis
kam 7237 4438 0...
```

Adding sentinel support. This is a working solution; we badly needed sentinels for our use case. With this fix it's working in our organization.

[Since Redis 4.0](https://raw.githubusercontent.com/antirez/redis/4.0/00-RELEASENOTES), redis-cli accepts a -u parameter where a URL can be provided. The Redis URL scheme is a well-known format (https://www.iana.org/assignments/uri-schemes/prov/redis) supported by most Redis drivers.

```python
# snippet from...
```
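To illustrate the URL format in question: a redis:// URL carries host, port, password, and database number in one string, and can be decomposed with the Python standard library. A sketch, using a hypothetical example URL:

```python
from urllib.parse import urlparse

# Hypothetical URL in the IANA-registered redis:// scheme:
# redis://:<password>@<host>:<port>/<db>
url = urlparse("redis://:s3cret@redis.example.com:6380/2")

host = url.hostname                   # 'redis.example.com'
port = url.port                       # 6380
password = url.password              # 's3cret'
db = int(url.path.lstrip("/") or 0)  # 2 (database number from the path)
```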

I can't connect to an AWS MemoryDB cluster using SSL; I get an exception: `redis.clients.jedis.exceptions.JedisConnectionException`. Using the TablePlus app on macOS, I was able to connect with the same params...

I have built a DataFrame that has roughly 2K records; each record has a timezone field and an id field, where the id is about 240,000 characters long. When I call...

Running into an NPE when trying to write keys to Redis. We recently upgraded to spark-redis:3.1.0-SNAPSHOT by building the JAR from the release tag and staging it on our DBFS...