Rinie Kervel
Supporting node-gyp would make this a lot easier.
I have a rough LilyGo working with the other RF libraries… [“LilyGo_SSD1306”, “WebUI”, “RF”, “RF2”, “Pilight”, “rtl_433”] [GitHub - rinie/OpenMQTTGateway at ZradioSX127x](https://github.com/rinie/OpenMQTTGateway/tree/ZradioSX127x) I isolated the decoding of the libraries from the receiving by...
> Fyi - the rtl_433_esp and pilight libraries have conflicting names and cannot be used together. I modified these to fix...
> - https://github.com/rinie/ESPiLight/tree/ZradioSX127x
> - https://github.com/rinie/rtl_433_ESP/tree/ZradioSX127x
> - https://github.com/rinie/OpenMQTTGateway/tree/ZradioSX127x
> - https://github.com/rinie/NewRemoteSwitch/tree/ZradioSX127x
> - https://github.com/rinie/rc-switch/tree/ZradioSX127x
I thought DIO2 could be used for transmit as well as for receive... For now I added this to your rtl_433_ESP, as that does the actual receiving, just using the decoding of...
You try to read the data with hid_read_timeout. The code of olegstepura uses hid_get_feature_report with report 0 and 256 bytes, plus 1 extra byte for HID report handling. If you look at the hidapi...
The result should be read with hid_get_feature_report, report 0, 256 bytes (257 for hidapi, as it prepends the report number).
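A minimal sketch of that read pattern. `FeatureDevice` is a hypothetical interface standing in for node-hid's `getFeatureReport(reportId, length)`; the point is the 257-byte request and dropping the leading report-number byte, as described above.

```typescript
// Hypothetical interface mirroring node-hid's getFeatureReport signature.
interface FeatureDevice {
  getFeatureReport(reportId: number, length: number): number[];
}

const REPORT_ID = 0;
const REPORT_SIZE = 256;

function readFeature(device: FeatureDevice): number[] {
  // hidapi prepends the report number, so request 257 bytes...
  const raw = device.getFeatureReport(REPORT_ID, REPORT_SIZE + 1);
  // ...and drop byte 0 (the report number) to get the 256 data bytes.
  return raw.slice(1);
}
```

With a real device you would open it first (e.g. `new HID.HID(vid, pid)` in node-hid) and pass it in; only the slicing logic is shown here.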
The loader examples seem messy. I looked at the Snowflake/Postgres/Databricks examples, and edit.csv.ts is a combination of a simple JavaScript template (`process.stdout.write(csvFormat(...)`) and a SQL query. Why not factor out the query...
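Factoring the query out of the loader template could look roughly like this. This is a sketch, not the framework's actual API: `runQuery` is a hypothetical stand-in for whatever client (Snowflake, Postgres, Databricks, Oracle) executes the statement, and `toCsv` plays the role of d3-dsv's `csvFormat`, so that per-database loaders differ only in the `.sql` file and the client passed in.

```typescript
import { readFileSync } from "node:fs";

type Row = Record<string, string | number | null>;

// Minimal CSV serializer standing in for d3-dsv's csvFormat.
function toCsv(rows: Row[]): string {
  if (rows.length === 0) return "";
  const cols = Object.keys(rows[0]);
  const esc = (v: unknown) => {
    const s = v == null ? "" : String(v);
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  return [
    cols.join(","),
    ...rows.map((r) => cols.map((c) => esc(r[c])).join(",")),
  ].join("\n");
}

// The query lives in its own .sql file; the loader is just plumbing.
async function runLoader(
  sqlPath: string,
  runQuery: (sql: string) => Promise<Row[]>
): Promise<void> {
  const sql = readFileSync(sqlPath, "utf8");
  const rows = await runQuery(sql);
  process.stdout.write(toCsv(rows));
}
```

The separation means swapping Snowflake for Oracle touches only the `runQuery` implementation, not the template.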
Thx. Would you be OK if I added an evidence.dev loader so I could reuse my simple Oracle code? That way, reusing their SQL files separated from the 'connector' approach...
Thx for the examples, but `// Output the Apache Arrow table as an IPC stream to stdout. process.stdout.write(Arrow.tableToIPC(table));`, then `npm build`, and then ending up in the cache as...
Ok, my misunderstanding, but isn't using stdout for the cache slower than a direct write? And **Databricks** doing `process.stdout.write(csvFormat(...))` instead of Arrow/Parquet, or **Postgres**, will be very slow for...
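A back-of-the-envelope illustration of why text CSV is heavier than a binary columnar encoding for numeric results (pure Node, no Arrow dependency; the sizes are estimates, not a real benchmark of either loader):

```typescript
// Approximate CSV size for a numeric column: decimal digits plus a separator
// per value, versus a fixed 8 bytes per value in a binary double column.
function csvBytes(values: Float64Array): number {
  let n = 0;
  for (const v of values) n += String(v).length + 1;
  return n;
}

const values = new Float64Array(1_000_000);
for (let i = 0; i < values.length; i++) values[i] = Math.random();

const binary = values.byteLength; // 8 bytes per double
const text = csvBytes(values);    // typically ~18-19 bytes per double
console.log({ binary, text, ratio: text / binary });
```

On top of the size difference, the text path also pays for number-to-string formatting on write and parsing on read, which is part of why a CSV-over-stdout loader can feel slow for large result sets.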