Commit 439ce62

Documentation fixes for integration (#169)

1 parent 81fa058 commit 439ce62

2 files changed: +17 additions, -12 deletions

docs/integrations/databricks/api_key.md (5 additions, 5 deletions)

````diff
@@ -32,11 +32,11 @@ In Hopsworks, click on your *username* in the top-right corner and select *Setti
 ```python hl_lines="6"
 import hsfs
 conn = hsfs.connection(
-    'my_instance',                    # DNS of your Feature Store instance
-    443,                              # Port to reach your Hopsworks instance, defaults to 443
-    'my_project',                     # Name of your Hopsworks Feature Store project
-    api_key_file='featurestore.key',  # The file containing the API key generated above
-    hostname_verification=True)       # Disable for self-signed certificates
+    host='my_instance',               # DNS of your Feature Store instance
+    port=443,                         # Port to reach your Hopsworks instance, defaults to 443
+    project='my_project',             # Name of your Hopsworks Feature Store project
+    api_key_value='apikey',           # The API key to authenticate with Hopsworks
+    hostname_verification=True        # Disable for self-signed certificates
 )
 fs = conn.get_feature_store()         # Get the project's default feature store
 ```
````
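The hunk above moves the example from positional arguments with `api_key_file` to keyword arguments with `api_key_value`. A minimal sketch of adapting code that still keeps the key on disk, assuming a local `featurestore.key` file (the `connection_kwargs` helper and all names are hypothetical, not part of the hsfs API; `hsfs` itself is only shown commented out):

```python
from pathlib import Path

def connection_kwargs(key_path, host="my_instance", project="my_project"):
    """Build keyword arguments for a new-style hsfs.connection() call.

    Reads the key that the old `api_key_file` parameter pointed at and
    passes its contents through `api_key_value` instead.
    """
    api_key = Path(key_path).read_text().strip()
    return {
        "host": host,                   # DNS of the Feature Store instance
        "port": 443,                    # defaults to 443
        "project": project,             # Hopsworks project name
        "api_key_value": api_key,       # the key itself, not a file path
        "hostname_verification": True,  # disable for self-signed certificates
    }

# Usage (requires a reachable Hopsworks instance; names are placeholders):
# import hsfs
# conn = hsfs.connection(**connection_kwargs("featurestore.key"))
# fs = conn.get_feature_store()
```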

docs/integrations/spark.md (12 additions, 7 deletions)

````diff
@@ -31,13 +31,18 @@ Add the following configuration to the Spark application:
 spark.hadoop.fs.hopsfs.impl io.hops.hopsfs.client.HopsFileSystem
 spark.hadoop.hops.ipc.server.ssl.enabled true
 spark.hadoop.hops.ssl.hostname.verifier ALLOW_ALL
-spark.hadoop.hops.rpc.socket.factory.class.default io.hop.hadoop.shaded.org.apache.hadoop.net.HopsSSLSocketFactory");
+spark.hadoop.hops.rpc.socket.factory.class.default io.hop.hadoop.shaded.org.apache.hadoop.net.HopsSSLSocketFactory"
 spark.hadoop.client.rpc.ssl.enabled.protocol TLSv1.2
 spark.hadoop.hops.ssl.keystores.passwd.name material_passwd
 spark.hadoop.hops.ssl.keystore.name keyStore.jks
 spark.hadoop.hops.ssl.truststore.name trustStore.jks
+
+spark.sql.hive.metastore.jars [Path to the Hopsworks Hive Jars]
+spark.hadoop.hive.metastore.uris thrift://[metastore_ip]:[metastore_port]
 ```
 
+`spark.sql.hive.metastore.jars` should point to the path with the Hive Jars, which can be found in *clients.tar.gz*.
+
 ## PySpark
 
 To use PySpark, install the HSFS Python library which can be found on [PyPi](https://pypi.org/project/hsfs/).
````
````diff
@@ -76,16 +81,16 @@ In Hopsworks, click on your *username* in the top-right corner and select *Setti
 
 ## Connecting to the Feature Store
 
-You are now ready to connect to the Hopsworks Feature Store from SageMaker:
+You are now ready to connect to the Hopsworks Feature Store from Spark:
 
 ```python
 import hsfs
 conn = hsfs.connection(
-    'my_instance',                    # DNS of your Feature Store instance
-    443,                              # Port to reach your Hopsworks instance, defaults to 443
-    'my_project',                     # Name of your Hopsworks Feature Store project
-    api_key_file='featurestore.key',  # The file containing the API key generated above
-    hostname_verification=True)       # Disable for self-signed certificates
+    host='my_instance',               # DNS of your Feature Store instance
+    port=443,                         # Port to reach your Hopsworks instance, defaults to 443
+    project='my_project',             # Name of your Hopsworks Feature Store project
+    api_key_value='api_key',          # The API key to authenticate with the feature store
+    hostname_verification=True        # Disable for self-signed certificates
 )
 fs = conn.get_feature_store()         # Get the project's default feature store
 ```
````
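The commit's first hunk adds two metastore properties to the Spark configuration block. As a minimal sketch of how those properties fit together, the helper below renders the full set as `spark-defaults.conf`-style lines; the `spark_defaults` function, the jar path, and the metastore address are illustrative placeholders, not values from the Hopsworks docs (the bracketed `[Path to the Hopsworks Hive Jars]` and `[metastore_ip]:[metastore_port]` remain operator-supplied):

```python
def spark_defaults(metastore_jars, metastore_ip, metastore_port):
    """Render the Hopsworks client Spark properties as spark-defaults.conf lines."""
    props = {
        "spark.hadoop.fs.hopsfs.impl": "io.hops.hopsfs.client.HopsFileSystem",
        "spark.hadoop.hops.ipc.server.ssl.enabled": "true",
        "spark.hadoop.hops.ssl.hostname.verifier": "ALLOW_ALL",
        "spark.hadoop.hops.rpc.socket.factory.class.default":
            "io.hop.hadoop.shaded.org.apache.hadoop.net.HopsSSLSocketFactory",
        "spark.hadoop.client.rpc.ssl.enabled.protocol": "TLSv1.2",
        "spark.hadoop.hops.ssl.keystores.passwd.name": "material_passwd",
        "spark.hadoop.hops.ssl.keystore.name": "keyStore.jks",
        "spark.hadoop.hops.ssl.truststore.name": "trustStore.jks",
        # The two properties this commit adds for the Hive metastore:
        "spark.sql.hive.metastore.jars": metastore_jars,
        "spark.hadoop.hive.metastore.uris":
            f"thrift://{metastore_ip}:{metastore_port}",
    }
    return "\n".join(f"{key} {value}" for key, value in props.items())

# Hypothetical jar path and metastore address, for illustration only:
print(spark_defaults("/srv/hops/hive/lib/*", "10.0.0.1", 9083))
```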
