Commit a800ddf

Update comment
1 parent 883538e commit a800ddf

1 file changed (+4, −8 lines)

python/ibmos2spark/osconfig.py

Lines changed: 4 additions & 8 deletions
```diff
@@ -190,10 +190,10 @@ def __init__(self, sparkcontext, credentials, configuration_name='', bucket_name
             in DSX - Notebooks by clicking on the datasources palette then
             choose the datasource you want to access then hit insert credentials.

-            cos_id [optional]: this parameter is the cloud object storage unique id. It is useful
-                to keep in the class instance for further checks after the initialization. However,
-                it is not mandatory for the class instance to work. This value can be retrieved by
-                calling the get_os_id function.
+            configuration_name [optional]: a string that identifies this configuration. You can
+                use any string you like. This allows you to create
+                multiple configurations for different Object Storage accounts.
+                If a configuration name is not passed, the default name "service" is used.

             bucket_name (projectId in DSX) [optional]: string that identifies the default
                 bucket name you want to access files from in the COS service instance.
@@ -202,10 +202,6 @@ def __init__(self, sparkcontext, credentials, configuration_name='', bucket_name
             If this value is not specified, you need to pass it when
             you use the url function.

-            NOTE: Hadoop configuration will be set with a service name equal to
-                the value of cos_id. If cos_id is not set, the default service name "service"
-                will be used.
-
         '''
         self.bucket_name = bucket_name
         self.conf_name = configuration_name
```
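The docstring change above documents that `configuration_name` scopes a set of Object Storage settings and defaults to `"service"` when omitted. As a rough illustration only (this is a hypothetical sketch, not the library's actual code), such a name typically becomes the service segment of Stocator-style Hadoop configuration keys; the helper and key format below are assumptions for demonstration:

```python
DEFAULT_SERVICE = "service"  # default used when no configuration name is passed

def hadoop_key(prop, configuration_name=""):
    """Hypothetical helper: build a service-scoped Hadoop config key,
    falling back to the default service name "service" when no
    configuration name is given."""
    service = configuration_name or DEFAULT_SERVICE
    return "fs.cos.{}.{}".format(service, prop)

# With no configuration name, keys fall under the "service" default:
print(hadoop_key("endpoint"))            # fs.cos.service.endpoint

# A custom name keeps two COS accounts' settings separate:
print(hadoop_key("access.key", "prod"))  # fs.cos.prod.access.key
```

Keeping each account's settings under a distinct name is what allows multiple configurations to coexist in the same Spark context, as the updated docstring describes.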
