- DataFS version: 0.7.0
- Python version: Python 3.4.5 :: Anaconda 4.2.0 (64-bit)
- Operating System: Ubuntu 14.04.4 LTS (GNU/Linux 3.13.0-85-generic x86_64)
Description
We can get inconsistent behavior when using the command line vs. the Python API. Specifically, environment variables (e.g. DATAFS_DEFAULT_PROFILE, DATAFS_CONFIG_FILE, and DATAFS_REQUIREMENTS_FILE) have no effect on the Python API, whereas they do in the CLI.
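What I'd expect is for datafs.get_api() to fall back to these variables the same way the CLI does. A rough sketch of that fallback is below; the keyword names profile, config_file, and requirements are my assumptions inferred from the variable names, and only the positional profile argument is confirmed by the example that follows:

import os

def get_api(profile=None, config_file=None, requirements=None):
    # Sketch of the expected fallback: mirror the CLI and consult the
    # same environment variables when no explicit arguments are given.
    if profile is None:
        profile = os.environ.get('DATAFS_DEFAULT_PROFILE')
    if config_file is None:
        config_file = os.environ.get('DATAFS_CONFIG_FILE')
    if requirements is None:
        requirements = os.environ.get('DATAFS_REQUIREMENTS_FILE')
    # ... then continue with the existing configuration loading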
Example
I've set up a profile called test, in which I have a bunch of repos. The following is a valid search in the test profile, but not in my default profile (impactlab):
In the CLI:
$ datafs search project5 variable3 scenario1.nc team1 task3 # no response
$ datafs --profile test search project5 variable3 scenario1.nc team1 task3
team1_project5_task3_variable3_scenario1.nc
$ export DATAFS_DEFAULT_PROFILE=test
$ datafs search project5 variable3 scenario1.nc team1 task3 # use new default
team1_project5_task3_variable3_scenario1.nc
In the Python API:
In the same shell session (with the environment variable still set to 'test'), we don't get the correct default profile:
>>> import os, datafs
>>> os.environ['DATAFS_DEFAULT_PROFILE']
'test'
>>> api = datafs.get_api() # get default api (should be test)
>>> list(api.search(
... 'project5', 'variable3', 'scenario1.nc', 'team1', 'task3')) # conduct same search
[]
>>> api2 = datafs.get_api('test') # force use of 'test'
>>> list(api2.search(
... 'project5', 'variable3', 'scenario1.nc', 'team1', 'task3')) # conduct same search
[u'team1_project5_task3_variable3_scenario1.nc']
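For now, the workaround we're using is to read the environment variable ourselves and pass it to get_api explicitly. This is user-side code, not a change to DataFS:

import os
import datafs

# Pass the profile positionally, as in the api2 example above;
# fall back to DataFS's own default if the variable is unset.
profile = os.environ.get('DATAFS_DEFAULT_PROFILE')
api = datafs.get_api(profile) if profile else datafs.get_api()

results = list(api.search(
    'project5', 'variable3', 'scenario1.nc', 'team1', 'task3'))
# with DATAFS_DEFAULT_PROFILE=test this matches the api2 result above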