@@ -281,10 +281,6 @@ the data such as the files, their urls, sizes etc:
 .. code-block:: python
 .. doctest-remote-data::
     >>> link_list = Alma.get_data_info(uids)
-    >>> link_list['content_length'].sum()
-    538298369462
-    >>> len(link_list)
-    >>> 47
 
 By default, ALMA data is delivered as tarball files. However, the content of
 some of these files can be listed and accessed individually. To get information
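``get_data_info`` returns a table of per-file records whose sizes can be summed and counted, as the removed doctest lines did. A minimal sketch of that bookkeeping with plain-Python stand-in records (the URLs and sizes are invented for illustration; no ALMA query is made):

```python
# Hypothetical stand-in records shaped like the table returned by
# Alma.get_data_info: each entry has a URL and a content_length in bytes.
link_list = [
    {"access_url": "https://example.org/uid_A001_x_1.tar", "content_length": 300},
    {"access_url": "https://example.org/uid_A001_x_2.tar", "content_length": 200},
]

# Total download volume and number of files, mirroring the doctest above.
total_bytes = sum(row["content_length"] for row in link_list)
n_files = len(link_list)
print(total_bytes, n_files)  # -> 500 2
```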
@@ -293,8 +289,6 @@ on the individual files:
 .. code-block:: python
 .. doctest-remote-data::
     >>> link_list = Alma.get_data_info(uids, expand_tarfiles=True)
-    >>> len(link_list)
-    >>> 50
 
 You can then go on to download that data. The download will be cached so that repeat
 queries of the same file will not re-download the data. The default cache
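The caching behaviour described above (a repeat query of the same file does not re-download it) can be sketched without any network access. ``download_files`` and ``fake_fetch`` below are hypothetical stand-ins for illustration, not the astroquery API:

```python
import os
import tempfile

def download_files(urls, cache_location, fetch):
    """Sketch of cached downloading: a URL whose target file already
    exists under cache_location is not fetched again."""
    paths = []
    for url in urls:
        path = os.path.join(cache_location, os.path.basename(url))
        if not os.path.exists(path):      # cache miss: fetch and store
            with open(path, "wb") as fh:
                fh.write(fetch(url))
        paths.append(path)                # cache hit: reuse the local copy
    return paths

calls = []
def fake_fetch(url):
    calls.append(url)                     # record every real "download"
    return b"data"

with tempfile.TemporaryDirectory() as cache:
    urls = ["https://example.org/a.tar"]
    download_files(urls, cache, fake_fetch)  # first pass downloads
    download_files(urls, cache, fake_fetch)  # repeat pass hits the cache
    print(len(calls))  # -> 1
```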
@@ -303,15 +297,15 @@ changing the ``cache_location`` variable:
 
 .. code-block:: python
 .. doctest-remote-data::
-    >>> myAlma = Alma()
-    >>> myAlma.cache_location = '/big/external/drive/'
-    >>> myAlma.download_files(link_list, cache=True)
+    >>> myAlma = Alma()  # doctest: +SKIP
+    >>> myAlma.cache_location = '/big/external/drive/'  # doctest: +SKIP
+    >>> myAlma.download_files(link_list, cache=True)  # doctest: +SKIP
 
 You can also do the downloading all in one step:
 
 .. code-block:: python
 .. doctest-remote-data::
-    >>> myAlma.retrieve_data_from_uid(uids[0])
+    >>> myAlma.retrieve_data_from_uid(uids[0])  # doctest: +SKIP
 
 If you have huge files, sometimes the transfer fails, so you will need to
 restart the download. By default, the module will resume downloading where the
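The resume-on-restart behaviour mentioned in the last paragraph can be sketched in isolation: keep the partial file and continue from the byte offset already received, much as an HTTP ``Range`` request would. ``resume_download`` is a hypothetical helper for illustration, not astroquery's implementation:

```python
def resume_download(source: bytes, partial: bytearray, fail_after=None):
    """Sketch of restartable downloading: the transfer continues from the
    number of bytes already received, appending only the missing tail."""
    start = len(partial)                  # resume point: bytes already saved
    end = len(source) if fail_after is None else min(len(source), start + fail_after)
    partial.extend(source[start:end])
    return partial

remote = b"0123456789"
local = bytearray()
resume_download(remote, local, fail_after=4)  # transfer dies after 4 bytes
resume_download(remote, local)                # restart picks up at byte 4
print(bytes(local))  # -> b'0123456789'
```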