@@ -43,7 +43,7 @@ TAP provides two operation modes:
Gaia TAP+ server provides two access modes:
* Public: this is the standard TAP access. A user can execute ADQL queries and
- upload tables to be used in a query 'on-the-fly' (these tables will be removed
+ upload votables to be used in a query 'on-the-fly' (these tables will be removed
once the query is executed). The results are available to any other user and
they will remain in the server for a limited time.
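As a minimal sketch of the public mode (the query and column names are illustrative; the `Gaia.launch_job` call needs network access to the Gaia TAP+ server, so it is shown commented out):

```python
# Sketch of a public-mode synchronous ADQL query; the actual submission
# requires network access, so the astroquery calls are commented out.
adql = (
    "SELECT TOP 5 source_id, ra, dec "
    "FROM gaiadr3.gaia_source "
    "ORDER BY source_id"
)
# from astroquery.gaia import Gaia
# job = Gaia.launch_job(adql)      # synchronous (public) query
# results = job.get_results()
print(adql)
```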
@@ -76,8 +76,13 @@ Examples
This query searches for all the objects contained in an arbitrary rectangular projection of the sky.
+ WARNING: This method implements the ADQL BOX function that is deprecated in the latest version of the standard
+ (ADQL 2.1, see: https://ivoa.net/documents/ADQL/20231107/PR-ADQL-2.1-20231107.html#tth_sEc4.2.9).
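For illustration, a rectangular region can instead be expressed with the non-deprecated ADQL `POLYGON` function over the four corners. The helper below is a hypothetical sketch, not part of astroquery, and computes the corners naively (no cos(dec) correction of the RA width):

```python
# Hypothetical helper: rewrite a deprecated BOX(ra, dec, width, height)
# constraint as an ADQL POLYGON over the four corner points.
# Note: corners are computed on a flat grid (no spherical correction).
def box_to_polygon(ra, dec, width, height):
    hw, hh = width / 2.0, height / 2.0
    corners = [(ra - hw, dec - hh), (ra + hw, dec - hh),
               (ra + hw, dec + hh), (ra - hw, dec + hh)]
    coords = ", ".join(f"{x}, {y}" for x, y in corners)
    return f"POLYGON('ICRS', {coords})"

print(box_to_polygon(266.41683, -29.00781, 0.1, 0.1))
```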
+
It is possible to choose which data release to query, by default the Gaia DR3 catalogue is used. For example::
+ .. doctest-remote-data::
+
>>> from astroquery.gaia import Gaia
>>> Gaia.MAIN_GAIA_TABLE = "gaiadr2.gaia_source"  # Select Data Release 2
>>> Gaia.MAIN_GAIA_TABLE = "gaiadr3.gaia_source"  # Reselect Data Release 3, default
@@ -199,7 +204,7 @@ To load only table names metadata (TAP+ capability):
INFO: Retrieving tables... [astroquery.utils.tap.core]
INFO: Parsing tables... [astroquery.utils.tap.core]
INFO: Done. [astroquery.utils.tap.core]
- >>> for table in (tables):
+ >>> for table in tables:
... print(table.get_qualified_name())
external.external.apassdr9
external.external.catwise2020
@@ -361,11 +366,10 @@ Note: you can inspect the status of the job by typing:
1.5. Synchronous query on an 'on-the-fly' uploaded table
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- A table can be uploaded to the server in order to be used in a query.
+ A votable can be uploaded to the server in order to be used in a query.
You have to provide the local path to the file you want to upload. In the following example,
- the file 'my_table.xml' is located to the relative location where your python program is
- running. See note below.
+ the file 'my_table.xml' is located relative to where your Python program is running. See note below.
.. doctest-skip::
@@ -563,7 +567,8 @@ Your schema name will be automatically added to the provided table name::
Job '1539932326689O' created to upload table 'table_test_from_url'.
Now, you can query your table as follows (a full qualified table name must be provided,
- i.e.: *user_<your_login_name>.<table_name>*)::
+ i.e.: *user_<your_login_name>.<table_name>*. Note that if the <table_name> contains capital letters, it must be
+ surrounded by quotation marks, i.e.: *user_<your_login_name>."<table_name>"*)::
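As a sketch, the quoting rule can be captured by a small helper (hypothetical, not part of astroquery):

```python
# Hypothetical helper illustrating the quoting rule: the table part of a
# fully qualified name must be double-quoted when it contains capital letters.
def qualify(login_name, table_name):
    if table_name != table_name.lower():
        table_name = f'"{table_name}"'
    return f"user_{login_name}.{table_name}"

print(qualify("jdoe", "table_test_from_url"))  # user_jdoe.table_test_from_url
print(qualify("jdoe", "t1710251325268O"))      # user_jdoe."t1710251325268O"
```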
>>> full_qualified_table_name = 'user_<your_login_name>.table_test_from_url'
>>> query = 'select * from ' + full_qualified_table_name
@@ -590,7 +595,8 @@ Your schema name will be automatically added to the provided table name.
Uploaded table 'table_test_from_file'.
Now, you can query your table as follows (a full qualified table name must be provided,
- i.e.: *user_<your_login_name>.<table_name>*)::
+ i.e.: *user_<your_login_name>.<table_name>*. Note that if the <table_name> contains capital letters, it must be
+ surrounded by quotation marks, i.e.: *user_<your_login_name>."<table_name>"*)::
>>> full_qualified_table_name = 'user_<your_login_name>.table_test_from_file'
>>> query = 'select * from ' + full_qualified_table_name
@@ -617,7 +623,8 @@ Your schema name will be automatically added to the provided table name.
Now, you can query your table as follows (a full qualified table name must be provided,
- i.e.: *user_<your_login_name>.<table_name>*)::
+ i.e.: *user_<your_login_name>.<table_name>*. Note that if the <table_name> contains capital letters, it must be
+ surrounded by quotation marks, i.e.: *user_<your_login_name>."<table_name>"*)::
>>> full_qualified_table_name = 'user_<your_login_name>.table_test_from_astropy'
>>> query = 'select * from ' + full_qualified_table_name
@@ -636,13 +643,14 @@ table named: user_<your_login_name>.'t'<job_id>::
>>> from astroquery.gaia import Gaia
>>> Gaia.login()
>>> j1 = Gaia.launch_job_async("select top 10 * from gaiadr3.gaia_source")
- >>> job = Gaia.upload_table_from_job(j1)
+ >>> Gaia.upload_table_from_job(job=j1)
Created table 't1539932994481O' from job: '1539932994481O'.
Now, you can query your table as follows (a full qualified table name must be provided,
- i.e.: *user_<your_login_name>.t<job_id>*)::
+ i.e.: *user_<your_login_name>."t<job_id>"*. Note that the previous table name must be
+ surrounded by quotation marks since it contains capital letters.)::

- >>> full_qualified_table_name = 'user_<your_login_name>.t1539932994481O'
+ >>> full_qualified_table_name = 'user_<your_login_name>."t1710251325268O"'
>>> query = 'select * from ' + full_qualified_table_name
>>> job = Gaia.launch_job(query=query)
>>> results = job.get_results()
@@ -750,12 +758,11 @@ The following example uploads a table and then, the table is used in a cross mat
Once you have your cross match finished, you can obtain the results::
+
>>> xmatch_table = 'user_<your_login_name>.' + xmatch_table_name
- >>> query = ('SELECT c."dist"*3600 as dist, a.*, b.* FROM gaiadr3.gaia_source AS a, '
- ... 'full_qualified_table_name+' AS b, '
- ... 'xmatch_table+' AS c '
- ... 'WHERE (c.gaia_source_source_id = a.source_id AND '
- ... 'c.my_sources_my_sources_oid = b.my_sources_oid)'
+ >>> query = (f"SELECT c.separation*3600 AS separation_arcsec, a.*, b.* FROM gaiadr3.gaia_source AS a, "
+ ... f"{full_qualified_table_name} AS b, {xmatch_table} AS c WHERE c.gaia_source_source_id = a.source_id AND "
+ ... f"c.my_sources_my_sources_oid = b.my_sources_oid")
>>> job = Gaia.launch_job(query=query)
>>> results = job.get_results()
@@ -855,16 +862,20 @@ The following example shows how to retrieve the DataLink products associated wit
>>> data_release = 'Gaia DR3'  # Options are: 'Gaia DR3' (default), 'Gaia DR2'
>>> datalink = Gaia.load_data(ids=[2263166706630078848, 2263178457660566784, 2268372099615724288],
... data_release=data_release, retrieval_type=retrieval_type, data_structure=data_structure)
- >>> datalink.keys()
- dict_keys(['MCMC_GSPPHOT_COMBINED.xml', 'EPOCH_PHOTOMETRY_COMBINED.xml', 'RVS_COMBINED.xml', 'MCMC_MSC_COMBINED.xml', 'XP_SAMPLED_COMBINED.xml', 'XP_CONTINUOUS_COMBINED.xml'])
+ The DataLink products are stored in a Python dictionary. Each of its elements (keys) contains a one-element list that can be extracted as follows:
+
+ .. code-block:: python
+
+ >>> dl_keys = [inp for inp in datalink.keys()]
+ >>> dl_keys.sort()
+ >>> print(f'The following Datalink products have been downloaded:')
+ >>> for dl_key in dl_keys:
+ ... print(f' * {dl_key}')
.. Note::
- It is not possible to search for and retrieve the DataLink products
- associated to more than 5000 sources in one and the same call.
- However, it is possible to overcome this limit programmatically using a
- sequential download, as explained in this tutorial_.
+ It is not possible to search for and retrieve the DataLink products associated with more than 5000 sources in a single call.
+ However, it is possible to overcome this limit programmatically using a sequential download, as explained in this tutorial_.
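A minimal sketch of such a sequential download: the batching is plain Python, while the `Gaia.load_data` call itself needs network access and is shown commented out (the `retrieval_type` value is illustrative):

```python
# Work around the 5000-source limit by splitting the id list into batches
# and issuing one DataLink request per batch (sequential download sketch).
def batched(ids, size=5000):
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

source_ids = list(range(12001))  # stand-in for real Gaia source_ids
batches = list(batched(source_ids))
print([len(b) for b in batches])  # [5000, 5000, 2001]
# from astroquery.gaia import Gaia
# for batch in batches:
#     datalink = Gaia.load_data(ids=batch, data_release='Gaia DR3',
#                               retrieval_type='EPOCH_PHOTOMETRY')
```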
.. _tutorial: https://www.cosmos.esa.int/web/gaia-users/archive/datalink-products#datalink_jntb_get_above_lim
.. _DataLink: https://www.ivoa.net/documents/DataLink/