
Commit ad576ea

Merge pull request #384 from aperture-data/release-0.4.19
Release 0.4.19

2 parents: ff75c34 + 7c848ea


42 files changed: +434 −235 lines

aperturedb/BBoxDataCSV.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -15,7 +15,7 @@ class BBoxDataCSV(CSVParser.CSVParser):
     **ApertureDB BBox Data.**

     This class loads the Bounding Box Data which is present in a CSV file,
-    and converts it into a series of aperturedb queries.
+    and converts it into a series of ApertureDB queries.

     :::note Is backed by a CSV file with the following columns:
     ``IMG_KEY``, ``x_pos``, ``y_pos``, ``width``, ``height``, ``BBOX_PROP_NAME_1``, ... ``BBOX_PROP_NAME_N``, ``constraint_BBOX_PROP_NAME_1``
```
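The column layout in that docstring can be illustrated with a small CSV. This is a hedged sketch: the image key header `img_id`, the property name `bbox_label`, and the values are invented; only the column order comes from the docstring above.

```python
import csv
import io

# Hypothetical bounding-box CSV following the documented column order:
# image key, x_pos, y_pos, width, height, then properties and constraint_ columns.
rows = [
    ["img_id", "x_pos", "y_pos", "width", "height", "bbox_label", "constraint_bbox_label"],
    ["image-001", "10", "20", "64", "48", "person", "person"],
]
buf = io.StringIO()
csv.writer(buf).writerows(rows)

# A loader like BBoxDataCSV would read each row and emit one query per box;
# here we just parse the geometry columns back out as integers.
parsed = list(csv.DictReader(io.StringIO(buf.getvalue())))
box = {k: int(parsed[0][k]) for k in ("x_pos", "y_pos", "width", "height")}
```

The `constraint_` column mirrors a property column, which is how the loader avoids inserting duplicate boxes.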

aperturedb/BlobDataCSV.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -13,7 +13,7 @@ class BlobDataCSV(CSVParser.CSVParser):
     """**ApertureDB Blob Data.**

     This class loads the Blob Data which is present in a CSV file,
-    and converts it into a series of aperturedb queries.
+    and converts it into a series of ApertureDB queries.

     :::note Is backed by a CSV file with the following columns:
     ``FILENAME``, ``PROP_NAME_1``, ... ``PROP_NAME_N``, ``constraint_PROP_NAME_1``
```

aperturedb/BlobNewestDataCSV.py

Lines changed: 5 additions & 5 deletions

```diff
@@ -15,7 +15,7 @@ class BlobNewestDataCSV(CSVParser.CSVParser):
     Update an Entity which has an associated blob to the data in the CSV
     What this means is:
     - If it doesn't exist, add it.
-    - If it exsits and the blob hasn't changed, update it.
+    - If it exists and the blob hasn't changed, update it.
     - If it exists and the blob has changed, delete and re-add it.

     This means if these elements are part of a graph where they are linked by connections
@@ -29,15 +29,15 @@ class BlobNewestDataCSV(CSVParser.CSVParser):
     This class utilizes 3 conditionals
     - normal constraint_ to select the element
     - a series of updateif_ to determine if an update is necessary
-    - one or more prop_ and the assocaited updateif blob conditonals
+    - one or more prop_ and the associated updateif blob conditionals
     to determine if a update or an delete/add is appropriate

     Generated fields
     Format is: gen_<type>_name

     type:
     - blobsha1 - the sha1 for the blob is calculated
-    - blobsize - the length in bytes of the blob is calcualte
+    - blobsize - the length in bytes of the blob is calculate
     - insertdate - ISO Format of date ( this will always change! )

     the result is then used to identify if a blob has changed.
@@ -169,7 +169,7 @@ def filter_generated_constraints(self, return_generated=False):
                 filtered.append(key)
         return filtered

-    # creat generated constraints for specific index
+    # create generated constraints for specific index
     # match controls if constrain will be a positive selection (True) or negative(False)
     def create_generated_constraints(self, idx, match=True):
         constraints = {}
@@ -218,7 +218,7 @@ def getitem(self, idx):
         # process is; add if not existing ( # Pt 1 )
         # if existing
         # if blob checks pass ( or there are none ) - update metadata ( Pt 2 )
-        # if blobk check FAILS
+        # if blob check FAILS
         # delete ( Pt 3 )
         # re-add ( Pt 4 )
```
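The add / update / delete-and-re-add decision described in that docstring can be sketched with the two generated fields it names, `blobsha1` and `blobsize`. This is an illustrative stand-in, not the class's actual implementation; the function names `generated_fields` and `plan_action` are invented.

```python
import hashlib
from typing import Optional

def generated_fields(blob: bytes) -> dict:
    # Sketch of the gen_<type>_name idea: derive comparable fields from the blob.
    return {
        "blobsha1": hashlib.sha1(blob).hexdigest(),  # changes when content changes
        "blobsize": len(blob),                       # length in bytes
    }

def plan_action(existing: Optional[dict], blob: bytes) -> str:
    # Decision table from the docstring:
    # - not present            -> add
    # - present, blob unchanged -> update metadata only
    # - present, blob changed   -> delete and re-add
    if existing is None:
        return "add"
    if generated_fields(blob) == {k: existing[k] for k in ("blobsha1", "blobsize")}:
        return "update"
    return "delete_and_add"
```

Comparing sha1 plus size, rather than the raw bytes, lets the stored fields stand in for a blob that is no longer in memory.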

aperturedb/CSVParser.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -30,11 +30,11 @@ class CSVParser(Subscriptable):
     - **Dask Mode**: This mode is used when the CSV file is too big to fit in memory, or multiprocessing is desired.
       It reads the CSV file into a Dask DataFrame.
       In Dask mode the CSV file is read in chunks, and the operations are performed on each chunk.
-      The tricky bit is that the chunck size is not known till the loader is created, so the processing happens when ingest is called.
+      The tricky bit is that the chunk size is not known till the loader is created, so the processing happens when ingest is called.
       So the Data CSV has another signature, where the df is passed explicitly.

     Typically, the response_handler is application specific, and loading does not break
-    on errors in response_handlers, so the default behaviour is to log the error and continue.
+    on errors in response_handlers, so the default behavior is to log the error and continue.
     If you want to break on errors, set strict_response_validation to True.
     """
```
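The deferred-processing point in that docstring can be sketched without Dask: when data arrives in chunks whose size is only fixed once the loader exists, each chunk is handed to the same per-chunk routine at ingest time. A minimal stand-in (the names `read_in_chunks` and `ingest` are illustrative, not CSVParser's API):

```python
def read_in_chunks(rows, chunk_size):
    # Lazily yield chunks; nothing is materialized until ingest iterates.
    for i in range(0, len(rows), chunk_size):
        yield rows[i:i + chunk_size]

def ingest(rows, chunk_size, process_chunk):
    # Processing happens here, once the chunk size is finally known,
    # mirroring "the processing happens when ingest is called".
    processed = 0
    for chunk in read_in_chunks(rows, chunk_size):
        process_chunk(chunk)
        processed += len(chunk)
    return processed
```

In the real Dask mode the chunk would be a DataFrame partition, which is why the Data CSV classes have a second signature taking a df explicitly.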

aperturedb/Configuration.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@
 @dataclass(repr=False)
 class Configuration:
     """
-    **Configuration object for aperturedb sdk to be able to connect to ApertureDB**
+    **Configuration object for ApertureDB sdk to be able to connect to ApertureDB**
     """
     host: str
     port: int
```

aperturedb/ConnectionDataCSV.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -28,9 +28,9 @@ class ConnectionDataCSV(CSVParser):

     **ConnectionClass**: Arbitrary class name for the entity this would be saved as.

-    **``ClassName``@``PropertyName``**: This is a special combination of Class Name and Property Name that can uniquely identify an entity. ‘@’ is a delimeter, so should not be used in a property name.
+    **``ClassName``@``PropertyName``**: This is a special combination of Class Name and Property Name that can uniquely identify an entity. ‘@’ is a delimiter, so should not be used in a property name.

-    **PROP_NAME_1 .. PROP_NAME_N**: Arbitraty property names.
+    **PROP_NAME_1 .. PROP_NAME_N**: Arbitrary property names.

     **constraint_PROP_NAME_1**: A equality check against a unique property to ensure duplicates are not inserted.
```
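The `ClassName@PropertyName` convention above is why `@` is reserved: splitting on the first `@` recovers the two halves. A hedged sketch (the column values are illustrative, and `split_ref` is not ConnectionDataCSV's actual helper):

```python
def split_ref(column: str):
    # The '@' delimiter separates the class name from the identifying
    # property name; this is why '@' must not appear in a property name.
    class_name, prop_name = column.split("@", 1)
    return class_name, prop_name
```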

aperturedb/Connector.py

Lines changed: 3 additions & 3 deletions

```diff
@@ -70,15 +70,15 @@ def valid(self) -> bool:

         # This triggers refresh if the session is about to expire.
         if session_age > self.session_token_ttl - \
-                int(os.getenv("SESSION_EXPIRTY_OFFSET_SEC", 10)):
+                int(os.getenv("SESSION_EXPIRY_OFFSET_SEC", 10)):
             return False

         return True


 class Connector(object):
     """
-    **Class to facilitate connections with an instance of aperturedb**
+    **Class to facilitate connections with an instance of ApertureDB**

     It lets the client execute any JSON query based on the [ApertureDB query language specification](/query_language/Overview/API%20Description)

@@ -91,7 +91,7 @@ class Connector(object):
     bool (use_ssl): Use SSL to encrypt communication with the database.
     bool (use_keepalive): Set keepalive on the connection with the database.
         This has two benefits: It reduces the chance of disconnection for a long-running query,
-        and it means that diconnections are detected sooner.
+        and it means that disconnections are detected sooner.
         Turn this off to reduce traffic on high-cost network connections.
     Configuration (config): Configuration object to use for connection.
     """
```
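The expiry check in the first hunk, now reading the correctly spelled `SESSION_EXPIRY_OFFSET_SEC`, treats a session as invalid slightly before its real TTL so a refresh can happen in time. A standalone sketch of that shape (the function name and arguments are illustrative):

```python
import os
import time

def session_still_valid(created_at: float, ttl_sec: float) -> bool:
    # Refresh slightly before real expiry: the safety margin comes from
    # SESSION_EXPIRY_OFFSET_SEC (defaulting to 10 seconds, as in the diff).
    offset = int(os.getenv("SESSION_EXPIRY_OFFSET_SEC", 10))
    session_age = time.time() - created_at
    return session_age <= ttl_sec - offset
```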

aperturedb/ConnectorRest.py

Lines changed: 3 additions & 3 deletions

```diff
@@ -44,7 +44,7 @@


 class ConnectorRest(Connector):
     """
-    **Class to use aperturedb's REST interface**
+    **Class to use ApertureDB's REST interface**

     Args:
         str (host): Address of the host to connect to.
@@ -143,9 +143,9 @@ def _query(self, query, blob_array = [], try_resume=True):

         if tries == self.config.retry_max_attempts:
             raise Exception(
-                f"Could not query apertureDB {self.config} using REST.")
+                f"Could not query ApertureDB {self.config} using REST.")
         return (self.last_response, response_blob_array)

     def _connect(self):
-        logger.info("Connecting to aperturedb using REST")
+        logger.info("Connecting to ApertureDB using REST")
         self.connected = True
```
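The `_query` hunk implies a bounded retry loop: attempts are counted against `retry_max_attempts`, and the exception fires only once the budget is exhausted. A hedged sketch of that shape, not ConnectorRest's actual code (`send` stands in for the HTTP call, and `None` stands in for a failed attempt):

```python
def query_with_retry(send, retry_max_attempts: int):
    # Try up to retry_max_attempts times; raise once the budget runs out,
    # mirroring the "if tries == self.config.retry_max_attempts" check.
    tries = 0
    while tries < retry_max_attempts:
        result = send()
        if result is not None:
            return result
        tries += 1
    raise Exception("Could not query ApertureDB using REST.")
```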

aperturedb/DaskManager.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -74,8 +74,8 @@ def process(df, host, port, use_ssl, session, connnector_type):
             count += 1
             metrics.times_arr.extend(loader.times_arr)
             metrics.error_counter += loader.error_counter
-            metrics.suceeded_queries += loader.get_suceeded_queries()
-            metrics.suceeded_commands += loader.get_suceeded_commands()
+            metrics.succeeded_queries += loader.get_succeeded_queries()
+            metrics.succeeded_commands += loader.get_succeeded_commands()

         return metrics
```
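The renamed fields in this hunk are aggregated across per-partition loaders: timings are concatenated while counters are summed. A guessed sketch of the metrics shape being merged (the `Metrics` class and `merge` function here are illustrative, not DaskManager's actual types):

```python
from dataclasses import dataclass, field

@dataclass
class Metrics:
    # Hypothetical shape of the per-run metrics merged in the diff above.
    times_arr: list = field(default_factory=list)
    error_counter: int = 0
    succeeded_queries: int = 0
    succeeded_commands: int = 0

def merge(total: Metrics, partial: Metrics) -> Metrics:
    # Timings concatenate; counters sum, as in the loop over loaders.
    total.times_arr.extend(partial.times_arr)
    total.error_counter += partial.error_counter
    total.succeeded_queries += partial.succeeded_queries
    total.succeeded_commands += partial.succeeded_commands
    return total
```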

aperturedb/DescriptorDataCSV.py

Lines changed: 4 additions & 4 deletions

```diff
@@ -18,7 +18,7 @@ class DescriptorDataCSV(CSVParser.CSVParser):
     **ApertureDB Descriptor Data.**

     This class loads the Descriptor Data which is present in a CSV file,
-    and converts it into a series of aperturedb queries.
+    and converts it into a series of ApertureDB queries.

     :::note Is backed by a CSV file with the following columns, and a NumPy array file "npz" for the descriptors:
     ``filename``, ``index``, ``set``, ``label``, ``PROP_NAME_1``, ... ``PROP_NAME_N``, ``constraint_PROP_NAME_N``
@@ -30,11 +30,11 @@ class DescriptorDataCSV(CSVParser.CSVParser):

     **set**: The search space to restrict the knn search queries to.

-    **label**: Arbitraty name given to the label associated with this descriptor.
+    **label**: Arbitrary name given to the label associated with this descriptor.

     **PROP_NAME_1 .. PROP_NAME_N**: Arbitrarily assigned properties to this descriptor.

-    **constraint_PROP_NAME_1**: A constraint to enusre uniqueness when inserting this descriptor.
+    **constraint_PROP_NAME_1**: A constraint to ensure uniqueness when inserting this descriptor.

     Example CSV file::

@@ -56,7 +56,7 @@ class DescriptorDataCSV(CSVParser.CSVParser):


     :::info
-    In the above example, the index uniqely identiifes the actual np array from the many arrays in the npz file
+    In the above example, the index uniquely identifies the actual np array from the many arrays in the npz file
     which is same for line 1 and line 2. The UUID and constraint_UUID ensure that a Descriptor is inserted only once in the DB.

     Association of an entity to a Descriptor can be specified by first ingesting other Objects, then Descriptors and finally by
```
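The `filename`/`index` pairing described in that docstring can be sketched without NumPy: the `.npz` file acts as a named collection of arrays, and `index` picks one descriptor out of it. Everything here is illustrative (the key `"embeddings"`, the 2-dimensional vectors, and `descriptor_for` are all made up):

```python
# Stand-in for an .npz file: a mapping from array name to rows of vectors.
# In the real loader, numpy.load(filename) would supply the arrays.
npz_like = {
    "embeddings": [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],
}

def descriptor_for(array_name: str, index: int):
    # The CSV's filename/index columns jointly select one descriptor vector;
    # this is why two CSV lines can share a filename yet differ by index.
    return npz_like[array_name][index]
```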
