Commit 120bb68

Add catalog_and_metadata.ipynb tutorial!

1 parent 9ce2aec commit 120bb68

File tree

4 files changed: +1027 -3 lines


awswrangler/aurora.py

Lines changed: 3 additions & 2 deletions

@@ -173,8 +173,6 @@ def load_table(dataframe: pd.DataFrame,
                              region=region)
             logger.debug(sql)
             cursor.execute(sql)
-            connection.commit()
-            logger.debug("Load committed.")
 
         if "mysql" in engine.lower():
             with connection.cursor() as cursor:
@@ -188,6 +186,9 @@ def load_table(dataframe: pd.DataFrame,
                 raise AuroraLoadError(
                     f"Missing files to load. {num_files_loaded} files counted. {num_files + 1} expected.")
 
+        connection.commit()
+        logger.debug("Load committed.")
+
     @staticmethod
     def _parse_path(path):
         path2 = path.replace("s3://", "")
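The change above defers `connection.commit()` until after the loaded-file count has been verified, so a failed verification no longer leaves a partial load committed. A minimal sketch of that ordering, using Python's built-in sqlite3 in place of an Aurora connection (the `loads` table, `load_and_verify` helper, and row values are illustrative, not from the source):

```python
import sqlite3


def load_and_verify(connection, rows, expected_count):
    """Insert rows, verify the count, and only then commit.

    If verification fails, the transaction is rolled back, so a
    partial load never becomes visible -- the ordering the moved
    commit() enforces in the diff above.
    """
    cursor = connection.cursor()
    cursor.executemany("INSERT INTO loads (value) VALUES (?)", rows)
    cursor.execute("SELECT COUNT(*) FROM loads")
    num_loaded = cursor.fetchone()[0]
    if num_loaded != expected_count:
        connection.rollback()
        raise RuntimeError(
            f"Missing rows to load. {num_loaded} rows counted. "
            f"{expected_count} expected.")
    connection.commit()  # commit only after the verification passes
    return num_loaded


connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE loads (value INTEGER)")
connection.commit()
loaded = load_and_verify(connection, [(1,), (2,), (3,)], expected_count=3)
```

Committing before the count check (as the old code did) would have made a short load permanent; with this ordering the rollback discards it instead.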
(350 KB file: diff not rendered)

tutorials/athena_nested.ipynb

Lines changed: 5 additions & 1 deletion

@@ -4,7 +4,11 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "# Athena with nested data types"
+   "# Athena with nested data types\n",
+   "\n",
+   "[AWS Data Wrangler](https://github.com/awslabs/aws-data-wrangler) inherits some deeply nested data type limitations from [Apache Arrow](https://arrow.apache.org/). A good alternative is to rely on Athena to unnest these complex data types before loading them into a Pandas DataFrame.\n",
+   "\n",
+   "This tutorial demonstrates some useful features for that purpose."
  ]
 },
 {
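The tutorial cell added above refers to unnesting nested data in Athena, which is typically done with Presto's `CROSS JOIN UNNEST` before the result reaches pandas. A rough pure-Python sketch of what that unnesting does to rows (the `unnest` helper, `orders` data, and column names are illustrative assumptions, not part of the tutorial):

```python
def unnest(rows, list_field):
    """Flatten a list-valued field into one output row per element,
    mimicking what Athena's CROSS JOIN UNNEST(...) does server-side."""
    out = []
    for row in rows:
        for element in row[list_field]:
            # copy the scalar fields, replace the list with one element
            flat = {k: v for k, v in row.items() if k != list_field}
            flat[list_field] = element
            out.append(flat)
    return out


orders = [
    {"order_id": 1, "items": ["pen", "book"]},
    {"order_id": 2, "items": ["mug"]},
]
# Roughly equivalent Athena SQL (illustrative):
#   SELECT order_id, item
#   FROM orders CROSS JOIN UNNEST(items) AS t(item)
flat_rows = unnest(orders, "items")
```

After flattening, every field is a scalar, so the rows load into a pandas DataFrame without hitting Arrow's nested-type limitations.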

0 commit comments
