5 changes: 2 additions & 3 deletions bigframes/session/__init__.py
@@ -976,8 +976,7 @@ def read_pandas(
             quota and your data cannot be embedded in SQL due to size or
             data type limitations.
         * "bigquery_write":
-            [Preview] Use the BigQuery Storage Write API. This feature
-            is in public preview.
+            Use the BigQuery Storage Write API.
     Returns:
         An equivalent bigframes.pandas.(DataFrame/Series/Index) object
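Reviewer note: a minimal usage sketch of the `write_engine` argument documented above. It assumes `bigframes.pandas.read_pandas` accepts `write_engine` as in the current signature; the sample frame is illustrative only.

```python
import pandas as pd

import bigframes.pandas as bpd

# Tiny frames are embedded in SQL; with this change, frames above
# MAX_INLINE_BYTES default to the BigQuery Storage Write API.
pandas_df = pd.DataFrame({"id": [1, 2, 3], "label": ["a", "b", "c"]})

# Request the Storage Write API path explicitly:
bf_df = bpd.read_pandas(pandas_df, write_engine="bigquery_write")
```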

@@ -1026,7 +1025,7 @@ def _read_pandas(
     mem_usage = pandas_dataframe.memory_usage(deep=True).sum()
     if write_engine == "default":
         write_engine = (
-            "bigquery_load"

Collaborator:
I'm a bit worried about https://docs.cloud.google.com/bigquery/docs/sandbox users. Maybe we could provide a global option for the default so that such users can switch back to load jobs (accepting the limitations for data types).

Contributor (author):
Added a config option in the new version; see the sketch below.
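For context, a sketch of how such an escape hatch could look. The option name `default_write_engine` is hypothetical, not necessarily the API that was merged; since BigQuery sandbox projects cannot use the Storage Write API, the point is to let those users restore load jobs as the global default:

```python
import bigframes

# Hypothetical option name, for illustration only: sandbox users would
# switch the large-frame default back to load jobs, accepting the
# load-job data type limitations mentioned in the docstring.
bigframes.options.bigquery.default_write_engine = "bigquery_load"
```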

"bigquery_write"
if mem_usage > bigframes.constants.MAX_INLINE_BYTES
else "bigquery_inline"
)
2 changes: 1 addition & 1 deletion bigframes/session/bq_caching_executor.py
@@ -594,7 +594,7 @@ def _upload_local_data(self, local_table: local_data.ManagedArrowTable):
         # Might be better as a queue and a worker thread
         with self._upload_lock:
             if local_table not in self.cache._uploaded_local_data:
-                uploaded = self.loader.load_data(
+                uploaded = self.loader.write_data(
                     local_table, bigframes.core.guid.generate_guid()
                 )
                 self.cache.cache_remote_replacement(local_table, uploaded)
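Side note on the surrounding pattern, which this PR keeps: the membership check under `_upload_lock` makes this a lock-guarded cache, so each local table is uploaded at most once even with concurrent callers. A standalone sketch of the same idea, with hypothetical names:

```python
import threading
from typing import Callable, Dict, Hashable


class UploadCache:
    """Lock-guarded cache: each key is uploaded at most once."""

    def __init__(self, upload_fn: Callable[[Hashable], str]):
        self._upload_fn = upload_fn  # stands in for loader.write_data
        self._lock = threading.Lock()
        self._uploaded: Dict[Hashable, str] = {}

    def get_or_upload(self, key: Hashable) -> str:
        # Serializing uploads behind one lock is simple but blocks callers,
        # which is why the in-code comment suggests a queue + worker thread.
        with self._lock:
            if key not in self._uploaded:
                self._uploaded[key] = self._upload_fn(key)
            return self._uploaded[key]
```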