What happened?
I am trying to insert into two different Iceberg tables using Spark Connect, but as soon as I access a table using `get_schema`, `table`, or `insert`, I can't do it a second time.
The first insert works but the second one fails. If the first insert is commented out, the second one works. The same thing happens regardless of whether the `get_schema`, `table`, or `insert` method is used.
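The original reproduction code is not included here. A minimal sketch of the access pattern described might look like the following — the Spark Connect URL, catalog, database, and table names are all placeholders, and the exact ibis calls are an assumption based on the report, not the reporter's actual code:

```python
# Hypothetical reproduction sketch (names are placeholders, not from the report).
import ibis
from pyspark.sql import SparkSession

# Connect to a Spark Connect server (URL is a placeholder).
spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()
con = ibis.pyspark.connect(spark)

t = ibis.memtable({"id": [1, 2, 3]})

# The first table access works...
con.insert("table_one", t, database=("my_catalog", "my_db"))

# ...but a second access of any kind (insert, table, or get_schema)
# reportedly fails with:
#   [SCHEMA_NOT_FOUND] The schema `my_catalog`. cannot be found.
con.insert("table_two", t, database=("my_catalog", "my_db"))
```

The trailing dot in the reported error (`` `my_catalog`. ``) suggests the schema name ends up empty after the first access, with only the catalog part surviving.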
What version of ibis are you using?
ibis-framework[pyspark]==10.0.0.dev490
What backend(s) are you using, if any?
PySpark
Relevant log output
pyspark.errors.exceptions.connect.AnalysisException: [SCHEMA_NOT_FOUND] The schema `my_catalog`. cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS.
Code of Conduct
I agree to follow this project's Code of Conduct