06.15.2023 Release Notes
These are the release notes for June 15, 2023.
✨ ENHANCEMENTS ✨
- All environments (Gen1/Gen2)
  - Several enhancements are now available for Read Connectors for cloud blob storage services (Amazon S3, Azure Data Lake Storage, Google Cloud Storage):
    - ZIP files are now supported. When a ZIP file contains more than one file, you can use regular expressions to filter by file name within the ZIP file (see the sketch following this list).
    - For Custom Python Parsers, you can add an optional infer_schema() method, which gives you control over the types returned by the parser during schema generation (see the sketch following this list).
  - Performance of schema generation in the CData Read Connector has been optimized.
  - The Apache Iceberg library has been upgraded to 1.3.0.
- Gen2 environments
  - Several improvements have been implemented that help address cases where a component appears "stuck in ready":
    - Component metadata operation state is now shown in the User Interface (UI).
      - Ascend was performing backend metadata operations that were not visible to users in the UI, which made components look like they were "stuck in ready".
      - This enhancement adds new states to the UI (including "Waiting to Update", "Updating Metadata", "Waiting to Sweep", and "Sweeping"). These are now documented in our component state documentation.
    - An internal platform change separates processing job tasks from metadata operations.
- Snowflake Data Plane
  - The Warehouse Override option "Metadata Warehouse" has been renamed to "Interactive Warehouse" in the Snowflake Data Plane configuration.
    - This gives customers the option to separate some types of query traffic from data pipeline processing query traffic.
    - Query traffic routed to the Interactive Warehouse now includes:
      - queries that populate the records preview tab in components;
      - Data Plane usage metric gathering queries run in the backend;
      - Snowflake schema inference jobs.
    - User queries run through the Ascend Query tab are still sent to the "Default" warehouse.
    - Users who actively use the existing ingest_warehouse and metadata_warehouse settings for Data Share components (via the Ascend Python SDK) should change their code to use the new warehouse_usage_overrides field (see docs and the sketch following this list).
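The ZIP filtering described above is configured on the Read Connector itself. The sketch below only illustrates what a file-name regex filter does inside a ZIP archive, using Python's standard zipfile and re modules; the pattern and function names are hypothetical, not connector settings.

```python
import io
import re
import zipfile

# Illustration only: the regex filter itself is configured on the Read Connector.
# This shows the kind of file-name matching such a filter performs inside a ZIP.
MEMBER_PATTERN = re.compile(r"^exports/.*\.csv$")  # hypothetical pattern

def matching_members(zip_bytes: bytes) -> list[str]:
    """Return the ZIP member names whose file names match the pattern."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        return [name for name in archive.namelist() if MEMBER_PATTERN.match(name)]
```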
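For the Custom Python Parser enhancement, here is a minimal sketch of a parser that declares its own types through the optional infer_schema() method. The infer_schema() name comes from the note above, but the parse() entry point and the (name, type) return format shown here are assumptions for illustration; the Ascend Custom Python Parser documentation defines the exact interface.

```python
import csv
import io
from typing import Iterator

def parse(reader) -> Iterator[dict]:
    """Turn a raw file object from blob storage into records (assumed entry point)."""
    text = io.TextIOWrapper(reader, encoding="utf-8")
    for row in csv.DictReader(text):
        yield {"id": int(row["id"]), "amount": float(row["amount"]), "note": row["note"]}

def infer_schema():
    """Optional: declare the column types the parser emits, rather than letting
    schema generation sample records and guess them."""
    return [("id", "int"), ("amount", "float"), ("note", "string")]  # assumed format
```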
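For the SDK migration noted under Snowflake Data Plane, the following is a rough before/after sketch of moving from the ingest_warehouse and metadata_warehouse settings to the new warehouse_usage_overrides field. Only the field names taken from the note above come from this release; the placeholder object, the key names inside the override mapping, and the warehouse names are assumptions, so confirm the exact structure against the linked docs.

```python
from types import SimpleNamespace

# Rough before/after sketch only. SimpleNamespace stands in for the SDK's Data
# Share definition object; key names inside warehouse_usage_overrides are assumed.
data_share = SimpleNamespace()  # placeholder for the real Data Share definition

# Before: separate per-purpose warehouse settings.
data_share.ingest_warehouse = "INGEST_WH"          # superseded by the override field
data_share.metadata_warehouse = "INTERACTIVE_WH"   # superseded by the override field

# After: a single warehouse_usage_overrides mapping carrying the same intent.
data_share.warehouse_usage_overrides = {
    "ingest": "INGEST_WH",            # assumed key names -- check the SDK docs
    "interactive": "INTERACTIVE_WH",
}
```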
🔧 BUGFIXES 🔧
- All environments (Gen1/Gen2)
  - Binary read/write functions are now used so that newlines in user code are not changed during SDK download/apply on Windows (see the sketch following this list).
    - Fix released in Ascend SDK v0.2.59.
  - Fixed a bug where some event fields used in a Webhook Notification template were not being populated.
    - Event fields now populated include event.component.name, event.component.type, and event.component.uri.
  - Handled a condition that could occasionally result in missing Billing Usage information in the Observability tab.
- Gen2 environments
  - Fixed a bug where using the Ascend Python SDK to download a Data Share definition returned an empty list of Data Services with access to the share.
    - This triggered a bug where the "hidden_from_host_data_service" attribute was inadvertently set to True, causing the Data Share to be hidden from the Data Service it was published in.
    - Fix released in Ascend SDK v0.2.60.
- Snowflake Data Plane
  - Fixed some cases of components stuck in the Running, Listing, and Ready states by improving Data Plane Snowflake connection management and closing connections more eagerly (see the sketch following this list).
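The newline fix ships inside the Ascend SDK; the sketch below is not SDK code, it just illustrates the underlying pitfall: on Windows, text-mode writes translate "\n" to "\r\n", while binary-mode reads and writes preserve bytes exactly.

```python
# Illustration of the pitfall the SDK fix avoids (not SDK code): text mode can
# rewrite newlines on Windows, binary mode round-trips user code byte-for-byte.
code = "def transform(df):\n    return df\n"

# Text mode: newline translation depends on the platform ("\n" -> "\r\n" on Windows).
with open("transform_text.py", "w") as f:
    f.write(code)

# Binary mode: bytes are written exactly as received, so newlines are preserved.
with open("transform_binary.py", "wb") as f:
    f.write(code.encode("utf-8"))

with open("transform_binary.py", "rb") as f:
    assert f.read().decode("utf-8") == code  # round-trip is byte-identical
```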
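The connection-management fix is internal to the Snowflake Data Plane. The sketch below only shows the general pattern of closing Snowflake cursors and connections eagerly with snowflake-connector-python instead of leaving sessions open; the connection parameters are placeholders.

```python
import snowflake.connector

# General pattern only (not Ascend's internal Data Plane code): release Snowflake
# sessions as soon as the work is done rather than waiting for garbage collection.
def run_query(sql: str):
    conn = snowflake.connector.connect(
        account="my_account",     # placeholder connection parameters
        user="my_user",
        password="my_password",
        warehouse="MY_WH",
    )
    try:
        cur = conn.cursor()
        try:
            cur.execute(sql)
            return cur.fetchall()
        finally:
            cur.close()           # close the cursor promptly
    finally:
        conn.close()              # eagerly end the session
```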