InterSystems Data Fabric Studio

Release Notes (2.9)

This page describes the major changes in version 2.9, including bug fixes that may affect how you work.

Data Catalog

Data Schema Importer

It is now possible to import SQL views from the local InterSystems IRIS® data platform database. As with tables, a view must be registered in the system’s security infrastructure, and the user’s current role must hold the SELECT privilege on the view, before the user can see the view and import it into the Data Catalog.
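
For example, assuming a hypothetical view Sales.MonthlyView and a hypothetical role DataAnalyst, a suitable grant would be:

GRANT SELECT ON Sales.MonthlyView TO DataAnalyst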

Streaming Data

This release supports streaming events from topics in a Kafka broker into an InterSystems IRIS table. Streaming can be started for any number of topics on any number of brokers simultaneously, and individual streams can be stopped and resumed on demand. However, each stream runs as a background process, so start streaming jobs with the system’s resource limits in mind. The solution can connect to brokers using any of the security protocols that Kafka brokers support: PLAINTEXT, SASL_PLAINTEXT, SASL_SSL, and SSL (see the configuration sketch after the list below). In addition, the module offers four built-in deserializers for handling incoming events with different serialization formats:

  • Kafka String Deserializer (Latest)

  • Kafka Bytes Deserializer (Latest)

  • Kafka Connect JSON Deserializer v3.8.0

  • Confluent Kafka Avro Deserializer v7.7.0
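
The product configures these connections through its own UI; purely for orientation, the following sketch shows what an equivalent standalone consumer configuration looks like using the open-source confluent-kafka Python client. The broker address, credentials, group ID, and topic name are placeholders, not product settings:

from confluent_kafka import Consumer

# Placeholder connection details; substitute your broker's actual address and credentials.
consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9093",
    "security.protocol": "SASL_SSL",   # or PLAINTEXT, SASL_PLAINTEXT, SSL
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "user",
    "sasl.password": "secret",
    "group.id": "dfs-demo",
    "auto.offset.reset": "earliest",   # begin at the first offsets, as the product does
})

consumer.subscribe(["demo-topic"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    # Values arrive as raw bytes; decoding them is the deserializer's job
    # (UTF-8 decoding here corresponds to the Kafka String Deserializer).
    print(msg.value().decode("utf-8"))
consumer.close()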

Note the following important points:

  1. When configuring a topic to be streamed, the solution requires a JSON schema (https://json-schema.org/learn/getting-started-step-by-step) representation of the structure of the messages stored in the topic; see the example schema after this list. Alternatively, if a schema registry is deployed in the Kafka cluster and holds your schema, you can supply the subject name and version of the message schema instead.

  2. When streaming a topic for the first time, ingestion begins at the first offsets in the topic, that is, at the earliest messages stored. The solution tracks the last offset loaded, so when streaming resumes for the same topic later on, ingestion continues from that point.

  3. Due to current limitations of the underlying technology used to write Kafka events to disk at high speeds, JSON schemas with arrays of objects are not supported as message schemas. However, schemas with arrays of primitive data types (strings, integers, and so on) can be processed successfully.

  4. Message schemas with nested objects are projected as a persistent class that uses serial objects to represent the nested objects in the schema (illustrated in the example below). This allows the class to appear as a single flat table in SQL and is optimized for large-scale data processing.
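
For illustration, here is a minimal message schema of the kind point 1 calls for; the property names are hypothetical. The nested customer object would be projected as a serial object (point 4), the tags array of strings is supported, and an array of objects in its place would not be (point 3):

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "orderId":  { "type": "integer" },
    "customer": {
      "type": "object",
      "properties": {
        "name":  { "type": "string" },
        "email": { "type": "string" }
      }
    },
    "tags": {
      "type": "array",
      "items": { "type": "string" }
    }
  }
}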

Recipes

Recipe Reset

The recipe reset functionality has been improved to process requests asynchronously, so the application no longer hangs after a user triggers a recipe reset. If any errors occur while a recipe is being reset, users can see the error and act on it in the workflow inbox. Additionally, users can now see useful information in the UI, such as:

  • New statuses in the Business Scheduler dashboard that show whether the recipe is currently resetting or encountered an error while resetting

  • Statistics about the last time a recipe was reset

Promotion to File via SFTP

In addition to promoting to external JDBC and S3 sources, the product now also supports external promotion via SFTP. The source of the data to be promoted can be a schema definition from the Data Catalog or a direct query against the local database. When configuring a promotion to an SFTP source, a new tab labeled Target File Settings lets users define metadata related to how the file is written to the server, including the delimiter, line terminator, and file name pattern. When the %RUNDATE mask is included in the file name pattern, the current ISO timestamp is injected into the file name when the file is created.
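
For example (the pattern shown is hypothetical, and the exact timestamp format is determined by the product):

File name pattern:  orders_%RUNDATE.csv
Resulting file:     orders_2025-01-15T09:30:00Z.csv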

Business Scheduler

The Business Scheduler dashboard no longer displays separate status information for resources and tasks. The dashboard now has a single Status column.

The task details page now displays the Resource Status field only if the remote status is either Running or Error.

System Configuration

The solution now has a link to the documentation. Use the Documentation link in the menu under the user profile icon. By default, this link goes to the InterSystems documentation, but it can be configured to point to an alternative URL.

Configuration Management

It is now possible to unzip, edit, rezip, and upload the files of a configuration bundle. Do not change the name of the bundle file. It is recommended to use the following command to rezip (ditto is the macOS archiving utility; these flags preserve file metadata in the resulting zip):

ditto -ck --rsrc --sequesterRsrc bundleFolder bundle.zip
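
Putting the steps together, the full round trip might look like the following, reusing the bundleFolder and bundle.zip names from the command above:

# Unpack the bundle into a working folder, edit its contents, then repack under the same name.
unzip bundle.zip -d bundleFolder
# ... edit files in bundleFolder ...
ditto -ck --rsrc --sequesterRsrc bundleFolder bundle.zip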