Integration


This is an IBM Automation portal for Integration products. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post your own idea.

Post your ideas
  1. Post an idea.

  2. Get feedback from the IBM team and other customers to refine your idea.

  3. Follow the idea through the IBM Ideas process.


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.


Status Future consideration
Workspace App Connect
Created by Guest
Created on Mar 2, 2022

Support Schema registry to integrate with Kafka

When producing and consuming messages to/from Kafka, we need schema registry support, because all of our production deployments use a registry. Event Streams already supports Apicurio (apicur.io), and ACE should be able to do the same. On the producer side, if the flow uses a Kafka user whose schema registry ACL allows writing new schemas, registration becomes transparent.
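For context, a standalone Java Kafka producer typically achieves this today purely through serializer configuration. A sketch using the Apicurio registry serdes (class and property names follow the Apicurio client libraries; the registry URL is a placeholder) might look like:

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Assembles producer settings for a registry-aware Avro serializer.
    // The serializer resolves (or registers) the schema against the
    // registry, so the flow logic never handles it directly.
    public static Properties registryAwareConfig() {
        Properties props = new Properties();
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.apicurio.registry.serde.avro.AvroKafkaSerializer");
        props.put("apicurio.registry.url", "http://registry.example.com/apis/registry/v2");
        // Lets a suitably authorised client register new schemas
        // transparently, as described in the idea above.
        props.put("apicurio.registry.auto-register", "true");
        return props;
    }
}
```

An equivalent built-in capability on the KafkaProducer node is what this idea is asking for.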

Idea priority Urgent
  • Guest
    Jan 31, 2024

    Are we considering this feature to make the Kafka Producer a schema-aware client?

    This would benefit many ACE users and add value to the ACE ecosystem.

  • Admin
    Ben Thompson
    Mar 23, 2022

    RFE Review. Thank you for taking the time to submit this idea for enhancement which we agree would benefit the product. In the past we have extended the Kafka message flow nodes to assist with use cases involving schema registries - the provision of options to set and retrieve Kafka custom header properties helps flow developers deal with the association between a Kafka message and the schema which describes its format:


    https://www.ibm.com/docs/en/app-connect/12.0?topic=enterprise-setting-retrieving-kafka-custom-header-properties


    One popular serialization/deserialization format is Avro, and being able to interrogate Kafka custom header properties in a JavaCompute node can help a user write code to select a relevant schema when interpreting the content of a Kafka message.
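As an illustration of that pattern (plain Java with headers modelled as a simple map, and hypothetical schema names — this is not the ACE JavaCompute API), header-driven schema selection might look like:

```java
import java.util.HashMap;
import java.util.Map;

public class SchemaSelector {
    // Hypothetical set of known schema definitions, keyed by the value
    // carried in a Kafka custom header such as "schema-id". In a real
    // flow these would be parsed Avro Schema objects; plain strings
    // keep the sketch dependency-free.
    private final Map<String, String> knownSchemas = new HashMap<>();

    public SchemaSelector() {
        knownSchemas.put("order-v1", "{\"type\":\"record\",\"name\":\"Order\"}");
        knownSchemas.put("order-v2", "{\"type\":\"record\",\"name\":\"OrderV2\"}");
    }

    // Picks the schema named by the message's custom header, falling back
    // to a default when the header is absent or unrecognised.
    public String select(Map<String, String> headers, String fallbackKey) {
        String key = headers.getOrDefault("schema-id", fallbackKey);
        return knownSchemas.getOrDefault(key, knownSchemas.get(fallbackKey));
    }
}
```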


    Regarding the specifics of the idea, we like the concept of tying together the process of reading/writing the message with the process of injecting/retrieving the schema from the registry. We also note, however, that in many circumstances flow developers might want a built-in function for these tasks without necessarily executing the injection/retrieval of the schema at the same time: this could be controlled by separate flow logic, invoked only for certain messages depending on their characteristics, or served from a set of commonly used schemas cached in memory. Any thoughts on these aspects would be appreciated, as they would help us devise the best externals for this feature (e.g. a separate node for the purpose of injection/retrieval, or extra properties on the existing KafkaConsumer / KafkaRead / KafkaProducer nodes?).

    Status of the Idea is updated to Future Consideration.
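On the caching point raised in the comment above, a minimal sketch of decoupling registry retrieval from message processing (hypothetical names; the registry lookup is stubbed as a plain function) could be:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class SchemaCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    // Stand-in for a real registry client call.
    private final Function<String, String> registryLookup;

    public SchemaCache(Function<String, String> registryLookup) {
        this.registryLookup = registryLookup;
    }

    // Only hits the registry on a cache miss, so most messages are served
    // from memory and retrieval can be skipped for already-known schemas.
    public String get(String schemaKey) {
        return cache.computeIfAbsent(schemaKey, registryLookup);
    }
}
```

With this shape, whether and when the registry is contacted becomes a policy decision separate from reading or writing the message itself.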