PubSub+ Event Portal Overview

PubSub+ Event Portal is an event management tool—presented through an accessible Web user interface (UI)—that enables you to discover, design, visualize, share, and manage various aspects of your event-driven architecture (EDA). On this page, we will discuss the fundamental elements of the Event Portal and provide an overview of the tools. We will also discuss some of the underlying features such as runtime discovery of EDAs, support for Kafka-native objects, event sharing, version control, REST API, AsyncAPI, and other essential features.

To get started, let's first understand the Foundational Elements of the Event Portal.

Foundational Elements of the Event Portal

When designing an event-driven architecture (EDA), it is essential to model the enterprise or interworking systems as a whole by dividing it into smaller, more manageable pieces. The Event Portal calls each of these pieces an Application Domain. Within each Application Domain, you can create a set of event-driven entities or objects (schemas, events, and applications) that represent the runtime interactions.

The following are the four foundational elements of the Event Portal:

Schemas, Events, and Applications are often referred to as objects in the Event Portal.

Application Domain

An application domain represents a namespace where applications, events, and schemas can live. Within this namespace, you can create a suite of applications, events and schemas that are independent of other application domains. This provides a way to create independent event-driven architectures for different products. You can also share events and schemas across application domains by creating the event or schema with the shared option selected.

Furthermore, application domains define a topic domain. The topics defined for all events in the application domain should begin with the topic domain. For example, if the topic domain is defined as solace/weather, then an event in that application domain may use the topic solace/weather/blizzard/started.

Schema

In simple terms, a schema represents the contract that describes the payload of an event. Producers and consumers of an event can trust that the event's payload matches the schema definition assigned to that event. Schemas define a type of payload through JSON, AVRO, XML, Binary, or Text. JSON, AVRO, and XML schemas have content that describes each property of the schema. The content is in JSON Schema, Avro Schema, or XSD/DTD format.

Furthermore, you can search and view JSON schemas in a more human-readable format along with the actual schema source. This makes it easy for users to read, browse, and understand the schema contents without going through the source.
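For illustration, a minimal JSON Schema for a weather event payload might look like the following. This is a sketch; the title and field names are hypothetical, not objects defined by the Event Portal.

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "BlizzardStarted",
  "type": "object",
  "properties": {
    "region": { "type": "string" },
    "startedAt": { "type": "string", "format": "date-time" },
    "windSpeedKmh": { "type": "number" }
  },
  "required": ["region", "startedAt"]
}

An event that references this schema guarantees its consumers that every payload contains at least a region and a startedAt timestamp.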

Event

The event represents a business moment or an action that can be communicated to zero or more interested applications. The event is where you define metadata that describes and categorizes the event. An event is produced on a specific topic that must be defined for that event. From a modelling perspective, events reference payload schemas, and events are referenced by applications, thus forming the glue that binds applications together.

It is important to note that an event (or an event type), as referred to in the Event Portal, is different from an event instance. For more information on this topic, refer to Event Type vs Event Instance.

Application

An application represents a piece of software that produces and consumes events. Applications connect to the event broker in an event-driven architecture and communicate with other applications via events. A single application represents a class of applications that are running the same code base; therefore, there is no need to create multiple applications for each instance in a cluster.

When creating an application, you can set an attribute called Application Class to specify a Kafka Connector, Kafka Application, or leave it as Unspecified. If you specify Kafka Connector or Kafka Application, you can add additional information.

Kafka Connector

A connector is used in Kafka for connecting Kafka brokers with external systems to stream data into or out of Apache Kafka. In the Event Portal, a Kafka Connector is an application class you select to configure associated published and/or subscribed events and a set of Kafka-native attributes as described below.

  • Connector Type: Specifies whether the connector is a source or a sink connector.
  • Connector Class: The main Java class, the entry point used to launch the connector. From the connector class's full name, you can derive the company or author of the given connector, and you can look up the implementation details by reviewing the source code.
  • Cluster ID: The connector's cluster ID. In most cases, this will be a dedicated cluster specifically for connect capabilities.
  • Maximum Task: Configuration property that specifies how many concurrent tasks will be launched to complete the ETL job on time.
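For context, these attributes mirror the configuration a connector is deployed with in Kafka Connect. The following is a minimal sketch of a source connector configuration as it could be submitted to the Kafka Connect REST API; the connector name, file path, and topic are hypothetical.

{
  "name": "file-source-connector",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/weather-readings.txt",
    "topic": "weather-readings"
  }
}

Here, connector.class corresponds to the Connector Class attribute above, and tasks.max corresponds to Maximum Task.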

To learn more about Kafka connectors, refer to the discussion on connectors in Kafka's documentation.

Kafka Application

You can select Kafka Application as an application class to add consumer groups that you can associate event subscriptions with. Refer to Consumer Groups to learn how they're supported in the Event Portal.

Event Portal Tools

Use these tools to create, design, manage, discover, visualize, and share all events within your ecosystem.

Discovery: Discover and visualize the events in your event-driven architecture from your event brokers. Initial support is for the discovery and import of events, schemas, and application interactions from Apache Kafka, Confluent, and Amazon MSK. Support for additional broker types will follow.

Designer: Design and view all aspects of your event-driven architecture.

Catalog: Browse for events, schemas, and applications defined in your environment using a searchable, sortable, and filterable interface.


In the subsequent topics, we will discuss some of the additional concepts and features of the Event Portal that are important to understand.

Associating Objects to Build Relationships

A key benefit of using Event Portal is its ability to track the relationships that exist in an extremely decoupled event-driven architecture (EDA). It enables the reuse of schemas and events, and graphically presents the relationships that exist between applications and events. For example, a schema may be used by many different events as the business object it describes changes through a transaction. An event may be produced by multiple producers, and an application may consume and/or produce many events. Finally, events may be shared across various application domains. Thus, the Event Portal helps you manage and understand the EDA regardless of which event broker is being used at runtime.

Using Event Portal in Multiple Operational Environments

As an Event Portal user, you can model your event-driven architecture (EDA) in different operational environments. For instance, many organizations will have their applications running in at least one of these environments: development, staging, and production. Some enterprises also operate additional environments such as developer, quality assurance, user acceptance testing (UAT), or multiple production environments. The goal, therefore, is to keep these operational environments separate in terms of software running and user access permissions.

Event Portal allows for environment separation, in the same way that event broker services are separated within enterprises. This means you can create a different cloud console or Event Portal account for each environment within the enterprise. You can then create and grant different user access permissions for each of these environments, and use that to model your EDAs as they progress from one environment to the next.

EDAs, including all associated applications, events, and payload schemas, can be promoted from one environment to the next by exporting entire Application Domains from one environment and importing them into the next, using the Import/Export Application Domain functionality in Event Portal.

Event Type vs Event Instance

An event type (or event, as it is referred to in Event Portal) represents a class of events produced in an event-driven architecture. The event type is made up of its topic and the schema that represents the allowed payload for the event.

An event instance is a specific instance of an event that is produced. An event instance has an event type. It conforms to the schema of the event type and is produced on a topic defined for the event type. Over the lifecycle of an application, many event instances are produced and consumed.
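To make the distinction concrete, consider the Blizzard example from earlier: the event type is defined by the topic solace/weather/blizzard/started together with its payload schema, while each message published to that topic is an event instance. Assuming the illustrative schema sketched in the Schema section, one such instance might carry the payload:

{
  "region": "Ottawa",
  "startedAt": "2021-01-15T09:30:00Z",
  "windSpeedKmh": 72.5
}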

Shared Events and Schemas

By default, events and schemas can only be referenced within their own application domain. For another application domain to reference your events and schemas, the event or schema must be marked as shared. For example, imagine a situation where there are two application domains modelled in Event Portal: Weather Events and Traffic Events. Say an event called Blizzard is created in the Weather Events application domain. Since the Traffic Events application domain should be able to consume the Blizzard event, the creator of the Blizzard event marks it as shared. Now, when an application is created in the Traffic Events application domain, it can choose to publish or subscribe to the Blizzard event. If the event were not marked as shared, it could not be referenced by applications in the Traffic Events application domain.

Event Discovery Agent

You can use the Event Discovery Agent to run a Discovery scan against your event broker. The agent will connect to your event broker to scan your event-driven architecture using a specified set of topic subscriptions. The gathered data is generated as a JSON file (Discovery file), which you can upload into Event Portal.

The following event brokers are currently supported: Apache Kafka, Confluent, Amazon MSK, and PubSub+ event brokers (Preview).

Runtime Discovery for PubSub+ event broker is available in a Preview release stage. At present, you can only run a scan and generate a Discovery file. Uploading it to the Event Portal's Discovery Staging area and committing the Discovery data to Designer/Catalog is not yet supported.

You can install the agent locally and use it to scan your EDA and generate a Discovery file. To install the agent or learn how to run a Discovery scan, refer to Installing the Offline Discovery Agent.

Information Discovery Agent Captures and Uploads

The information the agent gathers during a Discovery scan depends on the event broker type.

PubSub+ Event Broker

The following data/metadata is returned in the Discovery scans:

Broker

The event broker information taken from the input data specified by the user, which is also uploaded in the Discovery file.

"broker": {
"	brokerType": "solace",
	"hostname": "mySolaceHost.solace.com"
	"additionalAttributes": {
"	vpnName": "myVpnName" 
	}
}
Clients

An active or inactive messaging client on the event broker. A client has a top-level attribute that indicates whether the client is a consumer; the client type implies the message flow direction in the context of relationships with channel objects.

The following client type information is captured:

  • clientType: Client Application
  • clientUserName: myClientUsername
  • clientProfileName: myClientProfileName
  • aclProfileName: myAclProfileName
Channels

A channel represents the event broker entities that are responsible for the transmission and storage of messages.

The channel types include:

  • Topics
  • Queues
  • Topic Endpoints
Subscriptions

Subscriptions represent a potentially unbounded “channel space” described by a set of criteria (topic strings with wildcards, routing keys, and so on).

The subscription types include:

  • client to topic subscription
  • queue to topic subscription
Schema

The schema of the message payload (and/or key). It can be inferred from a message or discovered from a schema registry.

The following attributes are included:

  • content—the schema content
  • schemaType—the type of schema
  • hints—a pre-defined set of hints to help upper-layer applications process the schema
  • primitive—true if the schema is a primitive type
Object Relationships

The object relationships represent relationships between two or more top-level objects.

The following relationships are captured:

  • clientToSubscriptionRelationship
  • channelToSubscriptionRelationship
  • clientToChannelRelationship
  • channelToChannelRelationship

Kafka

The following data/metadata is returned in the Discovery scans:

Topics

A list of topic names.
Connectors

A list of connectors with the following configuration data:

  • connector class
  • connector type
  • maximum task allocation
Consumer Groups

A list of consumer groups with a flag indicating if they are simple consumers.

Schemas

A list of schemas. In the case of a string schema (non-JSON format), the contents of the message will be included.
Object Relationships

The object relationships represent relationships between two or more top-level objects.

The following relationships are captured:

  • connectorToTopicAssociations
  • consumerGroupToTopicAssociations
  • topicToSchemaAssociations

Workspace

In Discovery, Application Domains are grouped by a property called Workspace. After a Discovery file is added, it must be assigned a Workspace before it can be imported into the Event Portal's data model. When performing an audit, the Workspace must be specified, and only application domains within that specified group are considered for audit comparison with the Discovery file.

Take a look at Managing Workspace to see how you can use Workspace to compare your discovered event-driven architecture (EDA) to a specific Application Domain or set of Application Domains in a Workspace.

Consumer Groups

Event Portal supports the concept of Kafka's consumer groups. A consumer group is used by Kafka to group consumers into a logical subscriber for a topic. In Event Portal, you can model consumer groups in Designer. This enables the Event Portal's runtime discovery to associate a discovered consumer group with an existing application.

Kafka consumers that belong to the same consumer group share a group ID. The consumers in a group divide the topic partitions as fairly as possible, so that each partition is consumed by exactly one consumer in the group.

To learn how you can set up consumer groups in the Event Portal, see Configuring a Consumer Group.

Visit the Kafka documentation to learn more about consumer groups.

Primitive Types

Some message payloads consist of primitive types, such as strings and numbers, rather than more complex structures (JSON or XML). Event Portal supports primitive event types so that you can model your events (for example, Kafka topics) with payloads that are simple primitives. Primitive types for message payloads are automatically detected during Discovery and can be committed or imported into Designer. Likewise, you can create and edit an event to use a primitive type in Designer. You can also generate an AsyncAPI specification for an application associated with events that have primitive payloads.

The following primitive types are currently supported in Event Portal:

  • Avro: Null, Boolean, Int, Long, Float, Double, Byte, String
  • JSON: Null, Boolean, Number, String
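At the schema level, a primitive type is the entire payload definition rather than a field within a structured schema. For example, an Avro schema declaring that the whole payload is a plain string, rather than a record with fields, is simply:

{ "type": "string" }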

To learn how to use a Primitive Type when designing an event, see Creating an Event.

Multiple Record Types

It is common for Kafka-based applications to publish events that do not share the same schema to the same topic. In Event Portal, you can discover, re-discover, and stage topics with more than one record type. In other words, topics that have multiple records with different payload schemas are discovered in Event Portal Discovery. The payload schema could be JSON, Avro, XML, text, or binary. Discovered events for the same topic are uniquely named to differentiate each event before committing them into Designer as events. Once committed, the topic-to-schema association occurs in the backend and can then be visualized in Designer, as the example below illustrates.
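As a hypothetical illustration, both of the following payloads might be observed on a single orders topic during a Discovery scan; they would be staged as two uniquely named events, each associated with its own inferred schema:

{ "orderId": 1042, "status": "CREATED" }

{ "orderId": 1042, "refund": { "amount": 12.50, "currency": "USD" } }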

To learn how to discover and commit your EDA into Event Portal, refer to Staging and Committing Discovery Scans into Event Portal.

Tags

Applications, events, and schemas can all have tags associated with them. Searching for a tag name in Catalog finds all the objects associated with that tag.

Tags can be a great way to share information with other members of your organization by grouping sets of objects together. Here are some examples of how you can use tags:

  • Create a tag called Java and set it against all of your Java-based applications
  • Create a tag called User and tag all of your schemas related to users
  • Create a tag called Sign in Flow and tag the series of events and applications that are involved in the sign-in flow for your application

In a future release of the Event Portal, Designer will be able to filter applications and events based on the tags applied to them. This will give another layer of customization to help you visualize your event-driven architectures.

Revision History

Revision history tracks the changes to objects in Event Portal. Any time an application, event, or schema is updated, a new revision is created. For schemas, you can perform additional tasks such as creating multiple versions of a particular schema and advanced version control management. Refer to Schema Versions and Version Control for more information.

You can restore an old revision of an object at any point. Restoring an object to a previous revision creates a new revision of that object. For example, when an object whose current revision is Rev 2 is restored to Rev 1, the object ends up with three revisions; the new Rev 3 is identical to Rev 1, since it was restored from it.

There are a few exceptions that you should consider when using the revision history of an object:

  • Owner and tag information of an object is not included in the revision history. Changing the owners or tags of an object will not create a new revision of the object.
  • Restoring to a revision of an object whose associations have changed will not change the associated objects. For example, imagine an event and schema that have both undergone ten revisions. At the time of the second revision of the event, the schema was on its first revision. Reverting the event to its second revision will not return the schema to its first revision; it will remain at its tenth revision.
  • Application domains do not track revision history. Instead, if you need to revert an application domain and all of its contained objects to a point in time, it is recommended to use the import/export functionality.

For tutorials, refer to Managing Object Revisions.

Schema Versions and Version Control

You can create schemas with or without versions. When using schema versions, you can create and store multiple versions of the same schema, which is useful for EDAs that require two or more versions of a particular schema. Revision history is supported with schema version control, including saving revisions of each schema, viewing changes between revisions, and reverting to an older revision.

For additional information and tutorials, see Creating a Schema and Creating Multiple Versions of a Schema.

Archive Objects and Revisions

When you delete an object, it will be archived for ninety days, along with its associated revisions. You can restore the deleted object and its associated revisions anytime within the ninety days.

To learn more, refer to Deleting an Object.

AsyncAPI

AsyncAPI is an open-source initiative that seeks to improve the current state of Event-Driven Architectures (EDA). Using AsyncAPI allows development teams to create applications that communicate asynchronously through events more easily. The core of AsyncAPI is a specification that describes how to create a document that explains the inner workings of an event-driven API.

You can use an AsyncAPI specification document for many functions, such as:

  • Generating documentation
  • Generating code
  • Validating events received by your application
  • Applying API management policies

Learn more about AsyncAPI on their website at https://www.asyncapi.com.

AsyncAPI and the Event Portal

Event Portal natively supports the AsyncAPI 2.0.0 specification, and applications can be exported as AsyncAPI documents. You can export applications in JSON or YAML, the two supported formats of AsyncAPI.
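As a minimal sketch of what an exported document looks like, an AsyncAPI 2.0.0 specification in JSON has the following general shape; the application title and channel reuse the illustrative weather example rather than actual Event Portal output.

{
  "asyncapi": "2.0.0",
  "info": {
    "title": "Weather Monitoring Application",
    "version": "1.0.0"
  },
  "channels": {
    "solace/weather/blizzard/started": {
      "subscribe": {
        "message": {
          "payload": { "type": "object" }
        }
      }
    }
  }
}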

To learn how to generate an AsyncAPI document for an application, refer to Generating an AsyncAPI.

REST API

Event Portal provides a RESTful API that you can use to manage your data in the PubSub+ Cloud. Use the REST API to integrate other applications, systems, or client applications with Event Portal, and model or retrieve your event-driven architectures from your own client applications.
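As a sketch of the integration pattern only, a client application might create an object such as an application domain by sending a JSON request body like the one below. The field names here are illustrative assumptions rather than the documented contract.

{
  "name": "Weather Events",
  "description": "Application domain for weather-related events",
  "topicDomain": "solace/weather"
}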

For more information, refer to the Event Portal REST API documentation.

Related Topics