Merge pull request #12483 from MicrosoftDocs/learn-build-service-prodbot/docutune-autopr-20240819-050636-6024026-ignore-build

[DocuTune-Remediation] - Scheduled execution to fix known issues in Azure Architecture Center articles (part 3)
v-dirichards authored Aug 19, 2024
2 parents 9750a05 + 6820394 commit 96e5acf
Showing 6 changed files with 28 additions and 29 deletions.
@@ -182,7 +182,7 @@ See the following Delphix resources:

- [Get set up with Delphix CC](https://maskingdocs.delphix.com/)
- Learn about [using Delphix CC to find where sensitive data resides](https://maskingdocs.delphix.com/Identifying_Sensitive_Data/Discovering_Your_Sensitive_Data_-_Intro/)
-- See [customers using Delphix on Azure](https://www.delphix.com/solutions/cloud/azure)
+- See [Customers using Delphix on Azure](https://www.delphix.com/solutions/cloud/azure)

Learn more about the key Azure services in this solution:

7 changes: 3 additions & 4 deletions docs/databases/guide/transactional-outbox-cosmos-content.md
@@ -1,6 +1,6 @@
Implementing reliable messaging in distributed systems can be challenging. This article describes how to use the Transactional Outbox pattern for reliable messaging and guaranteed delivery of events, an important part of supporting [idempotent message processing](/azure/architecture/reference-architectures/containers/aks-mission-critical/mission-critical-data-platform#idempotent-message-processing). To accomplish this, you'll use Azure Cosmos DB transactional batches and change feed in combination with Azure Service Bus.

-## Overview
+## Overview

Microservice architectures are becoming increasingly popular and show promise in solving problems like scalability, maintainability, and agility, especially in large applications. But this architectural pattern also introduces challenges when it comes to data handling. In distributed applications, each service independently maintains the data it needs to operate in a dedicated service-owned datastore. To support such a scenario, you typically use a messaging solution like RabbitMQ, Kafka, or Azure Service Bus that distributes data (events) from one service via a messaging bus to other services of the application. Internal or external consumers can then subscribe to those messages and get notified of changes as soon as data is manipulated.

@@ -34,7 +34,6 @@ Whatever the error is, the result is that the `OrderCreated` event can't be publ

:::image source="_images/event-handling-before-pattern.png" alt-text="Diagram that shows event handling without the Transactional Outbox pattern.":::


## Solution

There's a well-known pattern called *Transactional Outbox* that can help you avoid these situations. It ensures events are saved in a datastore (typically in an Outbox table in your database) before they're ultimately pushed to a message broker. If the business object and the corresponding events are saved within the same database transaction, it's guaranteed that no data will be lost. Everything will be committed, or everything will roll back if there's an error. To eventually publish the event, a different service or worker process queries the Outbox table for unhandled entries, publishes the events, and marks them as processed. This pattern ensures events won't be lost after a business object is created or modified.
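
As a minimal sketch of that idea against Azure Cosmos DB (illustrative only; the `Order` and `OutboxEvent` types and their fields are hypothetical, not the article's sample code), the business object and its outbox event can be written in one transactional batch because they share a partition key:

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

// Hypothetical document shapes; real ones depend on your data model.
public record Order(string id, string CustomerId, decimal Total, string PartitionKey);
public record OutboxEvent(string id, string Type, string Payload, bool Processed, string PartitionKey);

public static class OutboxWriter
{
    public static async Task SaveOrderWithEventAsync(Container container, Order order)
    {
        var outboxEvent = new OutboxEvent(
            id: Guid.NewGuid().ToString(),
            Type: "OrderCreated",
            Payload: JsonSerializer.Serialize(order),
            Processed: false,
            PartitionKey: order.PartitionKey);

        // Both documents live in the same logical partition, so Cosmos DB can
        // commit them atomically: either both are persisted or neither is.
        TransactionalBatchResponse response = await container
            .CreateTransactionalBatch(new PartitionKey(order.PartitionKey))
            .CreateItem(order)
            .CreateItem(outboxEvent)
            .ExecuteAsync();

        if (!response.IsSuccessStatusCode)
        {
            throw new InvalidOperationException($"Outbox batch failed: {response.StatusCode}");
        }
    }
}
```

A worker process (in this article, the Azure Cosmos DB change feed in combination with Azure Service Bus) then publishes the unprocessed events and marks them as handled.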
@@ -300,7 +299,7 @@ private async Task<List<IDataObject<Entity>>> SaveInTransactionalBatchAsync(
}

// Return copy of current list as result.
-var result = new List<IDataObject<Entity>>(DataObjects);
+var result = new List<IDataObject<Entity>>(DataObjects);

// Work has been successfully done. Reset DataObjects list.
DataObjects.Clear();
@@ -560,7 +559,7 @@ You can find the source code, deployment files, and instructions to test this sc

*This article is maintained by Microsoft. It was originally written by the following contributors.*

-Principal author:
+Principal author:

- [Christian Dennig](https://www.linkedin.com/in/christian-dennig/) | Senior Software Engineer

@@ -1,6 +1,6 @@
[!INCLUDE [header_file](../../../includes/sol-idea-header.md)]

-This article presents a solution for automating data analysis and visualization using artificial intelligence (AI). Core components in the solution are Azure Functions, Azure Cognitive Services, and Azure Database for MySQL.
+This article presents a solution for automating data analysis and visualization using artificial intelligence (AI). Core components in the solution are Azure Functions, Azure AI services, and Azure Database for MySQL.

## Architecture

@@ -10,11 +10,11 @@ This article presents a solution for automating data analysis and visualization

### Dataflow

-1. An Azure Function activity allows you to trigger an Azure Functions App in the Azure Data Factory pipeline. You create a linked service connection and use the linked service with an activity to specify the Azure Function you want to execute.
+1. An Azure function activity allows you to trigger an Azure Functions App in the Azure Data Factory pipeline. You create a linked service connection and use the linked service with an activity to specify the Azure function you want to execute.
1. Data comes from various sources such as Azure Storage or Azure Event Hubs for high-volume data. When the pipeline receives new data, it triggers the Azure Functions App.
-1. The Azure Functions App calls the Cognitive Services API to analyze the data.
-1. The Cognitive Services API returns the results of the analysis in JSON format to the Azure Functions App.
-1. The Azure Functions App stores the data and results from the Cognitive Services API in Azure Database for MySQL.
+1. The Azure Functions App calls the Azure AI services API to analyze the data.
+1. The Azure AI services API returns the results of the analysis in JSON format to the Azure Functions App.
+1. The Azure Functions App stores the data and results from the Azure AI services API in Azure Database for MySQL.
1. Azure Machine Learning uses custom machine learning algorithms to provide further insights into the data.
1. The MySQL database connector for Power BI provides options for data visualization and analysis in Power BI or a custom web application.
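
A rough sketch of the analyze-and-store steps above (the function, table, and configuration names are placeholders, not part of the solution) uses the Azure AI Language client library for sentiment analysis and MySqlConnector to persist the result:

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure;
using Azure.AI.TextAnalytics;
using MySqlConnector;

public static class TextAnalysisStep
{
    // Illustrative only: endpoint, key, connection string, and table name are placeholders.
    public static async Task AnalyzeAndStoreAsync(
        string text, string languageEndpoint, string languageKey, string mySqlConnectionString)
    {
        // Call the Azure AI Language (Text Analytics) API and capture the result as JSON.
        var client = new TextAnalyticsClient(new Uri(languageEndpoint), new AzureKeyCredential(languageKey));
        DocumentSentiment sentiment = await client.AnalyzeSentimentAsync(text);
        string resultJson = JsonSerializer.Serialize(new
        {
            sentiment = sentiment.Sentiment.ToString(),
            positive = sentiment.ConfidenceScores.Positive,
            neutral = sentiment.ConfidenceScores.Neutral,
            negative = sentiment.ConfidenceScores.Negative
        });

        // Store the original text and the analysis result in Azure Database for MySQL.
        await using var connection = new MySqlConnection(mySqlConnectionString);
        await connection.OpenAsync();
        using var command = new MySqlCommand(
            "INSERT INTO text_analysis (source_text, result_json) VALUES (@text, @result);", connection);
        command.Parameters.AddWithValue("@text", text);
        command.Parameters.AddWithValue("@result", resultJson);
        await command.ExecuteNonQueryAsync();
    }
}
```

In the deployed solution, logic of this kind runs inside the Azure Functions App that the Data Factory pipeline triggers.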

@@ -43,7 +43,7 @@ This article presents a solution for automating data analysis and visualization

The automated pipeline uses the following services to analyze the data:

-- Cognitive Services uses AI for question answering, sentiment analysis, and text translation.
+- Azure AI services uses AI for question answering, sentiment analysis, and text translation.
- Azure Machine Learning supplies machine-learning tools for predictive analytics.

The solution automates the delivery of the data analysis. A connector links Azure Database for MySQL with visualization tools like Power BI.
@@ -63,9 +63,9 @@ This solution is ideal for organizations that run predictive analytics on data f

## Considerations

-- For most features, the Cognitive Service for Language API has a maximum size of 5120 characters for a single document. For all features, the maximum request size is 1 MB. For more information about data and rate limits, see [Service limits for Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/data-limits#maximum-characters-per-document).
+- For most features, the Azure AI Language API has a maximum size of 5120 characters for a single document. For all features, the maximum request size is 1 MB. For more information about data and rate limits, see [Service limits for Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/data-limits#maximum-characters-per-document).

-- Previous versions of this solution used the Cognitive Services Text Analytics API. Azure Cognitive Service for Language now unifies three individual language services in Cognitive Services: Text Analytics, QnA Maker, and Language Understanding (LUIS). You can easily migrate from the Text Analytics API to the Cognitive Service for Language API. For instructions, see [Migrate to the latest version of Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/migrate-language-service-latest).
+- Previous versions of this solution used the Azure AI services Text Analytics API. Azure AI Language now unifies three individual language services in Azure AI services: Text Analytics, QnA Maker, and Language Understanding (LUIS). You can easily migrate from the Text Analytics API to the Azure AI Language API. For instructions, see [Migrate to the latest version of Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/migrate-language-service-latest).
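
Because of the 5120-character per-document limit noted above, long inputs are usually split before they're submitted; a simple illustrative chunker (a hypothetical helper, not part of the service SDK) could look like this:

```csharp
using System;
using System.Collections.Generic;

public static class DocumentChunker
{
    // Azure AI Language accepts at most 5,120 characters per document for most features.
    private const int MaxCharsPerDocument = 5120;

    // Splits long input into chunks that each fit within the per-document limit,
    // so every chunk can be submitted to the API as its own document.
    public static IEnumerable<string> Split(string text, int maxChars = MaxCharsPerDocument)
    {
        if (string.IsNullOrEmpty(text))
        {
            yield break;
        }

        for (int offset = 0; offset < text.Length; offset += maxChars)
        {
            int length = Math.Min(maxChars, text.Length - offset);
            yield return text.Substring(offset, length);
        }
    }
}
```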

## Contributors

@@ -1,6 +1,6 @@
[!INCLUDE [header_file](../../../includes/sol-idea-header.md)]

-This article presents a solution for automating data analysis and visualization using artificial intelligence (AI). Core components in the solution are Azure Functions, Azure Cognitive Services, and Azure Database for PostgreSQL.
+This article presents a solution for automating data analysis and visualization using artificial intelligence (AI). Core components in the solution are Azure Functions, Azure AI services, and Azure Database for PostgreSQL.

## Architecture

@@ -10,11 +10,11 @@ This article presents a solution for automating data analysis and visualization

### Dataflow

-1. An Azure Function activity allows you to trigger an Azure Functions App in the Azure Data Factory pipeline. You create a linked service connection and use the linked service with an activity to specify the Azure Function you want to execute.
+1. An Azure function activity allows you to trigger an Azure Functions App in the Azure Data Factory pipeline. You create a linked service connection and use the linked service with an activity to specify the Azure function you want to execute.
1. Data comes from multiple sources including Azure Storage and Azure Event Hubs for high-volume data. When the pipeline receives new data, it triggers the Azure Functions App.
-1. The Azure Functions App calls the Cognitive Services API to analyze the data.
-1. The Cognitive Services API returns the results of the analysis in JSON format to the Azure Functions App.
-1. The Azure Functions App stores the data and results from the Cognitive Services API in Azure Database for PostgreSQL.
+1. The Azure Functions App calls the Azure AI services API to analyze the data.
+1. The Azure AI services API returns the results of the analysis in JSON format to the Azure Functions App.
+1. The Azure Functions App stores the data and results from the Azure AI services API in Azure Database for PostgreSQL.
1. Azure Machine Learning uses custom machine learning algorithms to provide further insights into the data.
- If you're approaching the machine learning step with a no-code perspective, you can implement further text analytics operations on the data, like feature hashing, Word2Vector, and n-gram extraction.
- If you prefer a code-first approach, you can run an open-source natural language processing (NLP) model as an experiment in Machine Learning studio.
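
To make the text analytics operations in the first option concrete, n-gram extraction (mentioned above alongside feature hashing and Word2Vector) amounts to sliding a window over the token stream; an illustrative helper, not a Machine Learning studio component, is sketched here:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class NGramExtractor
{
    // Produces word n-grams (for example, bigrams when n = 2) from free text,
    // mirroring the kind of feature extraction mentioned in the dataflow above.
    public static IEnumerable<string> Extract(string text, int n = 2)
    {
        string[] tokens = text
            .ToLowerInvariant()
            .Split(new[] { ' ', '\t', '\r', '\n', '.', ',', ';', '!', '?' },
                StringSplitOptions.RemoveEmptyEntries);

        for (int i = 0; i + n <= tokens.Length; i++)
        {
            yield return string.Join(" ", tokens.Skip(i).Take(n));
        }
    }
}
```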
@@ -34,7 +34,7 @@ This article presents a solution for automating data analysis and visualization

The automated pipeline uses the following services to analyze the data:

-- Cognitive Services uses AI for question answering, sentiment analysis, and text translation.
+- Azure AI services uses AI for question answering, sentiment analysis, and text translation.
- Azure Machine Learning supplies machine-learning tools for predictive analytics.

To store data and results, the solution uses Azure Database for PostgreSQL. The PostgreSQL database supports unstructured data, parallel queries, and declarative partitioning. This support makes Azure Database for PostgreSQL an effective choice for highly data-intensive AI and machine learning tasks.
@@ -60,15 +60,15 @@ Azure Database for PostgreSQL is a cloud-based solution. As a result, this solut

These considerations implement the pillars of the Azure Well-Architected Framework, which is a set of guiding tenets that can be used to improve the quality of a workload. For more information, see [Microsoft Azure Well-Architected Framework](/azure/well-architected/).

-- For most features, the Cognitive Service for Language API has a maximum size of 5120 characters for a single document. For all features, the maximum request size is 1 MB. For more information about data and rate limits, see [Service limits for Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/data-limits#maximum-characters-per-document).
+- For most features, the Azure AI Language API has a maximum size of 5120 characters for a single document. For all features, the maximum request size is 1 MB. For more information about data and rate limits, see [Service limits for Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/data-limits#maximum-characters-per-document).

- In Azure Database for PostgreSQL, your ingress volume and velocity determine your selection of service and deployment mode. Two services are available:
- Azure Database for PostgreSQL
- Azure Cosmos DB for PostgreSQL, which was formerly known as Hyperscale (Citus) mode

If you mine large workloads of customer opinions and reviews, use Azure Cosmos DB for PostgreSQL. Within Azure Database for PostgreSQL, two modes are available: single server and flexible server. To understand when to use each deployment mode, see [What is Azure Database for PostgreSQL?](/training/modules/intro-to-postgres/2-what-is-azure-database-postgresql).

-- Previous versions of this solution used the Cognitive Services Text Analytics API. Azure Cognitive Service for Language now unifies three individual language services in Cognitive Services: Text Analytics, QnA Maker, and Language Understanding (LUIS). You can easily migrate from the Text Analytics API to the Cognitive Service for Language API. For instructions, see [Migrate to the latest version of Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/migrate-language-service-latest).
+- Previous versions of this solution used the Azure AI services Text Analytics API. Azure AI Language now unifies three individual language services in Azure AI services: Text Analytics, QnA Maker, and Language Understanding (LUIS). You can easily migrate from the Text Analytics API to the Azure AI Language API. For instructions, see [Migrate to the latest version of Azure Cognitive Service for Language](/azure/cognitive-services/language-service/concepts/migrate-language-service-latest).

### Security

@@ -86,7 +86,7 @@ You can also automate your machine learning lifecycle by using [Azure Pipelines]

Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies. For more information, see [Overview of the cost optimization pillar](/azure/architecture/framework/cost/overview).

-Cognitive Service for Language offers various pricing tiers. The number of text records that you process affects your cost. For more information, see [Cognitive Service for Language pricing](https://azure.microsoft.com/pricing/details/cognitive-services/language-service).
+Azure AI Language offers various pricing tiers. The number of text records that you process affects your cost. For more information, see [Cognitive Service for Language pricing](https://azure.microsoft.com/pricing/details/cognitive-services/language-service).

## Next steps

@@ -73,7 +73,7 @@ These considerations implement the pillars of the Azure Well-Architected Framewo

- When you implement and maintain this solution, you incur extra costs.
- Using the change feed for replication requires less code maintenance than doing the replication in the core application.
-- You need to migrate existing data. The migration process requires ad-hoc scripts or routines to copy old data to storage accounts. When you migrate the data, make sure that you use time stamps and copy flags to track migration progress.
+- You need to migrate existing data. The migration process requires ad hoc scripts or routines to copy old data to storage accounts. When you migrate the data, make sure that you use time stamps and copy flags to track migration progress.
- To avoid deleting entries from the Azure Table secondary storage, ignore delete feeds that are generated when your functions delete entries from Azure Cosmos DB.
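
One way to implement the time stamps and copy flags mentioned above, sketched with hypothetical types and delegates rather than the solution's actual code, is a watermark-based copy loop:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical shape of a record being migrated; field names are illustrative.
public class LegacyRecord
{
    public string Id { get; set; } = "";
    public DateTimeOffset LastModified { get; set; }
    public bool Copied { get; set; }          // copy flag written back after migration
}

public static class AdHocMigration
{
    // Copies every record modified before the watermark that hasn't been copied yet.
    // readBatchAsync and writeToTableStorageAsync stand in for whatever data access
    // the migration script actually uses (Azure Cosmos DB SDK, Table storage SDK, and so on).
    public static async Task<int> RunAsync(
        DateTimeOffset watermark,
        Func<Task<IReadOnlyList<LegacyRecord>>> readBatchAsync,
        Func<LegacyRecord, Task> writeToTableStorageAsync)
    {
        int migrated = 0;
        IReadOnlyList<LegacyRecord> batch = await readBatchAsync();

        foreach (LegacyRecord record in batch)
        {
            // Skip anything already copied or newer than the watermark; newer items
            // are handled by the change feed replication path instead.
            if (record.Copied || record.LastModified >= watermark)
            {
                continue;
            }

            await writeToTableStorageAsync(record);
            record.Copied = true;   // mark progress so the script can resume safely
            migrated++;
        }

        return migrated;
    }
}
```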

## Contributors