Building an AI News Feed with Azure Search and the Bot Builder Framework Part 2

In our last article we took a first look at the AI side of the project. Today we are going to dive deeper and start building out a serverless API for managing internal and external news feeds.

Bing Search

In the previous article you may have noticed that we were calling another Functions app to query Bing News for business and finance information:

var url = string.Format("http://finwin.azurewebsites.net/api/QueryBingNews");

This part of the project acts as an API layer over all news and information sources from third-party services. We’ve created two API endpoints in the Functions app:

Our chat bot will call these two endpoints to fetch news items related to a stock code or company.
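To make this concrete, here is a minimal sketch of how the bot could call the first endpoint over HTTP. The helper class and method names are ours; the URL is the one shown above, and the query parameter matches the function we build later in this article.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class NewsApiClient
{
    private static readonly HttpClient Client = new HttpClient();

    // Calls the QueryBingNews function with a stock code or company name.
    public static async Task<string> QueryBingNewsAsync(string stockCode)
    {
        var url = "http://finwin.azurewebsites.net/api/QueryBingNews?query=" +
                  Uri.EscapeDataString(stockCode);
        var response = await Client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}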

Let’s start by spinning up a Bing Search service on Azure: simply go to Create a resource and search for Bing Search v7. It’s as easy as typing a name and specifying a pricing tier. Within seconds we will have a service with an API key for calling the Bing News API, which we can grab from the Keys section of the service.

Getting Started with the Bing Search API

Select the Quick start item on the left of the Bing Search v7 service. It displays a list of hyperlinks to all the Bing API references. Select the Bing News Search API reference; it provides everything you need to call Bing Search for business or finance news items.
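For reference, a raw call to the Bing News Search API looks roughly like the sketch below. The endpoint and the Ocp-Apim-Subscription-Key header come from the v7 API reference; the key placeholder is your own value from the Keys section. Our BingNewsService class, used later in this article, wraps a call like this.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class BingNewsExample
{
    public static async Task<string> SearchNewsAsync(string query)
    {
        using (var client = new HttpClient())
        {
            // API key from the Bing Search v7 service in the Azure portal.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-api-key>");

            var url = "https://api.cognitive.microsoft.com/bing/v7.0/news/search?q=" +
                      Uri.EscapeDataString(query);

            var response = await client.GetAsync(url);
            response.EnsureSuccessStatusCode();

            // The response is JSON with a "value" array of news articles.
            return await response.Content.ReadAsStringAsync();
        }
    }
}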

Building an API layer with Functions

Let’s create our own news API layer to sit in between and manage all news sources (internal and external) through a single API. Jump into Visual Studio, go to File -> New -> Project and search for the Azure Functions project template. Create a new project, and let’s add a new function to call the Bing API.

Note

It may seem pointless to create another API layer just to call another internal API, but the goal is to demonstrate an aggregation layer: a single news API that retrieves data from many different sources.

Right-click on the Functions folder in Solution Explorer and create a new function called QueryBingNews, then copy in the following code:

using System;
using System.Threading.Tasks;
using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Finwin.Backend.Services;

namespace Finwin.Backend.Functions
{
    public static class QueryBingNews
    {
        [FunctionName(nameof(QueryBingNews))]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = nameof(QueryBingNews))] HttpRequest req,
            ILogger log)
        {
            try
            {
                // Read the query from the query string, falling back to the request body.
                string query = req.Query["query"];
                string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
                dynamic data = JsonConvert.DeserializeObject(requestBody);
                query = query ?? data?.Query;

                using (var newsApi = new BingNewsService())
                {
                    var news = await newsApi.Query(query);

                    // Archive each news item to blob storage, tagged with the stock code.
                    using (var blobService = new BlobStorageService("news", "stock"))
                    {
                        foreach (var value in news.value)
                        {
                            value.StockCode = query;
                            await blobService.InsertItemAsync(value);
                        }
                    }
                    return new OkObjectResult("News Updated for query " + query);
                }
            }
            catch (Exception e)
            {
                log.LogError(e, "QueryBingNews failed");
                return new BadRequestObjectResult(e);
            }
        }
    }
}

It’s good practice to handle exceptions within each function by wrapping the code in try/catch blocks. The code takes a query string and calls the Bing News API; once we receive the news items, it inserts them into blob storage as an archive.
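With the Functions host running locally (F5 in Visual Studio), we can smoke-test the function from a browser or any HTTP client. A sketch, assuming the default local port and an example stock code:

http://localhost:7071/api/QueryBingNews?query=MSFT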

The blob storage service is going to act as an archive for news items. As we want this project to predict stocks, we want to keep a history of news and financials in an archive.

Adding a Blob Storage Service for archiving

Creating a storage account gives us the ability to create containers for storing blobs.

Next, let’s add a simple archive for all news items retrieved via the QueryBingNews API call. Jump back into the Azure portal, select Create a resource and search for storage account.

Note

It may be a little confusing at first to navigate around a storage account, but we are only focused on the Access keys section and the Blob service section. One storage account can handle everything from blobs to files, tables, and queues.

Inside the Blob service we will create separate containers to store the project’s archives.

Once we have our storage account up and running, it’s time to jump back into the Functions project in Visual Studio and build a service that connects to our container and inserts news items.

Create a new folder in the Functions project called Services and add a new file called BlobStorageService. The BlobStorageService class will implement the IDisposable interface, so we can wrap each set of calls to the blob storage container in a using block. This ensures all calls to the blob service are scoped individually, with objects created inside the using block cleaned up afterwards, so news items don’t linger in the Functions app’s memory. This is typical practice in .NET for good memory management. A skeleton of the class is sketched below.
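The article doesn’t show the class shell itself, so here is a minimal sketch of its shape. The field names follow the snippets that come next, and the constructor parameters mirror the new BlobStorageService("news", "stock") call in QueryBingNews; how those two arguments are used isn’t shown, so treat them as placeholders.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace Finwin.Backend.Services
{
    public class BlobStorageService : IDisposable
    {
        // Populated by the constructor code shown below.
        private CloudStorageAccount storageAccount;
        private CloudBlobContainer cloudBlobContainer;

        public BlobStorageService(string containerName, string blobPrefix)
        {
            // Connection-string parsing and container setup go here (shown below).
        }

        public void Dispose()
        {
            // Drop references so archived items don't linger in the Functions app's memory.
            cloudBlobContainer = null;
            storageAccount = null;
        }
    }
}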

Our first step is to create a connection to the storage account. We can do this in the constructor of the BlobStorageService:


// Read the connection string from app settings (local.settings.json when running locally).
string storageConnectionString = Environment.GetEnvironmentVariable("storage-account-connection-string");

// TryParse validates the connection string and populates the storageAccount field.
if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
{
    EnsureContainerConfigured();
}

Storage accounts expose two access keys, each with its own connection string, so keys can be rotated without downtime. We can obtain our connection string from the Access keys section of the storage account.
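A storage connection string has a well-known shape; the account name and key below are placeholders:

DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net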

The environment variable is set in the Application settings of the Functions service. To configure it, visit the Functions service Overview page and click Application settings under Configured features.

Note

When testing locally, simply copy and paste the value from the storage account into your local settings.
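For local testing that means adding the value to local.settings.json, which the Functions runtime surfaces as environment variables. A sketch; the key name matches the GetEnvironmentVariable call above, and the connection string is a placeholder:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "storage-account-connection-string": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
  }
}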

Then we create a CloudBlobClient that connects to the storage account and creates the container if it doesn’t exist. Once the container is created, we set an access permission on it; in this case we are going to set public access to make things easier. We suggest creating another function called EnsureContainerConfigured to separate the async logic from the constructor:

// Connect to the storage account and get a reference to the "news" container.
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
cloudBlobContainer = cloudBlobClient.GetContainerReference("news");

// Create the container only if it doesn't already exist.
await cloudBlobContainer.CreateIfNotExistsAsync();

// Allow public read access to blobs in the container.
BlobContainerPermissions permissions = new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
};
await cloudBlobContainer.SetPermissionsAsync(permissions);
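One wrinkle the article doesn’t spell out: constructors can’t be async, so EnsureContainerConfigured needs a Task-returning signature and the constructor has to bridge the gap. A minimal sketch of one common pattern:

// The snippet above becomes the body of this method.
private async Task EnsureContainerConfigured()
{
    // ...container creation and permissions as shown above...
}

// In the constructor, block until setup completes so the container
// exists before the first InsertItemAsync call:
EnsureContainerConfigured().GetAwaiter().GetResult();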

We can also assign access permissions through the portal, in the storage account’s Blob service section: right-click on the container of interest and select Access policy to assign the access permission.

Access policies allow public / private access to containers.

Now we need to add the InsertItemAsync function. In the BlobStorageService class add the following code:


public async Task InsertItemAsync<T>(T item) where T : BaseModel
{
    // Give each archived news item a unique blob name.
    var uniqueBlobName = string.Format("news_{0}", Guid.NewGuid().ToString());

    CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(uniqueBlobName);

    using (var ms = new MemoryStream())
    {
        // Serialise the item into the memory stream.
        BinaryFormatter binSerializer = new BinaryFormatter();
        binSerializer.Serialize(ms, item);

        // Rewind the stream before uploading, otherwise an empty blob is written.
        ms.Position = 0;
        await blob.UploadFromStreamAsync(ms);
    }
    // Exceptions propagate to the caller's try/catch (e.g. in QueryBingNews).
}

BinaryFormatter is used to serialise the object into a memory stream, which is then uploaded to the blob. We could also create a byte[] from the object instead, as sketched below. See more on blob storage upload here.
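For the byte[] route, a sketch: serialise into a MemoryStream, copy it out as an array, and upload with UploadFromByteArrayAsync.

// Alternative to UploadFromStreamAsync: upload a byte[] directly.
byte[] bytes;
using (var ms = new MemoryStream())
{
    new BinaryFormatter().Serialize(ms, item);
    bytes = ms.ToArray();
}
await blob.UploadFromByteArrayAsync(bytes, 0, bytes.Length);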

That’s all for the BlobStorageService class. Next we will see how to connect this new API back to our chat bot.
