In Voice Bot Assistance Part 1 - Microsoft.CognitiveServices.Speech configuration on Azure, I converted speech from my laptop's microphone to text and played that text back through my laptop's speaker. Now what I want is to set up a bot that can answer the questions I ask.
I did some research and found that there are many chat bots available you can use, like BotSharp and Microsoft Bot. But I want to create my own bot that answers from my knowledge base, so I have again selected a Microsoft Azure service - QnA Maker.
You can easily set this up by clicking here.
I created a QnA Maker service resource, then created a knowledge base and uploaded a sample chit-chat file. Once it is ready, click the Edit button and note these values:
These values are required to make a call from your code.
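For reference, this is roughly how I keep those values in the code. The variable names match what the GetAnswer method below uses; the placeholder values are only illustrations - take the real host, knowledge base id, and endpoint key from your own QnA Maker resource.

// Values noted from the QnA Maker portal - replace the placeholders with your own.
static string endpoint_host = "https://<your-qna-resource>.azurewebsites.net";
static string endpointService = "/qnamaker";
static string baseRoute = "/knowledgebases";
static string kbid = "<your-knowledge-base-id>";     // GUID of the knowledge base
static string endpoint_key = "<your-endpoint-key>";  // used in the Authorization header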
So once this is done, you can POST a question to get an answer. Below is the code I used to get the answer:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static async Task<string> GetAnswer(string question)
{
    // Build the generateAnswer URL for the knowledge base
    var uri = endpoint_host + endpointService + baseRoute + "/" + kbid + "/generateAnswer";

    using (var client = new HttpClient())
    using (var request = new HttpRequestMessage())
    {
        request.Method = HttpMethod.Post;
        request.RequestUri = new Uri(uri);

        // Serialize the question so quotes and special characters are escaped correctly
        request.Content = new StringContent(JsonConvert.SerializeObject(new { question }), Encoding.UTF8, "application/json");

        // NOTE: The header value contains the text 'EndpointKey ' with the trailing space
        request.Headers.Add("Authorization", "EndpointKey " + endpoint_key);

        var response = await client.SendAsync(request);
        var responseBody = await response.Content.ReadAsStringAsync();

        // Deserialize the response and return the top-ranked answer
        QNAResponse qnaResponse = JsonConvert.DeserializeObject<QNAResponse>(responseBody);
        return qnaResponse.answers[0].answer;
    }
}
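The code deserializes the JSON response into a QNAResponse object. As a minimal sketch, the class can look like this - only the fields read above are shown, and the actual generateAnswer response (and my repository) contains more properties, such as the matched questions and a confidence score per answer:

// Minimal shape of the generateAnswer response - only what GetAnswer reads.
public class QNAResponse
{
    public Answer[] answers { get; set; }
}

public class Answer
{
    public string answer { get; set; }
    public double score { get; set; }
}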
So now my workflow for Voice Bot Assistance is transitioning like this:
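Roughly, the pieces connect as in the sketch below. This assumes the Speech subscription key and region from Part 1 and the GetAnswer method above; the exact wiring in the published repository may differ slightly.

using Microsoft.CognitiveServices.Speech;

// Microphone -> Speech-to-Text -> QnA Maker -> Text-to-Speech -> Speaker
var speechConfig = SpeechConfig.FromSubscription("<speech-key>", "<speech-region>");

using (var recognizer = new SpeechRecognizer(speechConfig))
using (var synthesizer = new SpeechSynthesizer(speechConfig))
{
    // 1. Listen on the default microphone and convert the question to text (Part 1)
    var recognized = await recognizer.RecognizeOnceAsync();
    Console.WriteLine("You asked: " + recognized.Text);

    // 2. Ask the QnA Maker knowledge base for the answer
    string answer = await GetAnswer(recognized.Text);
    Console.WriteLine("Bot answered: " + answer);

    // 3. Speak the answer back through the default speaker (Part 1)
    await synthesizer.SpeakTextAsync(answer);
}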

The whole code is published at the location below:
https://github.com/LALITAMITTAL18/VoiceBotAssistance
Make sure to change the configuration values to those of your own Azure services subscription to see your bot working.
Thank you for reading and enjoy your chit chat BOT :-)