This repository has been archived by the owner on Jun 30, 2022. It is now read-only.

Update style on handbook/overviews to use ** instead of ` in classes, filenames
ryanisgrig committed Oct 18, 2019
1 parent b80770c commit 35e2c57
Showing 16 changed files with 144 additions and 160 deletions.
92 changes: 39 additions & 53 deletions docs/_docs/overview/virtual-assistant-template.md

Large diffs are not rendered by default.

14 changes: 7 additions & 7 deletions docs/_docs/skills/handbook/architecture.md
@@ -17,7 +17,7 @@ Within an Enterprise, this could be creating one parent bot bringing together mu

Skills are themselves Bots, invoked remotely and a Skill developer template (.NET, TS) is available to facilitate creation of new Skills.

A key design goal for Skills was to maintain the consistent Activity protocol and ensure the development experience was as close to any normal V4 SDK bot as possible. To that end, a Bot simply starts a **SkillDialog** which abstracts the skill invocation mechanics.

## Invocation Flow

@@ -36,21 +36,21 @@ When the user of a Virtual Assistant asks a question, the Dispatcher will proces

> When testing a Virtual Assistant using the Emulator, the SkillDialog surfaces Skill invocation and slot-filling telemetry.

On start-up of a Virtual Assistant, each registered Skill results in a SkillDialog instance being created which is associated with a **SkillManifest** instance containing details about the Skill, including its endpoint, actions, and slots.
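This start-up relationship can be sketched in TypeScript (the Skill template is also available for TS). The interface shapes below are illustrative stand-ins, not the SDK's actual types:

```typescript
// Illustrative shapes only -- the real SkillManifest type lives in the SDK.
interface SlotDefinition {
  name: string;
  types: string[];
}

interface SkillAction {
  id: string;
  slots: SlotDefinition[];
}

interface SkillManifest {
  id: string;
  endpoint: string;
  actions: SkillAction[];
}

// Stand-in for SkillDialog: holds the manifest it will use to call the Skill.
class SkillDialog {
  constructor(public readonly manifest: SkillManifest) {}
}

// On start-up, create one SkillDialog per registered Skill manifest.
function registerSkills(manifests: SkillManifest[]): Map<string, SkillDialog> {
  const dialogs = new Map<string, SkillDialog>();
  for (const m of manifests) {
    dialogs.set(m.id, new SkillDialog(m));
  }
  return dialogs;
}
```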

All communication between a Virtual Assistant and a Skill is performed through a custom **SkillDialog**, which is started when the dispatcher identifies a Skill that maps to a user's utterances. Skills are invoked through a lightweight **SkillWebSocket** or **SkillHttp** adapter, maintaining the standard Bot communication protocol and ensuring Skills can be developed using the standard Bot Framework toolkit.

The **SkillManifest** provides the endpoint for the SkillDialog to communicate with along with action and slot information. Slots are optional and a way to pass parameters to a Skill.

When a Skill wants to terminate an ongoing dialog, it sends back an Activity with the **Handoff** type to signal the completion of the current dialog.

See the [SkillAuthentication]({{site.baseurl}}/reference/skills/skillauthentication) section for information on how Bot->Skill invocation is secured.

## Skill Middleware

The **SkillMiddleware** is used by each Skill and is configured automatically if you use the Skill Template.

The middleware consumes the **skill/cancelallskilldialogs** event; when the Skill receives it, it clears out the active dialog stack on that active Skill. This is useful for interruptions: if a user asks to cancel, a Virtual Assistant can send this event to the Skill and cancel the active dialog.
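The middleware's behavior can be sketched in plain TypeScript, with simplified stand-ins for the SDK's middleware and dialog-stack types (only the **skill/cancelallskilldialogs** event name comes from the text; the rest is illustrative):

```typescript
interface Activity {
  type: "message" | "event";
  name?: string;
  text?: string;
}

// Simplified stand-in for the Skill's per-conversation dialog stack.
class DialogStack {
  private stack: string[] = [];
  push(dialogId: string): void { this.stack.push(dialogId); }
  get depth(): number { return this.stack.length; }
  clear(): void { this.stack = []; }
}

// Sketch of SkillMiddleware: if the incoming activity is the
// "skill/cancelallskilldialogs" event, clear the active dialog stack;
// otherwise pass the activity through to the next handler.
function skillMiddleware(
  activity: Activity,
  stack: DialogStack,
  next: (activity: Activity) => void
): void {
  if (activity.type === "event" && activity.name === "skill/cancelallskilldialogs") {
    stack.clear();
    return; // the cancel event itself is not forwarded
  }
  next(activity);
}
```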

## Interrupting Active Skills

34 changes: 17 additions & 17 deletions docs/_docs/skills/handbook/authentication.md
@@ -38,13 +38,13 @@

```csharp
var credentials = new MicrosoftAppCredentialsEx(settings.MicrosoftAppId, setting
skillDialogs.Add(new SkillDialog(skill, credentials, telemetryClient, userState, authDialog));
```

The **MicrosoftAppCredentialsEx** class provided within the Microsoft.Bot.Builder.Skills package is the central place to manage the information needed for the skill to obtain the AAD token. Once you pass this into the SkillDialog, the SkillDialog will be able to use it to properly retrieve the AAD token. This behavior is the default behavior if you create a Virtual Assistant out of the Virtual Assistant Template VSIX.

## Whitelist Authentication

After the JWT token is verified, the Skill bot needs to verify whether the request comes from a bot that's previously included in a whitelist. A Skill needs to have knowledge of its callers and give permissions to that bot explicitly, instead of to any bot that could call the Skill. This level of authorization is enabled by default as well, making sure a Skill is well protected from public access. Developers need to do the following to implement the whitelist mechanism:

Declare a class **WhiteListAuthProvider** in the bot service project that implements the interface **IWhitelistAuthenticationProvider**:

```csharp
public HashSet<string> AppsWhitelist
```

By adding the Microsoft App id of the Virtual Assistant that's calling the Skill into the property AppsWhitelist, you are allowing the bot that's associated with that app id to invoke your skill.
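Conceptually, the whitelist check is just a set-membership test on the caller's Microsoft App ID. A minimal TypeScript sketch (not the SDK's actual interface):

```typescript
// Sketch of whitelist authorization: the Skill holds a set of app IDs
// that are explicitly allowed to invoke it.
class WhitelistAuthProvider {
  constructor(private readonly appsWhitelist: Set<string>) {}

  // Returns true only if the calling bot's app ID is in the whitelist.
  isAuthorized(callerAppId: string): boolean {
    return this.appsWhitelist.has(callerAppId);
  }
}
```

Any caller whose app ID is not in the set is rejected, which is what makes the Skill closed to public access by default.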

In **Startup.cs**, register a singleton of the interface with this class:

```csharp
// Register WhiteListAuthProvider
services.AddSingleton<IWhitelistAuthenticationProvider, WhiteListAuthProvider>();
```

In **BotController.cs** (derived from **SkillController**), add the class as a new parameter to the constructor:

```csharp
public BotController(
```

With all these changes in place, you're enabling your Skill to allow bots to inv

To ensure a standardized user experience across all Skills, the parent Bot is responsible for managing token requests. This helps to ensure that tokens common across multiple Skills can be shared and the user isn't prompted to authenticate for every Skill.
When a token isn't already cached (e.g. first time use) the following flow occurs:
- When a Skill requests a token, it asks the calling Bot for a token using an event called **tokens/request**
- The Skill starts an EventPrompt, waiting for an Event called **tokens/response** to be returned
- The Bot makes use of an OAuthPrompt to surface a prompt to the user
- When a token is retrieved, it's returned to the Bot within a **tokens/response** activity, which is used to complete the OAuthPrompt and store the token securely
- The same event is then forwarded to the Skill through the SkillDialog on the stack and provides a token for the Skill to use
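The steps above can be sketched as a small TypeScript class. Apart from the **tokens/request** and **tokens/response** event names, which the text defines, every name here is illustrative:

```typescript
interface EventActivity {
  type: "event";
  name: string;
  value?: string;
}

// Stand-in for the parent Bot: answers a "tokens/request" event with a
// "tokens/response" event, prompting the user only when no token is cached.
class ParentBot {
  private tokenCache = new Map<string, string>();

  constructor(private readonly promptUser: (connection: string) => string) {}

  handleTokenRequest(connection: string): EventActivity {
    let token = this.tokenCache.get(connection);
    if (!token) {
      token = this.promptUser(connection); // OAuthPrompt in the real flow
      this.tokenCache.set(connection, token);
    }
    return { type: "event", name: "tokens/response", value: token };
  }
}
```

Because the token is cached at the parent Bot, a second Skill requesting the same connection gets the cached token back without re-prompting the user, which is the design goal described above.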

![Initial authentication flow for Skills]({{site.baseurl}}/assets/images/virtualassistant-SkillAuthInitialFlow.png)
@@ -107,29 +107,29 @@ If you wish to make use of the Calendar, Email and Task Skills standalone to the
The [Add Authentication to your bot](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-authentication?view=azure-bot-service-4.0&tabs=aadv1%2Ccsharp%2Cbot-oauth) section in the Azure Bot Service documentation covers how to configure authentication in more detail. However, in this scenario, the automated deployment step for the Skill has already created the **Azure AD v2 Application** for your Bot, and you instead only need to follow these instructions:

- Navigate to the Azure Portal, click Azure Active Directory, and then **App Registrations**
- Find the Application that's been created for your Bot as part of the deployment. You can search for the application by name or ApplicationID, but note that search only works across the applications currently shown; the one you need may be on a separate page.
- Click API permissions on the left-hand navigation
- Select Add Permission to show the permissions pane
- Select **Microsoft Graph**
- Select Delegated Permissions and then add each of the permissions required for the Productivity Skills you are adding (see each Skill's documentation page for the specific scopes required)
- Click Add Permissions at the bottom to apply the changes.

Next you need to create the Authentication Connection for your Bot. Within the Azure Portal, find the **Web App Bot** resource created when you deployed your Bot and choose **Settings**.

- Scroll down to the OAuth Connection Settings section.
- Click **Add Setting**
- Type in the name of your Connection Setting - e.g. **Outlook**
- Choose **Azure Active Directory v2** from the Service Provider drop-down
- Open the **appSettings.config** file for your Skill
- Copy/Paste the value of **microsoftAppId** into the ClientId setting
- Copy/Paste the value of **microsoftAppPassword** into the Client Secret setting
- Set Tenant Id to common
- Set scopes to match the ones provided in the earlier step.

![Manual Auth Connection]({{site.baseurl}}/assets/images/manualauthconnection.png)

Finally, open the **appSettings.config** file for your Skill and update the connection name to match the one provided in the previous step.

```
"oauthConnections": [
```
18 changes: 9 additions & 9 deletions docs/_docs/skills/handbook/best-practices.md
@@ -38,13 +38,13 @@ If there is an utterance that you expect would be applied to multiple Skills, ta
### Update LUIS model
{:.no_toc}

You can update your LUIS model in the LUIS portal. Alternatively, modify the **.lu** file, convert it to **.json**, and upload it to the LUIS portal manually, or use **update_cognitive_models.ps1**.

How to convert **.json** to **.lu**:
```bash
ludown refresh -i YOUR_BOT_NAME.json
```
How to convert **.lu** to **.json**:
```bash
ludown parse toluis --in YOUR_BOT_NAME.lu
```
@@ -68,7 +68,7 @@ Consider the multiple layers of communication a user may have with a Skill on th
#### Speech & Text
{:.no_toc}

Speech & Text responses are stored in **.json** files, and offer the ability to provide a variety of responses and set the input hint on each Activity.

```json
{
}
```

Vary your responses. By providing additional utterances to the **replies** array, your Skill will sound more natural and provide a dynamic conversation.
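Varying responses amounts to picking one entry from the replies array on each turn. A TypeScript sketch, with the picker injected so it stays deterministic under test (the `Reply` shape is an assumption, not the exact response-file schema):

```typescript
interface Reply {
  text: string;
  speak: string;
}

// Pick one reply from the response's "replies" array. The picker defaults to
// uniform random, but can be injected (e.g. for tests).
function chooseReply(
  replies: Reply[],
  pick: (max: number) => number = (max) => Math.floor(Math.random() * max)
): Reply {
  if (replies.length === 0) {
    throw new Error("response must define at least one reply");
  }
  return replies[pick(replies.length)];
}
```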

Write how people speak. A skill should only provide relevant context when read aloud. Use visual aids to offer more data to a user.

#### Common string
{:.no_toc}

Some common strings shouldn't be saved in a response file. Consider saving them in a **.resx** file instead, which makes them easy to localize.

#### Visual
{:.no_toc}
```csharp
protected async Task<bool> ChoiceValidator(PromptValidatorContext<FoundChoice> p
}
```

If you need a more complex prompt, you can implement it by inheriting from **Microsoft.Bot.Builder.Dialogs.Prompt<T>**, or read [Create your own prompts to gather user input](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-primitive-prompts?view=azure-bot-service-4.0&tabs=csharp) to learn more about custom prompts.
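At its core, a prompt validator of this kind is a predicate over the recognized value. A TypeScript sketch outside the SDK types (the SDK's validators are async; this is synchronous for brevity, and `PromptContext` is a simplified stand-in for `PromptValidatorContext<T>`):

```typescript
// Stand-in for PromptValidatorContext<T>: carries the recognized value, if any.
interface PromptContext<T> {
  recognized: { succeeded: boolean; value?: T };
}

// A choice validator: accept the prompt result only when recognition
// succeeded and the value passes an extra predicate.
function choiceValidator(
  pc: PromptContext<string>,
  isAllowed: (choice: string) => boolean
): boolean {
  if (!pc.recognized.succeeded || pc.recognized.value === undefined) {
    return false;
  }
  return isAllowed(pc.recognized.value);
}
```

Returning `false` causes the prompt to re-prompt the user, which is how validators enforce constraints beyond basic recognition.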

### Enable long running tasks
{:.no_toc}
@@ -463,7 +463,7 @@ This enables a Skill to have more intelligent interactions with a user, triggere
### Handle and log errors
{:.no_toc}

Use the **HandleDialogExceptions** method in [SkillDialogBase.cs]({{site.repo}}/blob/master/templates/Skill-Template/csharp/Sample/SkillSample/Dialogs/SkillDialogBase.cs) to send a trace back to the [Bot Framework Emulator](https://aka.ms/botframework-emulator), logging the exception, and sending a friendly error response to the user.

```csharp
protected async Task HandleDialogExceptions(WaterfallStepContext sc, Exception ex)
```

Save your data in the appropriate state scopes. Read [Save user and conversation data](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-howto-v4-state?view=azure-bot-service-4.0&tabs=csharp) to learn about user and conversation state.

For dialog state, you can save your data in **stepContext.State.Dialog[YOUR_DIALOG_STATE_KEY]**.
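Dialog-scoped storage behaves like a per-dialog key/value bag, keyed by your own state key. A small TypeScript sketch of that idea (the class and key names are illustrative, not SDK types):

```typescript
// Stand-in for stepContext.State.Dialog: a per-dialog key/value bag.
class DialogState {
  private readonly values = new Map<string, unknown>();

  set(key: string, value: unknown): void {
    this.values.set(key, value);
  }

  get<T>(key: string): T | undefined {
    return this.values.get(key) as T | undefined;
  }
}
```

Each dialog on the stack gets its own bag, so values stored here disappear when the dialog ends, unlike user or conversation state.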

### Manage the dialogs
{:.no_toc}
8 changes: 4 additions & 4 deletions docs/_docs/skills/handbook/manifest.md
Expand Up @@ -14,7 +14,7 @@ The Skill manifest enables Skills to be self-describing in that they communicate

This manifest provides all of the metadata required for a calling Bot to know when to trigger invoking a skill and what actions it provides. The manifest is used by the Skill command-line tool to configure a Bot to make use of a Skill.

Each skill exposes a manifest endpoint enabling easy retrieval of its manifest; this can be found on the following URI path of your skill: **/api/skill/manifest**
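For illustration, a manifest retrieved from that endpoint might look like the sketch below. The top-level sections mirror those described on this page (**authenticationConnections**, **actions** with slots), but the field names and values here are illustrative approximations, not the exact schema:

```json
{
  "id": "calendarSkill",
  "name": "Calendar Skill",
  "endpoint": "https://your-skill.azurewebsites.net/api/skill/messages",
  "authenticationConnections": [
    {
      "id": "Outlook",
      "serviceProviderId": "Azure Active Directory v2",
      "scopes": "Calendars.ReadWrite"
    }
  ],
  "actions": [
    {
      "id": "createEvent",
      "definition": {
        "slots": [
          { "name": "title", "types": [ "string" ] }
        ]
      }
    }
  ]
}
```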

## Manifest structure

@@ -55,7 +55,7 @@ The manifest header provides high level information relating to your skill, the
### Authentication Connections
{:.no_toc}

The **authenticationConnections** section communicates which authentication providers your skill supports, if any. For example, a Calendar skill might support both Outlook and Google, enabling it to function with either provider depending on the user's choice. The caller can then use this information to automatically configure the Authentication connection or, as required, enable a manual step to be performed.

Parameter | Description | Required
--------- | ----------- | --------
@@ -81,7 +81,7 @@ The `authenticationConnections` section communicates which authentication provid
### Actions
{:.no_toc}

The **actions** section describes the discrete actions (features) that a given Skill supports. Each action can optionally provide slots (parameters) that the caller may choose to pass, or alternatively omit and pass the utterance for the Skill to perform its own slot filling. Slot filling on the client side can enable a Skill to be invoked without requiring any further input or turns from the end user.

Parameter | Description | Required
--------- | ----------- | --------
@@ -153,7 +153,7 @@ Utterances can also be provided in-line with the skill manifest as shown below.
}
```

Both **utteranceSources** and **utterances** support multiple locales, enabling you to express the locales your Skill supports.
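An illustrative fragment of locale-keyed in-line utterances (the shape is approximated from the description, not the exact schema):

```json
"utterances": [
  {
    "locale": "en",
    "text": [ "put an event on my calendar" ]
  },
  {
    "locale": "de",
    "text": [ "trage einen termin in meinen kalender ein" ]
  }
]
```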

### Example Skill Manifest
{:.no_toc}
Expand Down
