
Class DialogflowConversation<TConvData, TUserStorage, TContexts>

Type parameters

  • TConvData

  • TUserStorage

  • TContexts: Contexts
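
For example, TConvData and TUserStorage can be supplied when creating the app. A minimal TypeScript sketch (the interface names ConvData and UserStorage are illustrative):

interface ConvData {
  count?: number
}

interface UserStorage {
  name?: string
}

const app = dialogflow<ConvData, UserStorage>()

app.intent('Default Welcome Intent', conv => {
  // conv is a DialogflowConversation<ConvData, UserStorage>
  conv.data.count = (conv.data.count || 0) + 1
  conv.ask(`Welcome! This is visit number ${conv.data.count} this session.`)
})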

Hierarchy

Constructors

constructor

Properties

action

action: string

Get the current Dialogflow action name.

example

app.intent('Default Welcome Intent', conv => {
  const action = conv.action
})

arguments

arguments: Arguments

available

available: Available

body

canvas

canvas: Canvas

contexts

contexts: ContextValues<TContexts>

data

data: TConvData

The session data in JSON format. Stored using contexts.

example

app.intent('Default Welcome Intent', conv => {
  conv.data.someProperty = 'someValue'
})

device

device: Device

digested

digested: boolean = false

expectUserResponse

expectUserResponse: boolean = true

headers

headers: Headers

id

id: string

Gets the unique conversation ID. It's a new ID for the initial query, and stays the same until the end of the conversation.

example

app.intent('actions.intent.MAIN', conv => {
  const conversationId = conv.id
})

incoming

incoming: Incoming

input

input: Input

intent

intent: string

Get the current Dialogflow intent name.

example

app.intent('Default Welcome Intent', conv => {
  const intent = conv.intent // will be 'Default Welcome Intent'
})

noInputs

noInputs: (string | SimpleResponse)[] = []

Set reprompts for when users don't provide input to this action (no-input errors). Each reprompt is represented as a SimpleResponse, but raw strings can also be specified for convenience (they're passed to the SimpleResponse constructor). Note that this value is not persisted across conversation turns, so the reprompts must be set for each conversation response.

example

app.intent('actions.intent.MAIN', conv => {
  conv.noInputs = [
    'Are you still there?',
    'Hello?',
    new SimpleResponse({
      text: 'Talk to you later. Bye!',
      speech: '<speak>Talk to you later. Bye!</speak>'
    })
  ]
  conv.ask("What's your favorite color?")
})

parameters

parameters: Parameters

The Dialogflow parameters from the current intent. Values will only be a string, an Object, or undefined if not included.

The parameters are also passed as the intent handler's second argument, which is the encouraged way to retrieve them.

example

// Encouraged method through intent handler
app.intent('Tell Greeting', (conv, params) => {
  const color = params.color
  const num = params.num
})

// Encouraged method through destructuring in intent handler
app.intent('Tell Greeting', (conv, { color, num }) => {
  // now use color and num as variables
})

// Using conv.parameters
app.intent('Tell Greeting', conv => {
  const parameters = conv.parameters
  // or destructured
  const { color, num } = conv.parameters
})

query

query: string

The user's raw input query.

example

app.intent('User Input', conv => {
  conv.close(`You said ${conv.query}`)
})

request

responses

responses: Response[] = []

sandbox

sandbox: boolean

True if the app is being tested in sandbox mode. Enable sandbox mode in the Actions console to test transactions.
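
For instance, the flag can be used to branch test behavior. A minimal sketch (the intent name 'Transaction Check' is illustrative):

app.intent('Transaction Check', conv => {
  if (conv.sandbox) {
    conv.ask('Sandbox mode is on, so no real charges will be made. Ready to continue?')
  } else {
    conv.ask('Sandbox mode is off, so this transaction will be real. Ready to continue?')
  }
})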

screen

screen: boolean

speechBiasing

speechBiasing: string[] = []

Sets speech biasing options.

example

app.intent('actions.intent.MAIN', conv => {
  conv.speechBiasing = ['red', 'blue', 'green']
  conv.ask('What is your favorite color out of red, blue, and green?')
})

surface

surface: Surface

type

type: Api.GoogleActionsV2ConversationType

user

user: User<TUserStorage>

Gets the User object. The user object contains information about the user, including a string identifier and personal information (requires requesting permissions, see conv.ask(new Permission)).
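
For instance, personal information can be requested with a Permission response. A minimal sketch (Permission is imported from the library; the 'Get Permission' intent is assumed to be configured in Dialogflow with the actions_intent_PERMISSION event):

app.intent('Default Welcome Intent', conv => {
  conv.ask(new Permission({
    context: 'To address you by name',
    permissions: 'NAME',
  }))
})

app.intent('Get Permission', (conv, params, granted) => {
  if (granted) {
    conv.close(`Nice to meet you, ${conv.user.name.display}!`)
  } else {
    conv.close('No problem, welcome anyway!')
  }
})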

version

version: number

Methods

add

ask

  • Asks to collect the user's input. All of the user's queries need to be sent to the app. The guidelines for prompting the user for a response must be followed at all times.

    example
    
    // Actions SDK
    const app = actionssdk()
    
    app.intent('actions.intent.MAIN', conv => {
      const ssml = '<speak>Hi! <break time="1s"/> ' +
        'I can read out an ordinal like <say-as interpret-as="ordinal">123</say-as>. ' +
        'Say a number.</speak>'
      conv.ask(ssml)
    })
    
    app.intent('actions.intent.TEXT', (conv, input) => {
      if (input === 'bye') {
        return conv.close('Goodbye!')
      }
      const ssml = `<speak>You said, <say-as interpret-as="ordinal">${input}</say-as></speak>`
      conv.ask(ssml)
    })
    
    // Dialogflow
    const app = dialogflow()
    
    app.intent('Default Welcome Intent', conv => {
      conv.ask('Welcome to action snippets! Say a number.')
    })
    
    app.intent('Number Input', (conv, {num}) => {
      conv.close(`You said ${num}`)
    })

    Parameters

    • Rest responses: Response[]

      A response fragment for the library to construct a single complete response

    Returns this

close

  • Have Assistant render the speech response and close the mic.

    example
    
    // Actions SDK
    const app = actionssdk()
    
    app.intent('actions.intent.MAIN', conv => {
      const ssml = '<speak>Hi! <break time="1s"/> ' +
        'I can read out an ordinal like <say-as interpret-as="ordinal">123</say-as>. ' +
        'Say a number.</speak>'
      conv.ask(ssml)
    })
    
    app.intent('actions.intent.TEXT', (conv, input) => {
      if (input === 'bye') {
        return conv.close('Goodbye!')
      }
      const ssml = `<speak>You said, <say-as interpret-as="ordinal">${input}</say-as></speak>`
      conv.ask(ssml)
    })
    
    // Dialogflow
    const app = dialogflow()
    
    app.intent('Default Welcome Intent', conv => {
      conv.ask('Welcome to action snippets! Say a number.')
    })
    
    app.intent('Number Input', (conv, {num}) => {
      conv.close(`You said ${num}`)
    })

    Parameters

    • Rest responses: Response[]

      A response fragment for the library to construct a single complete response

    Returns this

followup

  • followup(event: string, parameters?: Parameters, lang?: undefined | string): this
  • Triggers an intent of your choosing by sending a followup event from the webhook. The final response can theoretically include responses, but these will not be handled by Dialogflow. Dialogflow will not pass anything back to Google Assistant; therefore, Google Assistant-specific information, most notably conv.user.storage, is ignored.

    example
    
    const app = dialogflow()
    
    // Create a Dialogflow intent with event 'apply-for-license-event'
    
    app.intent('Default Welcome Intent', conv => {
      conv.followup('apply-for-license-event', {
        date: new Date().toISOString(),
      })
      // The dialogflow intent with the 'apply-for-license-event' event
      // will be triggered with the given parameters `date`
    })

    Parameters

    • event: string

      Name of the event

    • Optional parameters: Parameters

      Parameters to send with the event

    • Optional lang: undefined | string

      The language of this query. See Language Support for a list of the currently supported language codes. Note that queries in the same session do not necessarily need to specify the same language. By default, it is the languageCode sent with Dialogflow's queryResult.languageCode

    Returns this

json

  • json<T>(json: T): this
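
    A minimal sketch (assuming json overrides the response the library would otherwise build with the raw JSON provided; the intent name and payload are illustrative):

    app.intent('Raw Response', conv => {
      conv.json({
        fulfillmentText: 'Hello from a raw JSON response!',
      })
    })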

response

  • response(): ConversationResponse

serialize

Generated using TypeDoc