
Actuation

interface Actuation

The Actuation service contains the actions and objects needed to make the robot move:

  • the robot and gaze frames
  • the factory for Animation objects
  • the factories for GoTo, Animate and LookAt actions

Since: 1

Types

  • interface Async – Asynchronous version of Actuation functions.

Functions

async

abstract fun async(): Async!

gazeFrame

Retrieve the gaze Frame.

abstract fun gazeFrame(): Frame!

makeAnimate

abstract fun makeAnimate(context: RobotContext!, animation: Animation!): Animate!

Create a new Animate action on the robot.
Parameters:

  • context: RobotContext! – A qi.context.Context.
  • animation: Animation! – The Animation to play.

Return: Animate! – An Animate action.
Since: 1

makeAnimation

abstract fun makeAnimation(anims: MutableList<String!>!): Animation!

Create a new Animation object on the robot.
Parameters:

  • anims: MutableList<String!>! – List of animation resource names.

Return: Animation! – A new Animation object.
Since: 1

makeCroppedAnimation

abstract fun makeCroppedAnimation(anim: Animation!, beginTime: Long!, endTime: Long!): Animation!

Create a new Animation object on the robot by cropping an existing animation.
Parameters:

  • anim: Animation! – The original animation to crop.
  • beginTime: Long! – Starting time (in milliseconds) of the cropped animation.
  • endTime: Long! – Ending time (in milliseconds) of the cropped animation.

Return: Animation! – The new cropped Animation.
Since: 1

makeEnforceTabletReachability

abstract fun makeEnforceTabletReachability(context: RobotContext!): EnforceTabletReachability!

Create an EnforceTabletReachability action on the robot.
Parameters:

  • context: RobotContext! – A qi.context.Context.

Return: EnforceTabletReachability! – An EnforceTabletReachability action.
Since: 1

makeGoTo

abstract fun makeGoTo(context: RobotContext!, target: Frame!): GoTo!

Create a new GoTo action on the robot. For more control over the GoTo behavior, prefer using makeGoTo(context, target, config).
Parameters:

  • context: RobotContext! – A qi.context.Context.
  • target: Frame! – A Frame representing the target to reach.

Return: GoTo! – A GoTo action.
Since: 1
abstract fun makeGoTo(context: RobotContext!, target: Frame!, config: GoToConfig!): GoTo!

Create a new GoTo action on the robot.
Parameters:

  • context: RobotContext! – A qi.context.Context.
  • target: Frame! – A Frame representing the target to reach.
  • config: GoToConfig! – The GoTo configuration. If not explicitly set by the user, GoTo will use:
    • a maximum navigating speed of 0.35 m/s;
    • a GetAroundObstacles path planning policy;
    • a FreeOrientation final orientation policy.

Return: GoTo! – A GoTo action.
Since: 6
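The two overloads can be combined as in the following sketch. This is not authoritative: `actuation`, `robotContext` and `targetFrame` are assumed to exist already, and the `GoToConfig` setter name is an assumption not documented above.

```kotlin
// Hypothetical sketch: a GoTo with an explicit configuration.
val config = GoToConfig()
config.setMaxNavigatingSpeed(0.3f)  // assumed setter; stays below the 0.35 m/s default
val goTo: GoTo = actuation.makeGoTo(robotContext, targetFrame, config)
goTo.async().run()                  // returns a Future<Void>; cancel it to stop the movement
```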

makeLookAt

abstract fun makeLookAt(context: RobotContext!, target: Frame!): LookAt!

Create a LookAt action on the robot.
Parameters:

  • context: RobotContext! – A qi.context.Context.
  • target: Frame! – The target frame to look at.

Return: LookAt! – A LookAt action.
Since: 1

robotFrame

Retrieve the robot Frame.

abstract fun robotFrame(): Frame!
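A minimal sketch of using the service, assuming a `qiContext: QiContext` obtained from the hosting activity and a `headFrame` obtained elsewhere (e.g. from human awareness):

```kotlin
// Sketch: retrieve the Actuation service and make the robot look at a frame.
val actuation: Actuation = qiContext.actuation
val robotFrame: Frame = actuation.robotFrame()
val lookAt: LookAt = actuation.makeLookAt(qiContext.robotContext, headFrame)
lookAt.async().run()  // Future<Void>; request cancellation to stop looking
```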

ActuationConverter

(No additional description provided.)

interface ActuationConverter


Age

@QiStruct open class Age

Structure representing an age.

Since: 1

Constructors

  • Age(years: Int): Create a new Age.

Functions

  • equals – open fun equals(other: Any?): Boolean
  • getYears – open fun getYears(): Int – The number of years.
  • hashCode – open fun hashCode(): Int
  • setYears – open fun setYears(years: Int): Unit – Set the number of years.
  • toString – open fun toString(): String

Animate

interface Animate

Action to play animations on the robot.

Since: 1

Types

  • interface Async – Asynchronous version of Animate functions.

Functions

async

abstract fun async(): Animate.Async!

run

Start the animation.

abstract fun run(): Future<Void>!

Return: Future<Void>! – A future that completes when the animation finishes running.
Since: 1

setOnStartedListener

Set a listener for when the Animate action starts.

abstract fun setOnStartedListener(listener: Animate.OnStartedListener!): Unit

(Deprecated.)


AnimateBuilder

open class AnimateBuilder

Build a new Animate.

Functions

build

open fun build(): Animate!

Return a configured instance of Animate.
Return: Animate! – The built Animate action.

buildAsync

open fun buildAsync(): Future<Animate>!

Return a configured instance of Animate (asynchronously).
Return: Future<Animate>! – A future that will complete with the built Animate action.

with

open static fun with(context: QiContext!): AnimateBuilder!

Create a new builder from the QiContext.
Parameters:

  • context: QiContext! – The QiContext providing the environment for the Animate action.

Return: AnimateBuilder! – A new builder instance.

withAnimation

open fun withAnimation(animation: Animation!): AnimateBuilder!

Configure the animation to be used by the Animate action.
Parameters:

  • animation: Animation! – The animation to play.

Return: AnimateBuilder! – The builder (for chaining).
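A typical build-and-run sequence, sketched under the assumption that an `animation` has already been created (e.g. with AnimationBuilder) and that `qiContext` is available:

```kotlin
// Sketch: build an Animate for an existing Animation and play it.
val animate: Animate = AnimateBuilder.with(qiContext)
    .withAnimation(animation)
    .build()
animate.async().run()
    .thenConsume { future ->
        if (future.isSuccess) { /* the animation finished running */ }
    }
```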

Animation

interface Animation

Object representing a robot animation. An animation can be composed of gestures performed by the robot’s limbs and head, and/or trajectories performed by the robot base.

Since: 1

Types

  • interface Async – Asynchronous version of Animation functions.

Functions

async

abstract fun async(): Animation.Async!

duration

Get the duration of the animation, in milliseconds.

abstract fun duration(): Long!

Return: Long! – Duration in milliseconds.


AnimationBuilder

open class AnimationBuilder

Build a new Animation.

Functions

build

open fun build(): Animation!

Return a configured instance of Animation.
Return: Animation! – The built Animation object.

buildAsync

open fun buildAsync(): Future<Animation>!

Return a configured instance of Animation (asynchronously).
Return: Future<Animation>! – A future that will complete with the built Animation.

with

open static fun with(context: QiContext!): AnimationBuilder!

Create a new builder from the QiContext.
Parameters:

  • context: QiContext! – The QiContext.

Return: AnimationBuilder! – A new builder instance.

withResources

open fun withResources(resources: List<String!>!): AnimationBuilder!

Configure the animation resources.
Parameters:

  • resources: List<String!>! – A list of animation resource names (as strings).

Return: AnimationBuilder! – The builder (for chaining).
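A build sketch following the signatures above (resource names as strings); "dance_b001" is a hypothetical resource name:

```kotlin
// Sketch: build an Animation from resource names, then inspect it.
val animation: Animation = AnimationBuilder.with(qiContext)
    .withResources(listOf("dance_b001"))
    .build()
val durationMs: Long = animation.duration()  // duration in milliseconds
```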

AnyObjectProvider

interface AnyObjectProvider

Interface for objects that can provide an AnyObject.


AnyObjectProxyAsync

interface AnyObjectProxyAsync

(No additional description provided.)


AnyObjectProxyConverter

interface AnyObjectProxyConverter

(No additional description provided.)


AnyObjectProxySync

interface AnyObjectProxySync

(No additional description provided.)


AnyObjectWrapper

open class AnyObjectWrapper

Parent class for QiService objects that run on the tablet.


AnyObjectWrapperConverter

interface AnyObjectWrapperConverter

Converter for AnyObjectWrapper objects.


ApproachHuman

interface ApproachHuman

Action to make the robot go towards a human and respond to various situations on the way.

Since: 1

Types

  • interface Async – Asynchronous version of ApproachHuman functions.

Functions

async

abstract fun async(): ApproachHuman.Async!

run

Start approaching the human.

abstract fun run(): Future<Void>!

Return: Future<Void>! – A future that completes when the action finishes or is canceled.

setPolicy

abstract fun setPolicy(policy: EngagementPolicy!): Unit

Set the engagement policy for approaching the human.
Parameters:

  • policy: EngagementPolicy! – The eye contact policy to apply.

ApproachHumanBuilder

open class ApproachHumanBuilder

Build a new ApproachHuman.

Functions

  • build – open fun build(): ApproachHuman!
  • buildAsync – open fun buildAsync(): Future<ApproachHuman>!
  • with – open static fun with(context: QiContext!): ApproachHumanBuilder!
  • withPolicy – open fun withPolicy(policy: EngagementPolicy!): ApproachHumanBuilder! – Configure the engagement policy for the builder.

AttachedFrame

interface AttachedFrame

Object representing a frame attached to a parent frame. The link between the parent and the attached frame (i.e., the relative location of the attached frame to its parent) is editable. In order to compute transforms between frames, one should use the frame() function of an AttachedFrame.

Since: 1

Types

  • interface Async – Asynchronous version of AttachedFrame functions.

Functions

async

abstract fun async(): AttachedFrame.Async!

frame

Get the Frame representing this AttachedFrame (for computing transforms, etc).

abstract fun frame(): Frame!

Return: Frame! – The Frame of this attached frame.

update

abstract fun update(transform: Transform!, frame: Frame!, timestamp: Long!): Unit

Update the location of this attached frame by giving its pose at a given time in a given reference frame.
Parameters:

  • transform: Transform! – The transform representing the new pose of this frame relative to frame.
  • frame: Frame! – The reference frame in which the transform is expressed.
  • timestamp: Long! – The timestamp of the given transform (in milliseconds since epoch).
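A sketch of repositioning an attached frame. The `TransformBuilder` helper is an assumption not documented above; `attachedFrame` and `parentFrame` are assumed to exist:

```kotlin
// Sketch: place the attached frame 0.5 m along X from the reference frame.
val transform: Transform = TransformBuilder.create().fromXTranslation(0.5)
attachedFrame.update(transform, parentFrame, 0L)  // 0L interpreted as "now" (assumption)
val frame: Frame = attachedFrame.frame()          // use this Frame to compute transforms
```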

AttentionState

class AttentionState : QiEnum

Enum containing the possible attention states of the human when interacting with the robot. States are defined based on where the human is looking (frame of reference is the human).

Since: 1

Enum Values

  • UNKNOWN
  • NOT_SELECTED
  • SELECTED

Functions

  • getQiValue – fun getQiValue(): Int

AutonomousAbilities

interface AutonomousAbilities

A service for selecting which autonomous abilities to pause or resume during an activity. The service guarantees the ability owner exclusive control of an ability as long as the corresponding holder is not released. Holding an ability automatically pauses it; to resume it, the owner must release the holder.

Since: 1

Types

  • interface Async – Asynchronous version of AutonomousAbilities functions.

Functions

async

abstract fun async(): AutonomousAbilities.Async!

getAbility

abstract fun getAbility(type: AutonomousAbilitiesType!): AutonomousAbilityHolder!

Obtain a holder for the specified autonomous ability, pausing that ability until released.
Parameters:

  • type: AutonomousAbilitiesType! – The type of autonomous ability to control.

Return: AutonomousAbilityHolder! – A holder that, when released, will resume the ability.
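A hold-and-release sketch; `BASIC_AWARENESS` is an assumed enum value, since the AutonomousAbilitiesType values are not documented above:

```kotlin
// Sketch: pause an autonomous ability for the duration of an activity.
val abilities: AutonomousAbilities = qiContext.autonomousAbilities
val holder: AutonomousAbilityHolder =
    abilities.getAbility(AutonomousAbilitiesType.BASIC_AWARENESS)  // ability is now paused
try {
    // ... run the activity without the ability ...
} finally {
    holder.release()  // resume the ability; the holder becomes invalid
}
```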

AutonomousabilitiesConverter

interface AutonomousabilitiesConverter

(No additional description provided.)


AutonomousAbilitiesType

enum class AutonomousAbilitiesType

(No documentation provided on site for this enum.)

(Possible values might represent different autonomous abilities such as BasicAwareness, BackgroundMovement, etc., but no details given in the reference.)


AutonomousAbilityHolder

interface AutonomousAbilityHolder

An AutonomousAbilityHolder represents an autonomous ability being taken from the AutonomousAbilities service. It serves only once, and emits released() whenever the autonomous ability is released. An AutonomousAbilityHolder that was released is invalid.

Since: 1

Types

  • interface Async – Asynchronous version of AutonomousAbilityHolder functions.

Functions

async

abstract fun async(): AutonomousAbilityHolder.Async!

release

Release the autonomous ability and invalidate this holder.

abstract fun release(): Future<Void>!

Return: Future<Void>! – A future that completes when the ability is released (and the holder becomes invalid).

setOnReleasedListener

abstract fun setOnReleasedListener(listener: AutonomousAbilityHolder.OnReleasedListener!): Unit

Set a listener for when the autonomous ability is released. (Deprecated; use addOnReleasedListener instead.)

Types (Nested)

  • interface OnReleasedListener – Listener for the released signal of this holder.

AutonomousReaction

interface AutonomousReaction

A reaction suggested by a Chatbot.


AutonomousReactionImportance

class AutonomousReactionImportance : QiEnum

Additional information on the importance of a suggested ChatbotReaction.

Since: 1

Enum Values

  • LOW
  • NORMAL
  • HIGH

Functions

  • getQiValue – fun getQiValue(): Int

AutonomousReactionValidity

class AutonomousReactionValidity : QiEnum

Describes the validity of a suggested ChatbotReaction.

Since: 1

Enum Values

  • UNKNOWN
  • VALID
  • INVALID

Functions

  • getQiValue – fun getQiValue(): Int

BaseChatbot

abstract class BaseChatbot

Parent class for Chatbot implementations.


BaseChatbotReaction

abstract class BaseChatbotReaction

Parent class for ChatbotReaction implementations.


BaseQiChatExecutor

abstract class BaseQiChatExecutor

Parent class for QiChatExecutor implementations.


BodyLanguageOption

class BodyLanguageOption : QiEnum

Body language policy.

Since: 1

Enum Values

  • AUTO
  • NEVER

Functions

  • getQiValue – fun getQiValue(): Int

Bookmark

open class Bookmark

Object representing a marked location in a topic.


BookmarkStatus

open class BookmarkStatus

Object representing the state of a Bookmark during a Discuss execution.


Camera

interface Camera

Service exposing actions and properties related to the robot camera.

Since: 1

Types

  • interface Async – Asynchronous version of Camera functions.

Functions

async

abstract fun async(): Camera.Async!

takePicture

abstract fun takePicture(): Future<EncodedImage>!

Take a picture with the robot’s camera.
Return: Future<EncodedImage>! – A future that will yield the captured image.

startRecording

abstract fun startRecording(): Future<Void>!

Start recording video.
Return: Future<Void>! – A future that completes when recording starts.

stopRecording

abstract fun stopRecording(): Future<EncodedImage>!

Stop recording video.
Return: Future<EncodedImage>! – A future that yields the last frame captured when recording stopped.

(Other camera functions and properties, if any, are omitted for brevity.)


CameraConverter

interface CameraConverter

(No additional description provided.)


Chat

interface Chat

Action that listens to the users and interrogates its Chatbots to select the most appropriate answers.

Since: 3

Types

  • interface Async
  • interface OnFallbackReplyFoundForListener – Listener for the fallbackReplyFoundFor signal.
  • interface OnHeardListener – Listener for the heard signal.
  • interface OnHearingChangedListener – Listener for changes in the hearing property.
  • interface OnListeningChangedListener – Listener for changes in the listening property.
  • interface OnNoPhraseRecognizedListener – Listener for the noPhraseRecognized signal.
  • interface OnNoReplyFoundForListener – Listener for the noReplyFoundFor signal.
  • interface OnNormalReplyFoundForListener – Listener for the normalReplyFoundFor signal.
  • interface OnSayingChangedListener – Listener for changes in the saying property.
  • interface OnStartedListener – Listener for the started signal.

Functions

  • addOnFallbackReplyFoundForListener – abstract fun addOnFallbackReplyFoundForListener(listener: OnFallbackReplyFoundForListener!): Unit – Add an OnFallbackReplyFoundForListener.
  • addOnHeardListener – abstract fun addOnHeardListener(listener: OnHeardListener!): Unit – Add an OnHeardListener.
  • addOnHearingChangedListener – abstract fun addOnHearingChangedListener(listener: OnHearingChangedListener!): Unit – Add a property changed listener for hearing.
  • addOnListeningChangedListener – abstract fun addOnListeningChangedListener(listener: OnListeningChangedListener!): Unit – Add a property changed listener for listening.
  • addOnNoPhraseRecognizedListener – abstract fun addOnNoPhraseRecognizedListener(listener: OnNoPhraseRecognizedListener!): Unit – Add an OnNoPhraseRecognizedListener.
  • addOnNoReplyFoundForListener – abstract fun addOnNoReplyFoundForListener(listener: OnNoReplyFoundForListener!): Unit – Add an OnNoReplyFoundForListener.
  • addOnNormalReplyFoundForListener – abstract fun addOnNormalReplyFoundForListener(listener: OnNormalReplyFoundForListener!): Unit – Add an OnNormalReplyFoundForListener.
  • addOnSayingChangedListener – abstract fun addOnSayingChangedListener(listener: OnSayingChangedListener!): Unit – Add a property changed listener for saying.
  • addOnStartedListener – abstract fun addOnStartedListener(listener: OnStartedListener!): Unit – Add an OnStartedListener.
  • async – abstract fun async(): Chat.Async!
  • getHearing – abstract fun getHearing(): Boolean! – Exposes the hearing property value.
  • getListening – abstract fun getListening(): Boolean! – Exposes the listening property value.
  • getListeningBodyLanguage – abstract fun getListeningBodyLanguage(): BodyLanguageOption! – Exposes the listeningBodyLanguage property value.
  • getSaying – abstract fun getSaying(): Phrase! – Exposes the saying property value.
  • removeAllOnFallbackReplyFoundForListeners – abstract fun removeAllOnFallbackReplyFoundForListeners(): Unit – Remove all OnFallbackReplyFoundForListener.
  • removeAllOnHeardListeners – abstract fun removeAllOnHeardListeners(): Unit – Remove all OnHeardListener.
  • removeAllOnHearingChangedListeners – abstract fun removeAllOnHearingChangedListeners(): Unit – Remove all hearing changed listeners.
  • removeAllOnListeningChangedListeners – abstract fun removeAllOnListeningChangedListeners(): Unit – Remove all listening changed listeners.
  • removeAllOnNoPhraseRecognizedListeners – abstract fun removeAllOnNoPhraseRecognizedListeners(): Unit – Remove all OnNoPhraseRecognizedListener.
  • removeAllOnNoReplyFoundForListeners – abstract fun removeAllOnNoReplyFoundForListeners(): Unit – Remove all OnNoReplyFoundForListener.
  • removeAllOnNormalReplyFoundForListeners – abstract fun removeAllOnNormalReplyFoundForListeners(): Unit – Remove all OnNormalReplyFoundForListener.
  • removeAllOnSayingChangedListeners – abstract fun removeAllOnSayingChangedListeners(): Unit – Remove all saying changed listeners.
  • removeAllOnStartedListeners – abstract fun removeAllOnStartedListeners(): Unit – Remove all OnStartedListener.
  • removeOnFallbackReplyFoundForListener – abstract fun removeOnFallbackReplyFoundForListener(listener: OnFallbackReplyFoundForListener!): Unit – Remove an OnFallbackReplyFoundForListener.
  • removeOnHeardListener – abstract fun removeOnHeardListener(listener: OnHeardListener!): Unit – Remove an OnHeardListener.
  • removeOnHearingChangedListener – abstract fun removeOnHearingChangedListener(listener: OnHearingChangedListener!): Unit – Remove a hearing changed listener.
  • removeOnListeningChangedListener – abstract fun removeOnListeningChangedListener(listener: OnListeningChangedListener!): Unit – Remove a listening changed listener.
  • removeOnNoPhraseRecognizedListener – abstract fun removeOnNoPhraseRecognizedListener(listener: OnNoPhraseRecognizedListener!): Unit – Remove an OnNoPhraseRecognizedListener.
  • removeOnNoReplyFoundForListener – abstract fun removeOnNoReplyFoundForListener(listener: OnNoReplyFoundForListener!): Unit – Remove an OnNoReplyFoundForListener.
  • removeOnNormalReplyFoundForListener – abstract fun removeOnNormalReplyFoundForListener(listener: OnNormalReplyFoundForListener!): Unit – Remove an OnNormalReplyFoundForListener.
  • removeOnSayingChangedListener – abstract fun removeOnSayingChangedListener(listener: OnSayingChangedListener!): Unit – Remove a saying changed listener.
  • removeOnStartedListener – abstract fun removeOnStartedListener(listener: OnStartedListener!): Unit – Remove an OnStartedListener.
  • run – abstract fun run(): Unit – The robot starts to listen, react and talk by picking reactions. If the robot's current language differs from the action language, the robot's language will be changed.
    Exceptions:
    • RuntimeException – This Chat action is already running.
  • setListeningBodyLanguage – abstract fun setListeningBodyLanguage(bodyLanguageOption: BodyLanguageOption!): Unit – Set the bodyLanguageOption property value.
  • setOnFallbackReplyFoundForListener – abstract fun setOnFallbackReplyFoundForListener(listener: OnFallbackReplyFoundForListener!): Unit (deprecated)
  • setOnHeardListener – abstract fun setOnHeardListener(listener: OnHeardListener!): Unit (deprecated)
  • setOnHearingChangedListener – abstract fun setOnHearingChangedListener(listener: OnHearingChangedListener!): Unit (deprecated)
  • setOnListeningChangedListener – abstract fun setOnListeningChangedListener(listener: OnListeningChangedListener!): Unit (deprecated)
  • setOnNoPhraseRecognizedListener – abstract fun setOnNoPhraseRecognizedListener(listener: OnNoPhraseRecognizedListener!): Unit (deprecated)
  • setOnNoReplyFoundForListener – abstract fun setOnNoReplyFoundForListener(listener: OnNoReplyFoundForListener!): Unit (deprecated)
  • setOnNormalReplyFoundForListener – abstract fun setOnNormalReplyFoundForListener(listener: OnNormalReplyFoundForListener!): Unit (deprecated)
  • setOnSayingChangedListener – abstract fun setOnSayingChangedListener(listener: OnSayingChangedListener!): Unit (deprecated)
  • setOnStartedListener – abstract fun setOnStartedListener(listener: OnStartedListener!): Unit (deprecated)
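The listener and run functions combine roughly as follows. This is a sketch: `qiChatbot` is assumed to have been built beforehand, and the lambdas rely on SAM conversion of the listener interfaces:

```kotlin
// Sketch: build a Chat, observe its lifecycle, and start it asynchronously.
val chat: Chat = ChatBuilder.with(qiContext)
    .withChatbot(qiChatbot)
    .build()
chat.addOnStartedListener { println("Chat started, robot is listening") }
chat.addOnHeardListener { phrase -> println("Heard: ${phrase.text}") }
val chatFuture: Future<Void> = chat.async().run()
// Later: chatFuture.requestCancellation() to stop the chat.
```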

Chatbot

interface Chatbot

Object representing a Chatbot that can react in response to Phrases.


ChatbotReaction

interface ChatbotReaction

Action produced by a Chatbot either as a reply to a Phrase, or spontaneously.


ChatbotReactionHandlingStatus

class ChatbotReactionHandlingStatus : QiEnum

Describes the current status of a ChatbotReaction (for example, in a Chat action).

Since: 1

Enum Values

  • NOT_HANDLED
  • HANDLED
  • FAILED

Functions

  • getQiValue – fun getQiValue(): Int

ChatBuilder

open class ChatBuilder

Build a new Chat.

Functions

  • build – open fun build(): Chat!
  • buildAsync – open fun buildAsync(): Future<Chat>!
  • with – open static fun with(context: QiContext!): ChatBuilder!
  • withChatbot – open fun withChatbot(chatbot: QiChatbot!): ChatBuilder! – Configure the Chatbot to use.
  • withChatOptions – open fun withChatOptions(options: ChatOptions!): ChatBuilder! – Configure optional parameters for the Chat action.

ChatOptions

open class ChatOptions

Optional parameters for the configuration of a Chat action.


ContextConverter

interface ContextConverter

(No additional description provided.)


Conversation

interface Conversation

Service exposing actions and properties related to human-robot conversation.

Since: 1

Types

  • interface Async – Asynchronous version of Conversation functions.

Functions

async

abstract fun async(): Conversation.Async!

status

Get the ConversationStatus for the current context.

abstract fun status(): ConversationStatus!

Return: ConversationStatus! – The conversation-related signals and properties for this application context.


ConversationConverter

interface ConversationConverter

(No additional description provided.)


ConversationStatus

open class ConversationStatus

An object collecting Conversation-related signals and properties for a given application context.


DateString

open class DateString

Struct representing a date as a string.


DateTimeString

open class DateTimeString

Struct representing a dateTime as a string.


DegreeOfFreedom

class DegreeOfFreedom : QiEnum

A degree of freedom.

Since: 1

Enum Values

(No explicit values documented, likely something like X, Y, Theta depending on context.)

(No further details provided.)


Discuss

interface Discuss

Action to make the robot able to converse with a human using content from QiChat topics.

Since: 1

Types

  • interface Async – Asynchronous version of Discuss functions.

Functions

async

abstract fun async(): Discuss.Async!

run

Start the discussion using the provided topics.

abstract fun run(): Future<Void>!

Return: Future<Void>! – A future that completes when the discussion ends or is canceled.

(Other functions for managing topics or bookmarks may exist.)


DiscussBuilder

open class DiscussBuilder

Build a new Discuss.

Functions

  • build – open fun build(): Discuss!
  • buildAsync – open fun buildAsync(): Future<Discuss>!
  • with – open static fun with(context: QiContext!): DiscussBuilder!
  • withTopic – open fun withTopic(topic: Topic!): DiscussBuilder! – Add a topic to the discussion.
  • withTopics – open fun withTopics(topics: List<Topic!>!): DiscussBuilder! – Add multiple topics.
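A minimal sketch, assuming a `topic` was already built from a QiChat resource:

```kotlin
// Sketch: build a Discuss for one topic and run it.
val discuss: Discuss = DiscussBuilder.with(qiContext)
    .withTopic(topic)
    .build()
val discussFuture: Future<Void> = discuss.run()  // completes when the discussion ends
```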

EditableKnowledgeGraph

interface EditableKnowledgeGraph

Object allowing to edit a named graph.


EditablePhraseSet

interface EditablePhraseSet

A container of Phrases that can be edited.


Emotion

interface Emotion

Object containing the emotional state properties. It is a three-dimensional representation of the emotional state, based on the PAD model of Albert Mehrabian. See: Mehrabian, Albert (1980). Basic dimensions for a general psychological theory.

Since: 1

Types

  • interface Async
  • interface OnExcitementChangedListener – Listener for excitement property changes.
  • interface OnPleasureChangedListener – Listener for pleasure property changes.

Functions

  • addOnExcitementChangedListener – abstract fun addOnExcitementChangedListener(listener: OnExcitementChangedListener!): Unit
  • addOnPleasureChangedListener – abstract fun addOnPleasureChangedListener(listener: OnPleasureChangedListener!): Unit
  • async – abstract fun async(): Emotion.Async!
  • getExcitement – abstract fun getExcitement(): ExcitementState! – Exposes the excitement value.
  • getPleasure – abstract fun getPleasure(): PleasureState! – Exposes the pleasure value.
  • removeAllOnExcitementChangedListeners – abstract fun removeAllOnExcitementChangedListeners(): Unit
  • removeAllOnPleasureChangedListeners – abstract fun removeAllOnPleasureChangedListeners(): Unit
  • removeOnExcitementChangedListener – abstract fun removeOnExcitementChangedListener(listener: OnExcitementChangedListener!): Unit
  • removeOnPleasureChangedListener – abstract fun removeOnPleasureChangedListener(listener: OnPleasureChangedListener!): Unit
  • setOnExcitementChangedListener – abstract fun setOnExcitementChangedListener(listener: OnExcitementChangedListener!): Unit (deprecated)
  • setOnPleasureChangedListener – abstract fun setOnPleasureChangedListener(listener: OnPleasureChangedListener!): Unit (deprecated)

EncodedImage

interface EncodedImage

Encoded image.

(Likely represents an image in a particular format (JPEG, etc.) but no further detail provided.)


EncodedImageHandle

interface EncodedImageHandle

Encapsulates an EncodedImage. This object enables sharing EncodedImage data while delaying the copy to the most appropriate time.


EnforceTabletReachability

interface EnforceTabletReachability

Action to limit robot movements in order to ease user interaction with the tablet. The robot will put the tablet at a suitable position then emit the positionReached() signal, and prevent further leg and base movements, while also ensuring that the arm movements do not bring them in front of the tablet.

Since: 1

Types

  • interface Async – Asynchronous version of EnforceTabletReachability functions.

Functions

async

abstract fun async(): EnforceTabletReachability.Async!

run

Begin the action to enforce tablet reachability.

abstract fun run(): Future<Void>!

Return: Future<Void>! – A future that completes when the robot has adjusted its position or the action is canceled.

setOnPositionReachedListener

abstract fun setOnPositionReachedListener(listener: OnPositionReachedListener!): Unit

(Deprecated) Set a listener for when the tablet position has been reached.

Types (Nested)

  • interface OnPositionReachedListener – Listener for the positionReached signal.

EnforceTabletReachabilityBuilder

open class EnforceTabletReachabilityBuilder

Build a new EnforceTabletReachability.

Functions

  • build – open fun build(): EnforceTabletReachability!
  • buildAsync – open fun buildAsync(): Future<EnforceTabletReachability>!
  • with – open static fun with(context: QiContext!): EnforceTabletReachabilityBuilder!

EngageHuman

interface EngageHuman

Action to make the robot look at a human and keep eye contact.

Since: 1

Types

  • interface Async – Asynchronous version of EngageHuman functions.

Functions

async

abstract fun async(): EngageHuman.Async!

run

Begin engaging with a human (looking at them and maintaining eye contact).

abstract fun run(): Future<Void>!

Return: Future<Void>! – Completes when engagement ends or is canceled.

setPolicy

abstract fun setPolicy(policy: EngagementPolicy!): Unit

Set the engagement (eye contact) policy.
Parameters:

  • policy: EngagementPolicy! – The eye contact policy to apply.

EngageHumanBuilder

open class EngageHumanBuilder

Build a new EngageHuman.

Functions

  • build – open fun build(): EngageHuman!
  • buildAsync – open fun buildAsync(): Future<EngageHuman>!
  • with – open static fun with(context: QiContext!): EngageHumanBuilder!
  • withPolicy – open fun withPolicy(policy: EngagementPolicy!): EngageHumanBuilder!
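A sketch of configuring the policy; note that the builder functions listed above do not cover selecting the target human, which a real application would also need:

```kotlin
// Sketch: build an EngageHuman with a strong eye-contact policy.
val engageHuman: EngageHuman = EngageHumanBuilder.with(qiContext)
    .withPolicy(EngagementPolicy.STRONG)
    .build()
val engagement: Future<Void> = engageHuman.async().run()
// engagement.requestCancellation() to stop engaging.
```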

EngagementIntentionState

class EngagementIntentionState : QiEnum

Enum containing the engagement intention of the human toward the robot, as perceived.

Since: 1

Enum Values

  • UNKNOWN
  • DISENGAGED
  • ENGAGED

Functions

  • getQiValue – fun getQiValue(): Int

EngagementPolicy

class EngagementPolicy : QiEnum

Eye contact policy.

Since: 1

Enum Values

  • NONE
  • SOFT
  • STRONG

Functions

  • getQiValue – fun getQiValue(): Int

EnumConverter

interface EnumConverter

Convert a QiEnum to and from a raw AnyObject.


ExcitementState

class ExcitementState : QiEnum

Enum containing the perceived energy in the emotion.

Since: 1

Enum Values

  • LOW
  • NORMAL
  • HIGH

Functions

  • getQiValue – fun getQiValue(): Int

ExplorationMap

interface ExplorationMap

Object encapsulating the data needed by the robot to localize itself inside its environment.


ExplorationMapBuilder

open class ExplorationMapBuilder

Build a new ExplorationMap.

Functions

  • build – open fun build(): ExplorationMap!
  • buildAsync – open fun buildAsync(): Future<ExplorationMap>!
  • with – open static fun with(context: QiContext!): ExplorationMapBuilder!

FacialExpressions

open class FacialExpressions

Structure containing expression data computed from a human's face.


FlapSensor

interface FlapSensor

Object representing a flap (e.g., a robot's head flap) that may be open or closed.


FlapState

interface FlapState

Description of a flap sensor state.

(Likely provides properties to check if the flap is open or closed, etc.)


Focus

interface Focus

A service tracking the current focus and guaranteeing that only one client holds it at a time. The focus is required for actions to be performed on the robot. This mechanism guarantees the focus owner exclusive control of the robot as long as its FocusOwner is not released.

Since: 1

Types

  • interface Async – Asynchronous version of Focus functions.

Functions

async

abstract fun async(): Focus.Async!

request

abstract fun request(): FocusOwner!

Request the focus for the current application/activity.
Return: FocusOwner! – A FocusOwner object representing the granted focus.
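A request/release sketch; `qiContext.focus` as the way to reach the service is an assumption:

```kotlin
// Sketch: take the focus, watch for loss, and release it when done.
val focus: Focus = qiContext.focus
val owner: FocusOwner = focus.request()
owner.addOnReleasedListener { println("Focus was released or lost") }
// ... perform actions that require the focus ...
owner.release()  // the FocusOwner is now invalid
```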


FocusConverter

interface FocusConverter

(No additional description provided.)


FocusOwner

interface FocusOwner

A FocusOwner represents a focus being taken from the focus service. It serves only once, and emits released() whenever the focus is released. A FocusOwner that was released is invalid.

Since: 1

Types

  • interface Async – Asynchronous version of FocusOwner functions.
  • interface OnReleasedListener – Listener for the released signal.

Functions

  • addOnReleasedListenerabstract fun addOnReleasedListener(listener: OnReleasedListener!): Unit
  • asyncabstract fun async(): FocusOwner.Async!
  • releaseabstract fun release(): Unit – Release the focus and invalidate this FocusOwner.
  • removeAllOnReleasedListenersabstract fun removeAllOnReleasedListeners(): Unit
  • removeOnReleasedListenerabstract fun removeOnReleasedListener(listener: OnReleasedListener!): Unit
  • setOnReleasedListenerabstract fun ~setOnReleasedListener~(listener: OnReleasedListener!): Unit (deprecated)
  • tokenabstract fun token(): String! – The token carried by this FocusOwner (unique identifier for the focus session).
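
As a sketch of how Focus and FocusOwner fit together (QiSDK.register normally manages the focus for you, so direct use is rare; `qiContext` is assumed to come from `RobotLifecycleCallbacks.onRobotFocusGained`):

```kotlin
// Hedged sketch: requesting the focus by hand and reacting to its loss.
// Assumes a valid qiContext obtained from RobotLifecycleCallbacks.
val focus: Focus = qiContext.focus
val owner: FocusOwner = focus.request()

owner.addOnReleasedListener {
    // The focus was taken away: stop controlling the robot here.
}

// ...run actions while holding the focus...

owner.release() // Give the focus back; this FocusOwner is now invalid.
```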

Frame

interface Frame

Object representing the location associated with an object or a person. This location is likely to change over time (for example, when a person moves). Transforms can be computed between two frames, at a given time, to get the relative position and orientation between two objects. If the robot is localized using external sensors, the transform between two frames can be computed with odometry drift compensation.

Since: 1

Types

  • interface Async(Asynchronous version of Frame functions.)

Functions

async

abstract fun async(): Frame.Async!

computeTransform

Compute the transform between this frame and another frame at a given time.

abstract fun computeTransform(destination: Frame!, time: Long!): TransformTime!

Parameters:

  • destinationFrame!: The frame to which to compute the transform.
  • timeLong!: The timestamp at which to compute the transform.
    Return: TransformTime! – The transform from this frame to the destination frame at the given time.
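
For illustration, a sketch computing the robot's pose relative to the map origin. The Android API also exposes a single-argument overload `computeTransform(base: Frame)` evaluated at the current time; this sketch assumes it, along with `Actuation.robotFrame()` and `Mapping.mapFrame()` documented in this collection:

```kotlin
// Hedged sketch: where is the robot relative to the map origin?
val robotFrame: Frame = qiContext.actuation.robotFrame()
val mapFrame: Frame = qiContext.mapping.mapFrame()

// Transform of the robot frame expressed in the map frame, with its timestamp.
val transformTime: TransformTime = robotFrame.computeTransform(mapFrame)
val translation = transformTime.transform.translation
Log.i("FrameSample", "Robot at x=${translation.x}, y=${translation.y} in the map")
```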

FreeFrame

interface FreeFrame

Object representing a reference frame free to be placed anywhere, that does not move when other frames move. The global position of a free frame can be updated by giving its location at a given time in a given reference frame. In order to compute transforms between frames, one should use the frame() function of a FreeFrame. The FreeFrame will be invalid right after creation until first update.

Since: 1

Types

  • interface Async(Asynchronous version of FreeFrame functions.)

Functions

async

abstract fun async(): FreeFrame.Async!

frame

Get the Frame associated with this FreeFrame.

abstract fun frame(): Frame!

update

Update this free frame’s global location at a given time in a reference frame.

abstract fun update(transform: Transform!, frame: Frame!, timestamp: Long!): Unit

Parameters:

  • transformTransform!: The transform describing the free frame’s location relative to the reference frame.
  • frameFrame!: The reference frame in which the location is expressed.
  • timestampLong!: The time at which the given location is valid.


FutureLogger

interface FutureLogger

A Consumer used to automatically log future errors or cancellations.


FutureUtils

object FutureUtils

Utility methods for working with futures.


Gender

class Gender : QiEnum

Enum containing different genders of a human.

Since: 1

Enum Values

  • UNKNOWN
  • MALE
  • FEMALE

Functions

  • getQiValuefun getQiValue(): Int

GeometryConverter

interface GeometryConverter

(No additional description provided.)


GoTo

interface GoTo

Action to make the robot go somewhere. The destination is represented by a target frame. The robot will try to reach safely the 2D location corresponding to the target frame while avoiding obstacles. The robot may look around and follow non-straight paths to choose the safest way towards the target.

Since: 1

Types

  • interface Async
  • interface OnStartedListener – Listener for the started signal.

Functions

  • addOnStartedListenerabstract fun addOnStartedListener(listener: OnStartedListener!): Unit
  • asyncabstract fun async(): GoTo.Async!
  • removeAllOnStartedListenersabstract fun removeAllOnStartedListeners(): Unit
  • removeOnStartedListenerabstract fun removeOnStartedListener(listener: OnStartedListener!): Unit
  • runabstract fun run(): Unit – Run the GoTo on the robot. The started() signal is emitted when the action starts.
  • setOnStartedListenerabstract fun ~setOnStartedListener~(listener: OnStartedListener!): Unit (deprecated)

GoToBuilder

open class GoToBuilder

Build a new GoTo.

Functions

  • buildopen fun build(): GoTo!
  • buildAsyncopen fun buildAsync(): Future<GoTo>!
  • withopen static fun with(context: QiContext!): GoToBuilder!
  • withFrameopen fun withFrame(frame: Frame!): GoToBuilder! – Set the target frame for the GoTo action.
  • withConfigurationopen fun withConfiguration(config: GoToConfig!): GoToBuilder! – Set the configuration for the GoTo action.
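
Following the pattern of the official tutorials, a sketch that sends the robot one meter ahead. `TransformBuilder` and `Mapping.makeFreeFrame()` come from other parts of the SDK and are assumptions here:

```kotlin
// Hedged sketch: make the robot advance 1 m from its current pose.
val robotFrame: Frame = qiContext.actuation.robotFrame()
val transform: Transform = TransformBuilder.create().fromXTranslation(1.0)

// Pin the target in space with a FreeFrame so it does not move with the robot.
val targetFrame: FreeFrame = qiContext.mapping.makeFreeFrame()
targetFrame.update(transform, robotFrame, 0L)

val goTo: GoTo = GoToBuilder.with(qiContext)
    .withFrame(targetFrame.frame())
    .build()

goTo.addOnStartedListener { Log.i("GoToSample", "GoTo started") }
goTo.async().run().thenConsume { future ->
    if (future.isSuccess) Log.i("GoToSample", "Destination reached")
}
```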

GoToConfig

open class GoToConfig

Configuration parameters of a GoTo action. If a parameter is not set by the user, it will default to the action’s standard behavior.

(GoToConfig likely contains properties like path planning policy, orientation policy, max speed, etc.)


Holder

interface Holder

(No description is provided on the site. In practice, a Holder is used to hold robot resources such as autonomous abilities, which stay held until the Holder is released.)


HolderBuilder

open class HolderBuilder

Build a new Holder.


Human

interface Human

Object representing a physical person detected by the robot.

Since: 1

Types

  • interface Async

Functions

(The site summary does not list Human’s functions. The full interface exposes getters for perceived characteristics, likely including the head frame, estimated age and gender, emotion, attention and engagement information, together with the matching property-changed listeners.)


HumanAwareness

interface HumanAwareness

Service exposing actions and properties related to human-robot interaction.

Since: 1

Types

  • interface Async

Functions

(The site summary does not detail HumanAwareness’s functions. The full service likely exposes the humans currently around the robot, a recommended human to approach or engage, and a factory for engagement actions.)


HumanawarenessConverter

interface HumanawarenessConverter

(No additional description provided.)


HumanConverter

interface HumanConverter

(No additional description provided.)


ImageConverter

interface ImageConverter

(No additional description provided.)


IOUtils

object IOUtils

Utility methods for working with raw files and assets.


Knowledge

interface Knowledge

Service handling the shared knowledge.

(No additional information is provided; the service likely gives access to the robot’s knowledge graph or triple store.)


KnowledgeBase

interface KnowledgeBase

Object allowing reading data from given named graphs.


KnowledgeConverter

interface KnowledgeConverter

(No additional description provided.)


KnowledgeSubscriber

interface KnowledgeSubscriber

(No description provided; possibly for subscribing to knowledge graph updates.)


Language

class Language : QiEnum

All the possible languages.

Since: 1

Enum Values

(The summary does not list the values; the enum contains one value per supported language, e.g. ENGLISH, FRENCH, JAPANESE.)


LanguageUtil

object LanguageUtil

Language utility class.


Listen

interface Listen

Action to make the robot listen to and recognize a specific set of phrases pronounced by a user. On recognition success, a ListenResult gives the heard phrase and the matching PhraseSet.

Since: 1

Types

  • interface Async

Functions

  • asyncabstract fun async(): Listen.Async!
  • runabstract fun run(): ListenResult! – Run the listening action; returns a result containing the heard phrase and matching phrase set (blocking call).
  • runAsyncabstract fun runAsync(): Future<ListenResult>! – Run the listening action asynchronously.
  • setPhraseSetsabstract fun setPhraseSets(phraseSets: List<PhraseSet!>!): Unit – Specify the set of PhraseSets that the robot should listen for.

(The full interface may also expose listeners for related events, such as the action starting or speech being heard.)


ListenBuilder

open class ListenBuilder

Build a new Listen.

Functions

  • buildopen fun build(): Listen!
  • buildAsyncopen fun buildAsync(): Future<Listen>!
  • withopen static fun with(context: QiContext!): ListenBuilder!
  • withPhraseSetsopen fun withPhraseSets(phraseSets: List<PhraseSet!>!): ListenBuilder!
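
Combining PhraseSetBuilder and ListenBuilder, a sketch of a yes/no question; `PhraseSetUtil.equals`, documented later in this collection, is used for the comparison:

```kotlin
// Hedged sketch: listen for "yes" or "no" synonyms (blocking, so run it
// off the UI thread, e.g. from onRobotFocusGained).
val yesPhrases: PhraseSet = PhraseSetBuilder.with(qiContext)
    .withTexts(listOf("yes", "yep", "indeed"))
    .build()
val noPhrases: PhraseSet = PhraseSetBuilder.with(qiContext)
    .withTexts(listOf("no", "nope"))
    .build()

val listen: Listen = ListenBuilder.with(qiContext)
    .withPhraseSets(listOf(yesPhrases, noPhrases))
    .build()

val result: ListenResult = listen.run()
if (PhraseSetUtil.equals(result.matchedPhraseSet, yesPhrases)) {
    Log.i("ListenSample", "Heard yes: ${result.heardPhrase.text}")
}
```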

ListenOptions

open class ListenOptions

Optional parameters for the configuration of a Listen action.


ListenResult

open class ListenResult

The heard phrase along with the matching phrase set.

Properties

  • heardPhrase: Phrase – The phrase that was recognized.
  • matchedPhraseSet: PhraseSet – The PhraseSet in which the phrase was found.

Locale

class Locale

A locale.

(Likely identifies language and region; no further description in summary.)


LocaleConverter

interface LocaleConverter

(No additional description provided.)


LocalizationStatus

class LocalizationStatus : QiEnum

Localization process status.

Since: 1

Enum Values

  • SCANNING
  • LOCALIZED
  • LOST

Functions

  • getQiValuefun getQiValue(): Int

Localize

interface Localize

Action to make the robot localize itself in a map previously built by a LocalizeAndMap action. Only one LocalizeAndMap or Localize action can run at a time. When run, the robot first executes a rotating base and head scan. During this initial scan, the action needs all the robot’s resources to run, and the action status is Scanning. After this scan, the robot is localized inside the map, and its position and orientation relative to the map origin can be retrieved at any time by calling Actuation.robotFrame().computeTransform(Mapping.mapFrame()). While the action is running, the robot may autonomously look around to confirm its location.

Since: 3

Types

  • interface Async
  • interface OnStartedListener – Listener for the started signal.
  • interface OnStatusChangedListener – Listener for the status property changed event.

Functions

  • addOnStartedListenerabstract fun addOnStartedListener(listener: OnStartedListener!): Unit
  • addOnStatusChangedListenerabstract fun addOnStatusChangedListener(listener: OnStatusChangedListener!): Unit – Add a listener for status changes.
  • asyncabstract fun async(): Localize.Async!
  • getStatusabstract fun getStatus(): LocalizationStatus! – Exposes the current localization status.
  • removeAllOnStartedListenersabstract fun removeAllOnStartedListeners(): Unit
  • removeAllOnStatusChangedListenersabstract fun removeAllOnStatusChangedListeners(): Unit
  • removeOnStartedListenerabstract fun removeOnStartedListener(listener: OnStartedListener!): Unit
  • removeOnStatusChangedListenerabstract fun removeOnStatusChangedListener(listener: OnStatusChangedListener!): Unit
  • runabstract fun run(): Unit – Start the localization process. It will run until the future returned by run() is canceled.
  • runWithLocalizationHintabstract fun runWithLocalizationHint(hint: Transform!): Unit – Start the localization process with a hint about the robot’s current location relative to the map origin. The process will run until canceled.
  • setOnStartedListenerabstract fun ~setOnStartedListener~(listener: OnStartedListener!): Unit (deprecated)
  • setOnStatusChangedListenerabstract fun ~setOnStatusChangedListener~(listener: OnStatusChangedListener!): Unit (deprecated)

LocalizeAndMap

interface LocalizeAndMap

Action to make the robot explore an unknown environment while localizing itself and building a representation of this environment (known as an exploration map). Only one action among LocalizeAndMap or Localize can run at a time. When run, the robot first executes a rotating base and head scan. During this initial scan, the action needs all the robot’s resources to run and the action status is Scanning. After the scan, it is the developer’s responsibility to make the robot move and to stop the mapping when done. The developer thus has full control over the mapped area. While the action is running, the robot may autonomously look around to confirm its location. A given environment needs to be mapped once and only once. The result of this mapping can be dumped to an ExplorationMap object. Afterwards, the ExplorationMap object can be used to create a Localize action that will enable the robot to keep track of its position relative to the map.

Since: 3

Types

  • interface Async
  • interface OnStartedListener

Functions

  • addOnStartedListenerabstract fun addOnStartedListener(listener: OnStartedListener!): Unit
  • asyncabstract fun async(): LocalizeAndMap.Async!
  • removeAllOnStartedListenersabstract fun removeAllOnStartedListeners(): Unit
  • removeOnStartedListenerabstract fun removeOnStartedListener(listener: OnStartedListener!): Unit
  • runabstract fun run(): Unit – Start the mapping process (scanning and exploring).
  • setOnStartedListenerabstract fun ~setOnStartedListener~(listener: OnStartedListener!): Unit (deprecated)

LocalizeAndMapBuilder

open class LocalizeAndMapBuilder

Build a new LocalizeAndMap.

Functions

  • buildopen fun build(): LocalizeAndMap!
  • buildAsyncopen fun buildAsync(): Future<LocalizeAndMap>!
  • withopen static fun with(context: QiContext!): LocalizeAndMapBuilder!

LocalizeBuilder

open class LocalizeBuilder

Build a new Localize.

Functions

  • buildopen fun build(): Localize!
  • buildAsyncopen fun buildAsync(): Future<Localize>!
  • withopen static fun with(context: QiContext!): LocalizeBuilder!
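
Putting the two actions together, a sketch that maps an environment once and then localizes in the saved map. `dumpMap()` and `LocalizeBuilder.withMap` are part of the full API but are not listed in the summaries above; treat them as assumptions:

```kotlin
// Hedged sketch: map the environment once, then localize in that map.
val localizeAndMap: LocalizeAndMap = LocalizeAndMapBuilder.with(qiContext).build()
val mapping: Future<Void> = localizeAndMap.async().run()

// ...drive the robot around the area to map, then stop the mapping:
mapping.requestCancellation()
val map: ExplorationMap = localizeAndMap.dumpMap() // assumption: full API

val localize: Localize = LocalizeBuilder.with(qiContext)
    .withMap(map) // assumption: part of the full builder API
    .build()
localize.addOnStatusChangedListener { status ->
    if (status == LocalizationStatus.LOCALIZED) {
        Log.i("LocalizeSample", "Robot localized in the map")
    }
}
localize.async().run() // runs until the returned future is canceled
```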

LocalizedString

open class LocalizedString

Struct representing a string with a language (a locale-specific string).


LookAt

interface LookAt

Action to look at and track a target. The target is represented by a frame, and the robot will look at the origin of that frame. In practice, the action will make the robot move so that the x-axis of the gaze frame aligns with the origin of the target frame. The robot will track the target until the action is canceled.

Since: 1

Types

  • interface Async

Functions

async

abstract fun async(): LookAt.Async!

run

Start looking at the target frame.

abstract fun run(): Unit

(No return, runs until canceled.)

setPolicy

abstract fun setPolicy(policy: LookAtMovementPolicy!): Unit

Set the strategy for the LookAt movement.
Parameters:

  • policyLookAtMovementPolicy!: The movement policy to use (e.g., head only, whole body, etc.).

LookAtBuilder

open class LookAtBuilder

Build a new LookAt.

Functions

  • buildopen fun build(): LookAt!
  • buildAsyncopen fun buildAsync(): Future<LookAt>!
  • withopen static fun with(context: QiContext!): LookAtBuilder!
  • withFrameopen fun withFrame(frame: Frame!): LookAtBuilder! – Set the target frame to look at.
  • withPolicyopen fun withPolicy(policy: LookAtMovementPolicy!): LookAtBuilder! – Set the movement policy.
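
A sketch of the builder in use; `targetFrame` stands for any Frame, e.g. a detected human's head frame:

```kotlin
// Hedged sketch: look at a target frame using head movement only.
val lookAt: LookAt = LookAtBuilder.with(qiContext)
    .withFrame(targetFrame)
    .withPolicy(LookAtMovementPolicy.HEAD_ONLY)
    .build()

val running: Future<Void> = lookAt.async().run()
// LookAt tracks the target until canceled:
running.requestCancellation()
```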

LookAtMovementPolicy

class LookAtMovementPolicy : QiEnum

Strategies to look at a target.

Since: 1

Enum Values

  • HEAD_ONLY
  • WHOLE_BODY

Functions

  • getQiValuefun getQiValue(): Int

Mapping

interface Mapping

A service providing the mapping of the local area.

Since: 1

Types

  • interface Async

Functions

async

abstract fun async(): Mapping.Async!

mapFrame

Get the Frame representing the origin of the map.

abstract fun mapFrame(): Frame!

Return: Frame! – The map frame (origin of the built map).


MapTopGraphicalRepresentation

open class MapTopGraphicalRepresentation

Used to display the map shape on a UI. Scale, x, y and theta can be used to switch from world coordinates (relative to mapFrame) to image coordinates in pixels using the following formulas:

x_map = scale * (cos(theta) * x_img + sin(theta) * y_img) + x
y_map = scale * (sin(theta) * x_img - cos(theta) * y_img) + y

x_img = (1/scale) * (cos(theta) * (x_map - x) + sin(theta) * (y_map - y))
y_img = (1/scale) * (sin(theta) * (x_map - x) - cos(theta) * (y_map - y))

Here (x_map, y_map) are in meters in the map frame and (x_img, y_img) are in pixels in the image. (The original page also shows an axes diagram relating the two coordinate systems; it is not reproduced here.)

Node

open class Node

Struct representing a node in the knowledge graph. A Node holds an object which can be a ResourceNode or a LiteralNode. Every node other than ResourceNode is considered a literal. Literal nodes can handle the following types:

  • str
  • LocalizedString
  • float
  • int
  • TimeString
  • DateTimeString
  • DateString

OrientationPolicy

class OrientationPolicy : QiEnum

Policy defining orientation control of a given frame with respect to a target frame.

Since: 1

Enum Values

  • FREE_ORIENTATION
  • MATCH_ORIENTATION

Functions

  • getQiValuefun getQiValue(): Int

PathPlanningPolicy

class PathPlanningPolicy : QiEnum

Path planning strategies to go to a target.

Since: 1

Enum Values

  • GET_AROUND_OBSTACLES
  • STRAIGHT_LINES_ONLY

(The summary does not list the values explicitly; these are the values exposed by the SDK, with GET_AROUND_OBSTACLES as the default obstacle-avoiding strategy.)

Functions

  • getQiValuefun getQiValue(): Int

Phrase

open class Phrase

Structure containing a chunk of text intended to be said or listened to by the robot.


PhraseSet

open class PhraseSet

Object representing a set of phrases, used to group phrases considered as synonyms. Example: "yes", "yep", "indeed", ...


PhraseSetBuilder

open class PhraseSetBuilder

Build a new PhraseSet.

Functions

  • buildopen fun build(): PhraseSet!
  • buildAsyncopen fun buildAsync(): Future<PhraseSet>!
  • withopen static fun with(context: QiContext!): PhraseSetBuilder!
  • withTextsopen fun withTexts(texts: List<String!>!): PhraseSetBuilder! – Set the list of phrases.

PhraseSetUtil

object PhraseSetUtil

PhraseSet utility class.


PleasureState

class PleasureState : QiEnum

Enum containing the possible states reached on the pleasure-displeasure scale.

Since: 1

Enum Values

  • UNKNOWN
  • NEGATIVE
  • NEUTRAL
  • POSITIVE

Functions

  • getQiValuefun getQiValue(): Int

Power

interface Power

The Power service aggregates information related to robot power management.

Since: 1

Types

  • interface Async

Functions

(Likely includes methods to check battery status, power source, etc., not detailed here.)


PowerConverter

interface PowerConverter

(No additional description provided.)


Qi

interface Qi

(No description provided; possibly a core interface? No content shown on site.)


QiChatbot

interface QiChatbot

Chatbot that can be used to make the robot able to chat with a human. The interaction will be based on the content given in the QiChatbot topics.

Since: 1

Types

  • interface Async
  • interface OnBookmarkReachedListener – Listener for when a bookmark is reached.
  • interface OnEndedListener – Listener for when the chatbot has finished an utterance or conversation.
  • interface OnFallbackReplyFoundListener – Listener for when a fallback reply is found.
  • interface OnNoReplyFoundListener – Listener for when no reply is found.
  • interface OnNormalReplyFoundListener – Listener for when a normal reply is found.
  • interface OnPausedListener – Listener for when the chatbot is paused.
  • interface OnResumedListener – Listener for when the chatbot is resumed.

Functions

(The site summary does not detail QiChatbot’s functions. The full interface includes methods to navigate bookmarks (e.g. goToBookmark), access QiChat variables, set QiChatExecutors, and add/remove a listener for each of the signal types listed above.)


QiChatbotBuilder

open class QiChatbotBuilder

Build a new QiChatbot.

Functions

  • buildopen fun build(): QiChatbot!
  • buildAsyncopen fun buildAsync(): Future<QiChatbot>!
  • withopen static fun with(context: QiContext!): QiChatbotBuilder!
  • withTopicopen fun withTopic(topic: Topic!): QiChatbotBuilder!
  • withTopicsopen fun withTopics(topics: List<Topic!>!): QiChatbotBuilder!
  • withLocaleopen fun withLocale(locale: Locale!): QiChatbotBuilder! – Set the locale used by the chatbot.

QiChatExecutor

interface QiChatExecutor

Object representing a user-defined action to execute synchronously during an utterance in a QiChatbot.

(Usually implemented by developers to perform custom actions when invoked from a QiChat topic.)


QiChatVariable

interface QiChatVariable

Object representing a variable in a QiChat topic.

(Provides methods to read/write the variable’s value.)


QiContext

open class QiContext : ContextWrapper, Callback

(The QiContext class is the main entry point to QiSDK services within an Android context.)

Functions

  • convertopen fun <T : Any!> convert(o: Any!, type: Type!): TConvert an object of one type to another if possible (for internal use, rarely used by developers).
  • getActuationopen fun getActuation(): Actuation! – Return the robot “Actuation” service.
  • getActuationAsyncopen fun getActuationAsync(): Future<Actuation>!
  • getAutonomousAbilitiesopen fun getAutonomousAbilities(): AutonomousAbilities! – Return the robot “AutonomousAbilities” service.
  • getAutonomousAbilitiesAsyncopen fun getAutonomousAbilitiesAsync(): Future<AutonomousAbilities>!
  • getCameraopen fun getCamera(): Camera! – Return the robot “Camera” service.
  • getCameraAsyncopen fun getCameraAsync(): Future<Camera>!
  • getContextFactoryopen fun getContextFactory(): RobotContextFactory! – Return the robot “ContextFactory” service.
  • getContextFactoryAsyncopen fun getContextFactoryAsync(): Future<RobotContextFactory>!
  • getConversationopen fun getConversation(): Conversation! – Return the robot “Conversation” service.
  • getConversationAsyncopen fun getConversationAsync(): Future<Conversation>!
  • getFocusopen fun getFocus(): Focus! – Return the robot “Focus” service.
  • getFocusAsyncopen fun getFocusAsync(): Future<Focus>!
  • getHumanAwarenessopen fun getHumanAwareness(): HumanAwareness! – Return the robot “HumanAwareness” service.
  • getHumanAwarenessAsyncopen fun getHumanAwarenessAsync(): Future<HumanAwareness>!
  • getMappingopen fun getMapping(): Mapping! – Return the robot “Mapping” service.
  • getMappingAsyncopen fun getMappingAsync(): Future<Mapping>!
  • getPoweropen fun getPower(): Power! – Return the robot “Power” service.
  • getPowerAsyncopen fun getPowerAsync(): Future<Power>!
  • getTouchopen fun getTouch(): Touch! – Return the robot “Touch” service.
  • getTouchAsyncopen fun getTouchAsync(): Future<Touch>!
  • (Other service getters like getAudio, getChat, etc., if any, would follow the same pattern.)

(Note: QiContext inherits Android’s ContextWrapper and likely implements some Callback interface for asynchronous operations, but those details are outside QiSDK scope.)


QiDisconnectionListener

interface QiDisconnectionListener

Session disconnection listener.

(Likely has a method onRobotDisconnected() or similar to be overridden.)


QiEnum

interface QiEnum

(A marker interface for enumerations in QiSDK that map to int values.)

Functions

  • getQiValueabstract fun getQiValue(): Int

Inheritors

The following classes implement QiEnum (each representing a specific enumerated type in the SDK):

  • AttentionState – Enum containing the possible attention states of the human (where the human is looking). class AttentionState : QiEnum
  • AutonomousReactionImportance – Additional info on the importance of a suggested ChatbotReaction. class AutonomousReactionImportance : QiEnum
  • AutonomousReactionValidity – Describes the validity of a suggested ChatbotReaction. class AutonomousReactionValidity : QiEnum
  • BodyLanguageOption – Body language policy. class BodyLanguageOption : QiEnum
  • ChatbotReactionHandlingStatus – Status of a ChatbotReaction (in a Chat action). class ChatbotReactionHandlingStatus : QiEnum
  • DegreeOfFreedom – A degree of freedom. class DegreeOfFreedom : QiEnum
  • EngagementIntentionState – Engagement intention of the human. class EngagementIntentionState : QiEnum
  • EngagementPolicy – Eye contact policy. class EngagementPolicy : QiEnum
  • ExcitementState – Perceived energy in emotion. class ExcitementState : QiEnum
  • Gender – Different genders of a human. class Gender : QiEnum
  • Language – Possible languages. class Language : QiEnum
  • LocalizationStatus – Localization process status. class LocalizationStatus : QiEnum
  • LookAtMovementPolicy – Strategies to look at a target. class LookAtMovementPolicy : QiEnum
  • OrientationPolicy – Orientation control policy. class OrientationPolicy : QiEnum
  • PathPlanningPolicy – Path planning strategies. class PathPlanningPolicy : QiEnum
  • PleasureState – States on pleasure-displeasure scale. class PleasureState : QiEnum
  • Region – All possible regions. class Region : QiEnum
  • ReplyPriority – Priority of a Chatbot reply. class ReplyPriority : QiEnum
  • SmileState – Possible smiling states of the human. class SmileState : QiEnum

(Each of the above classes is documented separately in this collection.)


QiRobot

open class QiRobot : Callback

Represents a connection to a robot.

Functions

  • onRobotAbsentopen fun onRobotAbsent(): Unit – Callback when the robot becomes absent.
  • onRobotLostopen fun onRobotLost(): Unit – Callback when the robot connection is lost.
  • onRobotReadyopen fun onRobotReady(session: Session!): Unit – Callback when the robot is ready (session connected), providing the Session.

(QiRobot likely implements some interface with these callbacks for robot connection events.)


QiSDK

open class QiSDK

Helper to initialize Qi SDK.

Constructors

  • QiSDK() – Helper to initialize Qi SDK.

Properties

  • VERSIONstatic val VERSION: String! – The SDK version string.

Functions

getSerializer

open static fun getSerializer(): QiSerializer!

(Returns an instance of QiSerializer for custom object serialization support.)

init

open static fun init(application: Application!): Unit

Initialize the QiSDK with the given Android application context.
Parameters:

  • applicationApplication!: The Android Application instance.

register

open static fun register(activity: Activity!, callbacks: RobotLifecycleCallbacks!): Unit

Add a new RobotLifecycleCallbacks for the default QiRobot.
Parameters:

  • activityActivity!: The Android Activity to associate with the callbacks.
  • callbacksRobotLifecycleCallbacks!: The callback implementation to register.

unregister

open static fun unregister(activity: Activity!, callbacks: RobotLifecycleCallbacks!): Unit

Remove a RobotLifecycleCallbacks for the default QiRobot.
Parameters:

  • activityActivity!: The Activity associated with the callbacks.
  • callbacksRobotLifecycleCallbacks!: The callback implementation to remove.
open static fun unregister(activity: Activity!): Unit

Unregister all RobotLifecycleCallbacks for the default QiRobot associated with the given activity.
Parameters:

  • activityActivity!: The Activity for which all callbacks should be removed.
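
The usual wiring, as in the official getting-started guide, registers the callbacks in onCreate and unregisters them in onDestroy (sketch; RobotLifecycleCallbacks is documented below):

```kotlin
// Hedged sketch of the standard lifecycle wiring.
class MyActivity : AppCompatActivity(), RobotLifecycleCallbacks {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        QiSDK.register(this, this)
    }

    override fun onDestroy() {
        QiSDK.unregister(this, this)
        super.onDestroy()
    }

    override fun onRobotFocusGained(qiContext: QiContext) {
        // The robot focus is available: build and run actions here.
    }

    override fun onRobotFocusLost() {
        // Release references to the QiContext and any action objects.
    }

    override fun onRobotFocusRefused(reason: String) {
        // The focus request was denied; log the reason.
    }
}
```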

QiThreadPool

class QiThreadPool

Shared thread pool. (Mainly for internal purpose.)

Functions

  • executestatic fun <V : Any!> execute(callable: Callable<V>!): Future<V>! – Execute a callable on the thread pool.
    Parameters:
    • callableCallable<V>!: The task to execute.
      Return: Future<V>! – A future representing the pending result.
  • schedulestatic fun <V : Any!> schedule(callable: Callable<V>!, delay: Long, timeUnit: TimeUnit!): Future<V>! – Execute a callable with a delay on the thread pool.
    Parameters:
    • callableCallable<V>!: The task to execute.
    • delayLong: The delay before execution.
    • timeUnitTimeUnit!: The time unit of the delay.
      Return: Future<V>! – A future representing the scheduled task.
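
A brief sketch of the schedule variant (mainly internal, shown for completeness; `andThenConsume` is the qi Future chaining method):

```kotlin
// Hedged sketch: run a task after a delay on the shared pool.
val future: Future<String> = QiThreadPool.schedule(
    Callable { "done" },
    2L,
    TimeUnit.SECONDS
)
future.andThenConsume { value -> Log.i("PoolSample", "Task returned: $value") }
```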

Quaternion

class Quaternion

Quaternion representation of rotations. See: Quaternion (Wikipedia) for more information.

(Likely has components x, y, z, w and possibly methods to convert to/from Euler angles or multiply quaternions.)


Region

class Region : QiEnum

All the possible regions.

Since: 1

Enum Values

(Likely values like NORTH_AMERICA, EUROPE, etc., but not listed in summary.)


RegionUtil

object RegionUtil

Region utility class.


ReplyPriority

class ReplyPriority : QiEnum

Additional information on the priority of a Chatbot reply.

Since: 1

Enum Values

  • NORMAL
  • FALLBACK

(The summary does not list the values explicitly; the SDK defines NORMAL for regular replies and FALLBACK for low-priority fallback replies.)

Functions

  • getQiValuefun getQiValue(): Int

ReplyReaction

interface ReplyReaction

A Chatbot reaction to a human utterance.

(Likely similar to ChatbotReaction or a subtype thereof, representing a reply from the bot.)


Requirement

interface Requirement

A requirement creates and holds a Future, which represents the value once satisfied.

(This likely relates to the QiSDK Requirement framework for dynamic condition checking. Not further detailed on site.)


ResourceNode

open class ResourceNode

Struct representing a resource node which has a unique URL among the triple database.

(In the knowledge graph, a ResourceNode represents an entity with a unique identifier (URI).)


RobotContext

interface RobotContext

The context object gathers together all the handles and tokens that authorize an action to be effectively executed on a robot.

(Used internally to ensure an action has permission to run on the robot, typically obtained via QiContext and not manipulated directly by developers.)


RobotContextFactory

interface RobotContextFactory

The context factory is a service providing a RobotContext.

(Likely used to create RobotContext instances for actions.)


RobotLifecycleCallbacks

interface RobotLifecycleCallbacks

Robot lifecycle callback interface.

(The user should implement this to receive QiSDK events in an Activity lifecycle: onRobotFocusGained, onRobotFocusLost, etc.)

(Typically contains methods like onRobotFocusGained(QiContext), onRobotFocusLost(), onRobotFocusRefused(String) as per QiSDK documentation.)


RunnableAutonomousReaction

abstract class RunnableAutonomousReaction

Alternative AutonomousReaction that allows the implementation of its execution by subclasses instead of delegating it to a ChatbotReaction.


RunnableReplyReaction

abstract class RunnableReplyReaction

Alternative ReplyReaction that allows the implementation of its execution by subclasses instead of delegating it to a ChatbotReaction.


Say

interface Say

Action to make the robot say a phrase.

Since: 1

Types

  • interface Async

Functions

async

abstract fun async(): Say.Async!

run

Start saying the phrase.

abstract fun run(): Unit

(No return; the action will complete when the speech is done.)

setLocale

abstract fun setLocale(locale: Locale!): Unit

Set the locale (language) for this Say action.
Parameters:

  • localeLocale!: The locale in which to speak the phrase.

SayBuilder

open class SayBuilder

Build a new Say.

Functions

  • buildopen fun build(): Say!
  • buildAsyncopen fun buildAsync(): Future<Say>!
  • withopen static fun with(context: QiContext!): SayBuilder!
  • withPhraseopen fun withPhrase(phrase: Phrase!): SayBuilder! – Set the phrase to say.
  • withLocaleopen fun withLocale(locale: Locale!): SayBuilder! – Set the locale for the phrase.
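
A minimal sketch of the builder in use, with the Locale constructor taking a Language and a Region as documented in this collection:

```kotlin
// Hedged sketch: say a phrase in US English.
val say: Say = SayBuilder.with(qiContext)
    .withPhrase(Phrase("Hello, I am Pepper!"))
    .withLocale(Locale(Language.ENGLISH, Region.UNITED_STATES))
    .build()
say.run() // blocking; completes when the speech is done
```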

ServiceRequirement

interface ServiceRequirement

(No description; likely an internal requirement that a given service is accessible.)


ServiceUnavailableException

open class ServiceUnavailableException : RuntimeException

Constructors

  • ServiceUnavailableException(serviceName: String!) – Thrown when a required service is not available. (The parameter serviceName indicates which service was unavailable.)

SessionRequirement

interface SessionRequirement

(No description; likely represents requirement for an active session.)


SmileState

class SmileState : QiEnum

Enum containing the possible smiling states of the human. This feature is only based on facial expression.

Since: 1

Enum Values

  • UNKNOWN
  • NOT_SMILING
  • SMILING

Functions

  • getQiValuefun getQiValue(): Int

SpeechEngine

interface SpeechEngine

Factory to create Say actions.

(Likely used internally when creating Say from text, not commonly used directly by developers.)


StandardAutonomousReaction

class StandardAutonomousReaction

Default implementation of AutonomousReaction.


StandardReplyReaction

class StandardReplyReaction

Default implementation of ReplyReaction.


StreamableBuffer

interface StreamableBuffer

A buffer that can be read incrementally (little by little).

(Used for streaming data, perhaps audio or video frames.)


StreamableBufferConverter

interface StreamableBufferConverter

(No additional description provided.)


StreamableBufferFactory

interface StreamableBufferFactory

Factory of StreamableBuffer.

(Probably used to create StreamableBuffer instances for given data streams.)


TakePicture

interface TakePicture

Action to take pictures on the robot.

Since: 1

Types

  • interface Async

Functions

async

abstract fun async(): TakePicture.Async!

run

Take a picture.

abstract fun run(): EncodedImage!

Return: EncodedImage! – The captured image.

setResolution

abstract fun setResolution(resolution: Int): Unit

(Not documented in detail on the official site; if available, sets the camera resolution used for pictures, with resolution likely being an index or constant.)


TakePictureBuilder

open class TakePictureBuilder

Build a new TakePicture.

Functions

  • build – open fun build(): TakePicture!
  • buildAsync – open fun buildAsync(): Future<TakePicture>!
  • with – open static fun with(context: QiContext!): TakePictureBuilder!
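A minimal usage sketch, assuming a `qiContext` from `onRobotFocusGained()`. Note that depending on the SDK version, `run()` may return a timestamped image handle rather than an `EncodedImage` directly; check the official reference for the exact result type.

```kotlin
// Sketch, not official sample code: take a picture asynchronously.
val takePicture: TakePicture = TakePictureBuilder.with(qiContext).build()

takePicture.async().run().andThenConsume { picture ->
    // Retrieve the image data from the result for display or storage.
    Log.i("TakePicture", "Picture taken: $picture")
}
```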

TimestampedImage

interface TimestampedImage

Timestamped encoded image.

(Likely an image with an associated timestamp property.)


TimestampedImageHandle

interface TimestampedImageHandle

Associates a timestamp with an EncodedImageHandle.


TimeString

open class TimeString

Struct representing a time as a string.


Topic

open class Topic

Object representing a topic (for QiChat).

(Contains dialog content such as rules, replies, etc., loaded from .top files.)


TopicBuilder

open class TopicBuilder

Build a new Topic.

Functions

  • build – open fun build(): Topic!
  • buildAsync – open fun buildAsync(): Future<Topic>!
  • with – open static fun with(context: QiContext!): TopicBuilder!
  • withAsset – open fun withAsset(assetName: String!): TopicBuilder! – Set the topic content from a topic file in the app assets.
  • withResource – open fun withResource(resId: Int): TopicBuilder! – Set the topic content from a raw resource.
    (Signatures reconstructed from common qiSDK usage; verify against the official reference.)
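A usage sketch, assuming a `qiContext` and a hypothetical `res/raw/greetings.top` QiChat file:

```kotlin
// Sketch, not official sample code: load a QiChat topic from a raw resource.
val topic: Topic = TopicBuilder.with(qiContext)
    .withResource(R.raw.greetings) // Hypothetical resource name.
    .build()

// The topic can then feed a chatbot, e.g. (builder names per qiSDK conventions):
// val qiChatbot = QiChatbotBuilder.with(qiContext).withTopic(topic).build()
```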

TopicStatus

open class TopicStatus

The current state of a topic in a Discuss action.

(Likely indicates if topic is started, stopped, etc.)


Touch

interface Touch

The Touch service provides objects to access and subscribe to touch sensor data.

Since: 1

Types

  • interface Async

Functions

async

abstract fun async(): Touch.Async!

getSensors

abstract fun getSensors(): List<TouchSensor>!

Get the list of touch sensors on the robot.
Return: List<TouchSensor>! – All available touch sensors.

getSensor

abstract fun getSensor(name: String!): TouchSensor?

Get a specific touch sensor by name.
Parameters:

  • nameString!: The name of the touch sensor (e.g., "Head/Touch").
    Return: TouchSensor? – The sensor, or null if not found.

TouchConverter

interface TouchConverter

(No additional description provided.)


TouchSensor

interface TouchSensor

Object representing a sensor that detects when the robot is touched.

Since: 1

Types

  • interface Async
  • interface OnStateChangedListener – Listener for touch state changes.

Functions

  • addOnStateChangedListener – abstract fun addOnStateChangedListener(listener: OnStateChangedListener!): Unit
  • async – abstract fun async(): TouchSensor.Async!
  • getFrame – abstract fun getFrame(): Frame! – Exposes the frame of the touch sensor on the robot.
  • getName – abstract fun getName(): String! – Exposes the name of the sensor.
  • getState – abstract fun getState(): TouchState! – Exposes the current state (touched or not).
  • removeAllOnStateChangedListeners – abstract fun removeAllOnStateChangedListeners(): Unit
  • removeOnStateChangedListener – abstract fun removeOnStateChangedListener(listener: OnStateChangedListener!): Unit
  • setOnStateChangedListener – abstract fun setOnStateChangedListener(listener: OnStateChangedListener!): Unit (deprecated; use addOnStateChangedListener instead)
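Combining the Touch service and a TouchSensor, subscribing to head touch events might look like this sketch. It assumes a `qiContext`; "Head/Touch" is one of the sensor names commonly exposed on Pepper.

```kotlin
// Sketch, not official sample code: react to head touches.
val touch: Touch = qiContext.touch
val headSensor: TouchSensor? = touch.getSensor("Head/Touch")

headSensor?.addOnStateChangedListener { touchState ->
    // touchState.touched indicates whether the sensor is currently touched.
    Log.i("Touch", "Head touched: ${touchState.touched}")
}

// Remember to remove listeners when the robot focus is lost:
// headSensor?.removeAllOnStateChangedListeners()
```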

TouchState

interface TouchState

Description of a touch sensor state.

(Likely an interface or class with information whether the sensor is touched, possibly pressure, etc.)


Transform

open class Transform

A homogeneous transformation matrix. See: Transformation matrix (Wikipedia) for more information. It is represented by a 3D vector and a quaternion.

(Transform likely has translation (Vector3) and rotation (Quaternion) components.)


TransformBuilder

open class TransformBuilder

Build a new Transform.

Functions

  • create – open static fun create(): TransformBuilder! – Start building a Transform.
  • fromXTranslation – open fun fromXTranslation(x: Double): Transform! – Create a Transform from a translation along the X axis.
  • from2DTranslation – open fun from2DTranslation(x: Double, y: Double): Transform! – Create a Transform from a 2D translation in the X-Y plane.
  • fromRotation – open fun fromRotation(rotation: Quaternion!): Transform! – Create a Transform from a rotation.
  • from2DTransform – open fun from2DTransform(x: Double, y: Double, theta: Double): Transform! – Create a Transform from a 2D translation and a rotation of theta radians around the Z axis.
    (Method names reconstructed from common qiSDK usage; verify against the official reference.)
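Assuming the `fromXTranslation`/`from2DTransform` style seen in qiSDK tutorials, building transforms might look like:

```kotlin
// Sketch, not official sample code: build transforms relative to a frame.
// One metre ahead along the X axis:
val forwardOneMetre: Transform = TransformBuilder.create()
    .fromXTranslation(1.0)

// A 2D translation of (1.0, 0.5) metres plus a 90-degree rotation around Z:
val diagonal: Transform = TransformBuilder.create()
    .from2DTransform(1.0, 0.5, Math.PI / 2)
```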

TransformTime

open class TransformTime

A transform associated with a timestamp.

(Contains a Transform and the time at which it was valid.)


Triple

open class Triple

Struct representing a triple (subject, predicate, object) in a knowledge graph. Subject and predicate are always resources; the object can be a resource or a literal (here encapsulated in a Node).


Vector3

open class Vector3

A generic 3D vector.

(Likely has x, y, z components and possibly utility methods like norm, etc.)