Automation services for business processes, such as robotic process automation (RPA), have recently attracted increasing attention. These services typically require advanced information technology literacy to create automation programs, so it is difficult for non-expert users to make use of them. A smart interface that can create and execute an automation program specified by a natural language (NL) description would therefore be useful; such an interface is one realization of intelligent process automation (IPA). We focus on the natural language processing method that plays the central role in this interface: semantic parsing. Semantic parsing consists of three components: (a) a natural language description, (b) a machine-executable meaning representation (MEMR), and (c) a parser that converts (a) into (b). An MEMR can be a formal expression that follows a specific grammar. The workflows targeted by automation systems such as RPA are generally complex, being composed of combinations of multiple tasks. In studies on semantic parsing, (1) neither the formal expression nor the formal grammar for expressing such workflows has been sufficiently examined, and consequently (2) there has been no dataset pairing a complex workflow with its corresponding NL description.
Formal Expression and Grammar
With respect to the expression and the grammar, we focus on two problems: the expression unit, and a formal grammar for complex workflows.
Several types of expressions exist, depending on the abstraction level. Many studies have addressed parsing an NL description into code, such as transition-based neural semantic parsing like TranX [12, 13]. The field of code generation typically uses expressions with low abstraction: the code itself. Such an expression becomes bloated when it must express a complex workflow composed of hundreds of lines of code. Consider instead an expression with high abstraction, much closer to human language, as exemplified by calls to macros, APIs, and modules, such as "Insert Rows Into a Google Spreadsheet" (this can be considered a "no-code/low-code" expression, a notion recently refocused in RPA); this is a straightforward way to avoid the bloating problem. An expression with high abstraction is typically used in the trigger-action program (TAP) format, which performs a certain action when a certain trigger occurs. A notable TAP dataset is the IFTTT dataset [10, 8]. However, the workflows targeted by the IFTTT dataset are too simple for handling complex workflows.
Although it is necessary to express a series of processes as a whole in one shot, collecting such datasets is inherently difficult. We therefore focus on the form of the TAP chain in formulating a formal grammar. A TAP can be considered a type of MEMR. The advantage of TAP is that it expresses its event-driven essence in a straightforward manner and is therefore highly versatile; moreover, datasets and prior studies on TAP already exist. Complex workflows are composed of process chains like "If this is triggered, then do this action and do this other action separately, and finally do this action." A single TAP, in contrast, ends with "If this is triggered, then do this action" ((a) in Figure 1), but we can assume that performing this action causes another trigger ((b) in Figure 1). Thus, we can express a complex workflow by repeatedly chaining TAPs ((c) in Figure 1). We call this a "TAP chain" and incorporate it into the grammar formulation (it also seems important to consider the type or category of an MEMR; the type may include semantic content indicating how closely each task connects to the next).
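The TAP-chain idea above can be sketched in a few lines of Python. This is our own hypothetical encoding, not the paper's formal grammar: a TAP is a (trigger, action) pair, and two TAPs chain when the first TAP's action fires the second TAP's trigger.

```python
# Minimal sketch of TAP chaining (hypothetical encoding).
from typing import NamedTuple

class TAP(NamedTuple):
    trigger: str
    action: str

def chain(taps):
    """Validate a TAP chain and flatten it into the executed step sequence."""
    for prev, nxt in zip(taps, taps[1:]):
        if prev.action != nxt.trigger:
            raise ValueError(f"{prev.action!r} does not fire {nxt.trigger!r}")
    return [taps[0].trigger] + [t.action for t in taps]

steps = chain([
    TAP("Any_Missed_Phone", "Voice_to_Text"),  # (a) a single TAP
    TAP("Voice_to_Text", "Send_Text_to_Me"),   # (b) its action fires a new trigger
])
assert steps == ["Any_Missed_Phone", "Voice_to_Text", "Send_Text_to_Me"]
```

The validation step makes the chaining condition explicit: a complex workflow is well-formed only when every action can actually fire the trigger that follows it.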
This facilitates conversion to complex workflows through relatively simple MEMRs (in this paper we use the TAP chain-based formal expression for workflows for the reasons described above, but other formal expressions for workflows are also conceivable; we reserve this issue for future work). In particular, it enables us to apply the already-mentioned transition-based neural semantic parsing to a workflow intricately combined from such MEMRs. Transition-based neural semantic parsing does not generate code directly but generates a sequence of grammar rules (in the form of a context-free grammar); TranX mainly targets a low abstraction level, its main focus being to ensure that parsing results are legal with respect to the Python abstract grammar. To the best of our knowledge, no such study has been conducted thus far because an adequate grammar for such workflows and their parsing has been lacking.
Complex Workflow Dataset
To create the complex workflow dataset, we use the IFTTT dataset, which contains TAP programs with their corresponding NL descriptions. From the IFTTT dataset, we manually extract the relationships in which one TAP triggers another, and generate TAP chain rules. A chain rule is then applied randomly to generate a complex workflow; because the application is random, the usefulness of the generated workflow is annotated manually. The NL description corresponding to the created workflow is generated by fusing the NL descriptions of the TAPs that constitute the workflow. We consider two approaches: rule-based generation, and sentence fusion (an approach from abstractive summarization). The generated NL descriptions are manually annotated, especially for the test data.
The main contributions of this study are the definition of a new grammar for semantic parsing into complex workflows and an approach to creating a dataset based on this grammar. In the following, we concisely review related work, propose the workflow patterns grammar (WPG) and the dataset creation procedure, briefly describe the model, and conclude this short paper.
The process of converting NL into an MEMR is known as semantic parsing. A typical example is SQL generation for database queries: a semantic parser translates the sentence "How many people live in Seattle?" to "SELECT Population FROM CityData WHERE City = 'Seattle'", and the SQL query is then executed to obtain the correct answer, "620,778". Another typical example is code generation, where a single function declaration or class declaration is viewed as an MEMR. Rather than generating code directly, transition-based neural semantic parsing such as TranX generates a sequence of ASDL grammar rules that are sequentially expanded and applied to produce the MEMR [12, 13]. However, these parsing models mainly target formal expressions with a low abstraction level.
A typical TAP web service is IFTTT [10, 8]. As an example, the NL description "youtube upload to blogger new post" is converted into an MEMR with a high abstraction level: the TRIGGER "YouTube.New_public_video_uploaded_by_you" and the ACTION "Blogger.Create_a_post." Simple grammar-based semantic parsing without a neural model has been studied, and neural semantic parsing methods have been proposed [1, 7, 3, 4]. Furthermore, dialogue models and a reinforcement-learning dialogue model have also been proposed. However, the workflows targeted by IFTTT itself are too simple to directly automate complex real-world workflows.
Workflow Patterns Grammar
In this paper, we define a new grammar for semantic parsing into complex workflows (Table 1). This enables us to apply transition-based neural semantic parsing to a workflow intricately combined from MEMRs. The specific notation follows that of TranX [13], which mainly refers to the Python ASDL grammar: the notation "?" represents the optional type, which can have one value or a null value, and the notation "*" represents the sequential type, which can have two or more values.
(1) stmt → Workflow(wpg pattern)
(2) wpg  → Sequence(func? trigger, func action)
         | Parallel_Split(func? trigger, func* action)
(3) func → Call(type channel, wpg? next)
(4) type → Type_A | Type_B | Type_C | …
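The productions of Table 1 can be sketched as plain Python dataclasses. The constructor and field names follow the table, but the encoding itself is our own illustration, not part of the paper's implementation.

```python
# Sketch of the WPG productions as Python dataclasses (hypothetical encoding).
from dataclasses import dataclass
from typing import List, Optional, Union

@dataclass
class Call:                   # (3) func -> Call(type channel, wpg? next)
    channel: str              # concrete macro method, e.g. "SMS.Send_Text_to_Me"
    next: Optional["WPGNode"] = None

@dataclass
class Sequence:               # (2) wpg -> Sequence(func? trigger, func action)
    trigger: Optional[Call]
    action: Call

@dataclass
class ParallelSplit:          # (2) wpg -> Parallel_Split(func? trigger, func* action)
    trigger: Optional[Call]
    actions: List[Call]

WPGNode = Union[Sequence, ParallelSplit]

@dataclass
class Workflow:               # (1) stmt -> Workflow(wpg pattern)
    pattern: WPGNode

# A two-step chain; the inner Sequence's trigger is null (deduplicated).
wf = Workflow(Sequence(Call("Android.Any_Missed_Phone"),
                       Call("Watson_API.Voice_to_Text",
                            next=Sequence(None, Call("SMS.Send_Text_to_Me")))))
```

The optional `next` field on `Call` is what lets a single TAP grow into a TAP chain: each completed task can hand control to a further wpg node.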
Line (1) of the table indicates the start point of workflow generation: the stmt type evokes the Workflow constructor with a wpg-type argument called "pattern." This expansion provides the start point of the workflow.
In workflows such as those in office workplaces, certain patterns occur repeatedly. Russell et al. introduced five basic patterns that capture the elementary aspects of control flow: sequence, parallel split, synchronization, exclusive choice, and simple merge. In this study, we adopt two of them, sequence and parallel split, to maintain the tree structure and simplicity of the MEMRs. Their constructors are shown in line (2) of Table 1: the first expands wpg to the sequence pattern (i.e., it evokes the sequence constructor), and the second expands wpg to the parallel split pattern (i.e., it evokes the parallel split constructor).
The WPG expression in this paper also considers the flow of processed data. For example, the "Send Text to Me" function has no return value and therefore cannot connect to a following function. Since the "Archive Text in Spread Sheet" function must receive text data, it cannot follow "Send Text to Me," which returns nothing. When functions produce different output data, it is straightforward to handle them in different branches. Transition-based neural semantic parsing learns the sequence of grammar expansions corresponding to NL descriptions; in this learning, the parallel split is expected to act as a signal token for deciding whether the subsequent flow should branch, based on the preceding trigger or action and the NL description.
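The data-flow constraint above can be checked mechanically. The sketch below uses hypothetical return/input type tables (not from the paper) to decide whether one function may follow another.

```python
# Hypothetical data-flow check: a function whose output is None cannot
# precede a function that requires input data.
RETURN_TYPE = {  # assumed table, for illustration only
    "SMS.Send_Text_to_Me": None,
    "Watson_API.Voice_to_Text": "text",
}
INPUT_TYPE = {
    "Google_Drive.Archive_Text_in_Spread_Sheet": "text",
    "SMS.Send_Text_to_Me": "text",
}

def can_follow(prev: str, nxt: str) -> bool:
    """True if prev's output satisfies nxt's input requirement."""
    required = INPUT_TYPE.get(nxt)
    return required is None or RETURN_TYPE.get(prev) == required

assert can_follow("Watson_API.Voice_to_Text", "SMS.Send_Text_to_Me")
assert not can_follow("SMS.Send_Text_to_Me",
                      "Google_Drive.Archive_Text_in_Spread_Sheet")
```

Such a check could prune illegal continuations during workflow generation, mirroring the constraint that "Archive Text in Spread Sheet" cannot follow "Send Text to Me."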
As already mentioned, a complex workflow is generated from simple MEMRs by considering a TAP chain, as presented in line (2) of Table 1. The sequence pattern takes a func? argument called "trigger" and a func argument called "action." Because this pattern simply activates the "action" when the "trigger" is evoked, two arguments suffice. When two sequence patterns are connected sequentially (for example, Sequence(Function A, Function B) followed by Sequence(Function B, Function C)), the action of the first pattern is identical to the trigger of the second, so the function is duplicated. We therefore expand the second sequence pattern as Sequence(null, Function C).
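The deduplication rule above can be sketched as follows (representing each sequence pattern as a simple (trigger, action) pair; names are illustrative only).

```python
# Sketch of the deduplication rule: when the first sequence's action equals
# the second sequence's trigger, the second sequence's trigger becomes null.
def connect(seq1, seq2):
    """Each seq is a (trigger, action) pair; returns the connected pair."""
    _, a1 = seq1
    t2, a2 = seq2
    if a1 == t2:            # the shared function would otherwise appear twice
        seq2 = (None, a2)   # i.e., Sequence(null, Function C)
    return seq1, seq2

first, second = connect(("Function A", "Function B"),
                        ("Function B", "Function C"))
assert second == (None, "Function C")
```

Dropping the duplicated trigger keeps the MEMR compact: each concrete function appears exactly once in the chained expression.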
The parallel split pattern’s arguments are the func? type argument called “trigger” and the func* type argument called “action.” This pattern splits the preceding function’s result into two or more functions.
func is the type that evokes the Call constructor, as presented in line (3) of Table 1. This constructor controls the next task and the workflow to be executed after the task completes. The concrete task is derived by expanding the type argument called "channel"; if the task is followed by another task, the constructor receives a value for the wpg? argument called "next." type is expanded to a concrete macro method class that has concrete functions belonging to it, as presented in line (4) of Table 1.
Workflow Represented in Abstract Syntax Tree (AST)
Consider a specific example of a workflow represented as an AST (WAST), such as the workflow depicted in Figure 2 (to simplify the discussion, we use a workflow that is more complex than a single TAP but still relatively simple; the proposed framework applies to workflows far more complex than this example).
When WPG is applied to this workflow, the resulting WAST is shown in Figure 3. The WPG rules applied sequentially are listed in Table 2.
stmt root      → Workflow(wpg pattern)
wpg pattern    → Sequence(func? trigger, func action)
func? trigger  → Call(type channel, wpg? next)
wpg? next      → StopExpnsn (close the frontier field)
func action    → Call(type channel, wpg? next)
wpg? next      → Parallel_Split(func? trigger, func* action)
func? trigger  → StopExpnsn (close the frontier field)
func* action   → Call(type channel, wpg? next)
wpg? next      → StopExpnsn (close the frontier field)
func* action   → Call(type channel, wpg? next)
wpg? next      → StopExpnsn (close the frontier field)
func* action   → StopExpnsn (close the frontier field)
The formal expression for parsing is Sequence(Android.Any_Missed_Phone, Parallel_Split(Watson_API.Voice_to_Text, SMS.Send_Text_to_Me, Google_Drive.Archive_Text_in_Spread_Sheet)).
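This parsed expression can be held as a nested structure and traversed. The sketch below uses plain tuples (constructor name first, then arguments) as a hypothetical in-memory form of the WAST.

```python
# The example WAST as nested tuples (hypothetical encoding).
wast = ("Sequence",
        ("Call", "Android.Any_Missed_Phone"),
        ("Parallel_Split",
         ("Call", "Watson_API.Voice_to_Text"),
         [("Call", "SMS.Send_Text_to_Me"),
          ("Call", "Google_Drive.Archive_Text_in_Spread_Sheet")]))

def leaves(node):
    """Yield the concrete functions (channels) of the workflow in order."""
    if isinstance(node, tuple) and node[0] == "Call":
        yield node[1]
    for child in (node[1:] if isinstance(node, tuple) else node):
        if isinstance(child, (tuple, list)):
            yield from leaves(child)

assert list(leaves(wast)) == [
    "Android.Any_Missed_Phone",
    "Watson_API.Voice_to_Text",
    "SMS.Send_Text_to_Me",
    "Google_Drive.Archive_Text_in_Spread_Sheet",
]
```

A traversal like this is what an execution backend would use to dispatch the concrete macro methods once parsing is complete.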
We propose an approach to creating training and test datasets for learning transition-based neural semantic parsing of complex workflows with TAP chains. We basically assume that the dataset is annotated manually. We use an existing TAP dataset that includes an NL description for each TAP. This dataset is beneficial because the TAPs and NL descriptions were actually created and used by real users, and the NL descriptions allow annotators to reuse them when creating NL descriptions for entire workflows.
In the IFTTT data, when a trigger function fires, an action function is invoked. There can be cases in which an action function in one TAP simultaneously fires a trigger function in another TAP. We manually annotated such action-evokes-trigger relationships and determined the TAP chaining rules. These chain rules correspond to the vertical expansion of TAPs in a workflow (Figure 4).
The horizontal expansion of TAPs, in contrast, is simple: it suffices to execute multiple actions that share the same trigger (Figure 5). Together, these expansions make it possible to create complex workflows from TAP chains.
A complex workflow is generated by randomly chaining TAPs. However, it is unclear whether an automatically generated workflow is genuinely useful, so each generated workflow is annotated as (A) convenient and frequently used, (B) possible to use, or (C) inconvenient and unused. Each TAP used as an element of workflow generation is limited to TAPs actually created and used by real users; in other words, each TAP's combination of trigger and action is assumed to be useful to some user.
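The generation step can be sketched as a random walk over the extracted chain rules. The tables below are toy stand-ins for the manually annotated action-evokes-trigger rules and the real-user TAPs; all names are illustrative.

```python
# Hypothetical sketch of random workflow generation from TAP chain rules.
import random

CHAIN_RULES = {  # action -> triggers it can fire (vertical expansion)
    "Voice_to_Text": ["Text_Created"],
}
TAPS = {         # trigger -> actions seen in real-user TAPs
    "Any_Missed_Phone": ["Voice_to_Text"],
    "Text_Created": ["Send_Text_to_Me", "Archive_Text_in_Spread_Sheet"],
}

def generate(trigger, rng):
    """Grow a workflow from a trigger by random chaining."""
    actions = TAPS.get(trigger, [])
    if not actions:
        return trigger
    action = rng.choice(actions)
    fired = CHAIN_RULES.get(action, [])
    if not fired:
        return (trigger, action)            # chain ends here
    return (trigger, action, generate(rng.choice(fired), rng))

wf = generate("Any_Missed_Phone", random.Random(0))
# wf nests (trigger, action, continuation) until no chain rule applies;
# its usefulness (A/B/C) would then be judged by a human annotator.
```

Because the chaining is random, the output is only a candidate; the A/B/C usefulness annotation described above filters the results.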
Generation of NL instructions
Furthermore, the graphical form of an automatically generated workflow, like Figure 2, is shown to an annotator, who is asked to write the instruction/description they would give when asking a machine to perform the workflow (in the annotation process, we first generate instructions/descriptions automatically by rule-based summarization, and annotators then review and modify them; sentence fusion models could also be used for this generation). Consequently, pairings of NL descriptions and workflows represented as WASTs can be generated.
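The rule-based generation step can be sketched with a single joining template. The template and descriptions are hypothetical; the paper's actual rules may differ, and the output here is only a draft for annotators to revise.

```python
# Hypothetical rule-based fusion of per-TAP NL descriptions into one draft.
def fuse(descriptions):
    """Join per-TAP descriptions into one candidate workflow description."""
    return ", then ".join(descriptions)

draft = fuse([
    "when I miss a phone call, convert the voicemail to text",
    "send the text to me",
    "archive the text in a spreadsheet",
])
# draft is a single sentence chaining the three steps with ", then".
```

Even a crude template like this gives annotators a starting point, which the paper notes is then reviewed and corrected by hand (or replaced by a learned sentence fusion model).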
With respect to the model, we follow TranX [13]. Transition-based neural semantic parsing takes as input an NL utterance consisting of words. At each step, the parser outputs one of three transitions: "ApplyConstr[c]" applies a WPG rule having constructor c, "SelectMacr" generates a terminal token (a function), and "StopExpnsn" stops generating optional or sequential arguments.
The probability of generating a WAST y from an NL instruction x is p(y | x) = ∏_t p(a_t | a_{<t}, x), where a_t is the transition chosen at step t.
The model is trained to maximize the log-likelihood of the transition sequence; the best WAST is then inferred from the NL description using beam search.
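The training objective can be illustrated numerically. The probabilities below are made-up per-step model outputs; the point is only that the sequence score factorizes over transitions.

```python
# Sketch of the factorized sequence score used in training:
# log p(y|x) = sum_t log p(a_t | a_<t, x).
import math

def sequence_log_prob(step_probs):
    """Sum of per-transition log-probabilities."""
    return sum(math.log(p) for p in step_probs)

# e.g. three transitions scored 0.9, 0.8, 0.5 by the model:
lp = sequence_log_prob([0.9, 0.8, 0.5])
assert math.isclose(math.exp(lp), 0.9 * 0.8 * 0.5)
```

Maximizing this sum over gold transition sequences is the training objective; beam search at inference keeps the highest-scoring partial sequences under the same factorization.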
In this paper, we assume a simple design in which each thread progresses independently, and we focus only on WAST forms with a tree structure; in other words, flows that have branched never rejoin. However, complex workflows usually include the simple merge pattern, in which branched flows merge at some later point. WPG therefore needs to be extended to a graph structure.
This paper referred to general workflow patterns. With the spread of RPA, data on business processes are accumulating, and common patterns may exist across companies. It would therefore be useful to extract such common workflow patterns from real usage data of RPA products and incorporate them into WPG.
In this study, we defined a new grammar for chaining high-abstraction-level MEMRs for semantic parsing into complex workflows, and we proposed an approach to generating a dataset based on this grammar. We expect this to enable an NL interface for the complex workflows assumed by IPA. In future work, we intend to perform semantic parsing on datasets created by this approach.
[1] (2016) Improved semantic parsers for if-then statements. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Berlin, Germany, pp. 726–736.
[2] (2017) Dialog for language to code. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), Taipei, Taiwan, pp. 175–180.
[3] (2016) Language to logical form with neural attention. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Berlin, Germany, pp. 33–43.
[4] (2018) Confidence modeling for neural semantic parsing. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Melbourne, Australia, pp. 743–753.
[5] (2018) Neural semantic parsing. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, Melbourne, Australia, pp. 17–18.
[6] (2019) Analyzing sentence fusion in abstractive summarization. In Proceedings of the 2nd Workshop on New Frontiers in Summarization, Hong Kong, China, pp. 104–110.
[7] (2016) Latent attention for if-then program synthesis. In Advances in Neural Information Processing Systems 29, pp. 4574–4582.
[8] (2017) An empirical characterization of IFTTT: ecosystem, usage, and performance. In Proceedings of the 2017 Internet Measurement Conference, IMC '17, New York, NY, USA, pp. 398–404.
[9] (2015) Language to code: learning semantic parsers for if-this-then-that recipes. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Beijing, China, pp. 878–888.
[10] (2016) Trigger-action programming in the wild: an analysis of 200,000 IFTTT recipes. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI '16, New York, NY, USA, pp. 3227–3231.
[11] (2018) Interactive semantic parsing for if-then recipes via hierarchical reinforcement learning. CoRR abs/1808.06740.
[12] (2017) A syntactic neural model for general-purpose code generation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vancouver, Canada, pp. 440–450.
[13] (2018) TRANX: a transition-based neural abstract syntax parser for semantic parsing and code generation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Brussels, Belgium, pp. 7–12.