fission-workflows

module v0.0.0-...-98ba599
Published: Jul 2, 2020 License: Apache-2.0

README

Fission Workflows

fission.io @fissionio

Fission Workflows is a workflow-based serverless function composition framework built on top of the Fission Function-as-a-Service (FaaS) platform.

⚠ Fission Workflows is currently in maintenance mode due to time constraints of the core Fission team. If you are interested in contributing to or helping maintain this project, contact the Fission team on the Fission Slack. ⚠

Highlights
  • Fault-Tolerant: The Fission Workflows engine keeps track of state, retries, error handling, and so on. By using event sourcing internally, the engine can recover from failures and continue exactly where it left off.
  • Scalable: Other than a backing data store, the workflow system is stateless and can easily be scaled. The independent nature of workflows allows for relatively straightforward sharding of workloads over multiple workflow engines.
  • High Performance: In contrast to existing workflow engines targeting other domains, Fission Workflows is designed from the ground up for low-overhead, low-latency workflow execution.
  • Extensible: All main aspects of the engine are extensible. For example, you can even define your own control flow constructs.
  • Lightweight: Requiring only a single data store (NATS Streaming) and a FaaS platform (Fission), the engine consumes minimal resources.

Philosophy

The Fission Function-as-a-Service (FaaS) framework provides simplicity and quick time-to-value for functions on any infrastructure running Kubernetes.

Functions tend to do one logically separate task, and they're usually short-lived. For many relatively simple applications this is good enough. But a more complex application that uses serverless functions must, in some way, compose functions together.

There are several ways to compose functions. A function could invoke another function using the Fission API or HTTP. But this requires the calling function to handle serialization, networking, etc.

You could also set up functions to be invoked using message queue topics. This requires less boilerplate within each function, but the structure of the application is not explicit; dependencies are buried inside mappings of message queue topics to functions.

In addition, both these approaches are operationally difficult, in terms of error analysis, performance debugging, upgrades, etc.

Workflows have been popular in other domains, such as data processing and scientific applications, and were recently introduced to the serverless paradigm by AWS Step Functions and Azure Logic Apps.

Fission Workflows is an open-source alternative to these workflow systems. It allows users to compose Fission functions in powerful ways. Users can define their own control flow constructs, which in most other workflow engines are part of the internal API.

Concepts

Workflows can generally be represented as a Directed Acyclic Graph (DAG). Consider the following example of a common pattern, the diamond-shaped workflow:

[Figure: Workflow Example]

In this graph there is a single starting task A, a scatter task B triggering parallel execution of two branches with tasks C and D, followed by a synchronizing task E collecting the outputs of the parallel tasks.

Finally the graph concludes once final task F completes.
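
Expressed in the workflow definition format introduced below, such a diamond could be sketched roughly as follows (the task names and the Fission functions they run are hypothetical):

apiVersion: 1
output: F
tasks:
  A:
    run: funcA
  B:
    run: funcB
    requires:
    - A
  C:
    run: funcC
    requires:
    - B
  D:
    run: funcD
    requires:
    - B
  E:
    # Synchronizes the parallel branches by requiring both C and D.
    run: funcE
    requires:
    - C
    - D
  F:
    run: funcF
    requires:
    - E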

Although Fission Workflows offers more functionality, such as conditional branches and advanced control flow options, it fundamentally executes a dependency graph. The workflow definition below, which is used in the Usage section later in this README, composes two Fission functions:

apiVersion: 1
output: WhaleWithFortune
tasks:
  GenerateFortune:
    run: fortune
    inputs: "{$.Invocation.Inputs.default}"

  WhaleWithFortune:
    run: whalesay
    inputs: "{$.Tasks.GenerateFortune.Output}"
    requires:
    - GenerateFortune

A Task (here also called a function) is an atomic step, the 'building block' of a workflow.

Currently there are two options for executing tasks. First, Fission is used as the main function execution runtime, with Fission functions serving as tasks. Second, for very small tasks, such as control flow constructs, internal functions execute within the workflow engine itself to minimize network overhead.
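
As a rough sketch of the difference (a fragment, not a complete workflow definition): both kinds of tasks are declared in the same way, only the run target differs. The internal function name noop used here is an assumption; see the docs for the actual set of built-in functions.

tasks:
  InvokeFortune:
    # Executed by the Fission FaaS runtime as a regular Fission function
    # (the fortune function created in the Usage section below).
    run: fortune
  PassThrough:
    # Executed inside the workflow engine itself; 'noop' is an assumed
    # built-in internal function name.
    run: noop
    inputs: "{$.Tasks.InvokeFortune.Output}"
    requires:
    - InvokeFortune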

A workflow execution is called a (Workflow) Invocation. The Fission Workflows engine assigns each invocation a UID and stores it persistently in the data store. This allows users to reference the invocation during and after execution, for example to view its progress.

Finally, selectors and data transformations are inline functions that you can use to manipulate data without creating a separate task for it. These inline functions cover commonly used transformations, such as getting the length of an array or string. Additionally, selectors allow users to pass through only certain properties of the data. In the example workflow above, the JSONPath-like selector {$.Invocation.Inputs.default} selects the default input of the invocation.
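
As a sketch only (the exact inline-expression syntax for transformations is an assumption; see the docs), the selector above could be extended to pass the length of the generated fortune rather than the fortune itself:

  WhaleWithFortune:
    run: whalesay
    # Hypothetical transformation: pass the length of GenerateFortune's output;
    # the exact expression syntax is an assumption.
    inputs: "{$.Tasks.GenerateFortune.Output.length}"
    requires:
    - GenerateFortune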

See the docs for a more extensive, in-depth overview of the system.

Usage
#
# Add the binary environment and create two test functions on your Fission setup
#
fission env create --name binary --image fission/binary-env
fission function create --name whalesay --env binary --deploy examples/whales/whalesay.sh
fission function create --name fortune --env binary --deploy examples/whales/fortune.sh

#
# Create a workflow that uses those two functions; a workflow
# is just a function that uses the special "workflow" environment.
#
fission function create --name fortunewhale --env workflow --src examples/whales/fortunewhale.wf.yaml

#
# Map an HTTP GET to your new workflow function
#
fission route create --method GET --url /fortunewhale --function fortunewhale

#
# Invoke the workflow with an HTTP request
#
curl $FISSION_ROUTER/fortunewhale

See the examples directory for more example workflows.

Installation

See the installation guide.

Compiling

See the compilation guide.

Status and Roadmap

This is an early release for community participation and user feedback. It is under active development; none of the interfaces are stable yet. It should not yet be used in production!

Contributions are welcome in whatever form, including testing, examples, use cases, or new features. For an overview of current issues, check out the roadmap or browse the open issues on GitHub.

Finally, we're looking for early developer feedback -- if you do use Fission or Fission Workflows, we'd love to hear how it's working for you, what parts in particular you'd like to see improved, and so on.

Talk to us on Slack or Twitter.

Directories

cmd
pkg
  api
  api/events: Code generated by hack/codegen-events.py.
  api/store: package store provides typed, centralized access to the event-sourced workflow and invocation models.
  apiserver: Package apiserver contains all request handlers for gRPC and HTTP servers.
  apiserver/httpclient: Package httpclient is a lightweight implementation of a client for the HTTP gateway.
  fes: Package fes is a generated protocol buffer package.
  fes/backend/mem: package mem contains an implementation of the fes backend using an in-memory cache.
  fes/testutil: Package testutil is a generated protocol buffer package.
  fnenv: Package fnenv provides interfaces to consistently communicate with 'function runtime environments' (fnenvs).
  fnenv/mock: Package mock contains a minimal, mocked implementation of a fnenv for test purposes.
  fnenv/native: Note: package is called 'native' because 'internal' is not an allowed package name.
  fnenv/workflows: package workflows exposes the workflow engine itself as a function environment to improve recursion.
  scheduler: Package scheduler is a generated protocol buffer package.
  types: Package types is a generated protocol buffer package.
  types/typedvalues: package typedvalues provides a data container for annotating, interpreting, and transferring arbitrary data.
  types/typedvalues/controlflow: Package controlflow adds support for workflows and tasks (together "flows") to TypedValues.
  types/typedvalues/httpconv: package httpconv provides methods for mapping TypedValues to and from HTTP requests and responses.
  types/validate: Package validate contains validation functions for the common structures used in the workflow engine, such as Workflows, Tasks, WorkflowInvocations, etc.
  util/gopool: package gopool provides functionality for bounded parallelism with goroutines.
  util/labels: Package labels provides storing, fetching and matching based on labels.
  util/mediatype: Package mediatype implements the IANA Media Type standard.
  util/pubsub: Package pubsub is a simple, label-based, thread-safe PubSub implementation.
  util/workqueue: package workqueue is an amended copy of k8s' workqueue implementation; changes made: workqueue.go: added MaxSize field to the default workqueue.
  version: Code generated by hack/codegen-version.sh.
test
