components

package
v2.11.0

Warning: this package is not in the latest version of its module.
Published: Apr 22, 2024 License: Apache-2.0 Imports: 12 Imported by: 0

README

Component Integration

The components directory of the codebase hosts all the component-specific logic of the operator. Since the ODH operator is an integration point for deploying ODH component manifests, it is essential to have a common process for integrating new components.

Integrating a new component

To ensure a component is integrated seamlessly in the operator, follow the steps below:

Add Component to DataScienceCluster API spec

The DataScienceCluster CRD is responsible for defining the component fields and exposing them to end users. Add your component to its API spec:

type Components struct {
   NewComponent newcomponent.NewComponentName `json:"newcomponent,omitempty"`
}
Add Component module
  • Add a new module, <newComponent>, under the components/ directory to define the code specific to the new component. The existing component directories can serve as examples.
  • Define Path and ComponentName variables for the new component (a minimal sketch follows this list).
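For illustration, a minimal sketch of such a module; the newcomponent package name and the manifest path are assumptions, not the operator's actual values:

package newcomponent

import "path/filepath"

var (
	// ComponentName is the name used for status reporting, Prometheus rules,
	// and manifest lookup (hypothetical value).
	ComponentName = "newcomponent"
	// Path points to the component's default kustomize manifests; the
	// "/opt/manifests" base directory is an assumption made for this sketch.
	Path = filepath.Join("/opt/manifests", ComponentName, "base")
)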
Implement common Interface
  • Define a struct that embeds the shared Component struct, which provides the common fields.

  • Implement the interface methods for your component (a hypothetical sketch of such a struct follows the interface below):

    type ComponentInterface interface {
      ReconcileComponent(ctx context.Context, cli client.Client, logger logr.Logger, owner metav1.Object, DSCISpec *dsciv1.DSCInitializationSpec, currentComponentStatus bool) error
      Cleanup(cli client.Client, DSCISpec *dsciv1.DSCInitializationSpec) error
      GetComponentName() string
      GetManagementState() operatorv1.ManagementState
      OverrideManifests(platform string) error
      UpdatePrometheusConfig(cli client.Client, enable bool, component string) error
      ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger
    }
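A rough, hypothetical sketch of such a struct (import path assumed from the module layout); it embeds the shared Component, inheriting the common fields and their methods:

package newcomponent

import (
	"github.com/opendatahub-io/opendatahub-operator/v2/components"
)

// NewComponentName embeds the shared Component struct so that the common
// fields (ManagementState, DevFlags) and methods such as GetManagementState
// are inherited.
type NewComponentName struct {
	components.Component `json:""`
}

// GetComponentName returns the name used for status reporting and manifest lookup.
func (n *NewComponentName) GetComponentName() string {
	return ComponentName // defined alongside Path in the earlier sketch
}

ReconcileComponent and the other interface methods would be implemented on the same type.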
    
Add reconcile and Events
  • Once you set up the new component module, add the component to the DataScienceCluster Reconcile function in order to deploy its manifests (a simplified sketch follows this list).
  • This also enables status updates for the component in the operator.
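The controller wiring itself is not shown here; the following is only a simplified, hypothetical illustration of how a reconcile loop could iterate over registered components using the ComponentInterface above (import paths and helper names are assumptions):

package controllers

import (
	"context"
	"fmt"

	"github.com/go-logr/logr"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"

	dsciv1 "github.com/opendatahub-io/opendatahub-operator/v2/apis/dscinitialization/v1"
	"github.com/opendatahub-io/opendatahub-operator/v2/components"
)

// reconcileComponents is a sketch, not the operator's actual controller code.
// It reconciles every registered component and reports which one failed;
// "installed" carries the current per-component status.
func reconcileComponents(ctx context.Context, cli client.Client, logger logr.Logger,
	owner metav1.Object, dsciSpec *dsciv1.DSCInitializationSpec,
	comps []components.ComponentInterface, installed map[string]bool) error {
	for _, c := range comps {
		if err := c.ReconcileComponent(ctx, cli, logger, owner, dsciSpec, installed[c.GetComponentName()]); err != nil {
			return fmt.Errorf("failed to reconcile %s: %w", c.GetComponentName(), err)
		}
	}
	return nil
}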
Reconcile Workflow

[Image: Component Reconcile Workflow]

Add Unit and e2e tests
  • Components should add unit tests for any component-specific functions added to the codebase (a hypothetical example follows this list).
  • Components should update the e2e tests to capture the deployments introduced by the new component.
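A hypothetical unit-test sketch for a component-specific function such as GetComponentName (package path and expected value are placeholders):

package newcomponent_test

import (
	"testing"

	"github.com/opendatahub-io/opendatahub-operator/v2/components/newcomponent" // hypothetical module
)

// TestGetComponentName verifies that the component reports the expected name.
func TestGetComponentName(t *testing.T) {
	c := &newcomponent.NewComponentName{}
	if got := c.GetComponentName(); got != "newcomponent" {
		t.Errorf("GetComponentName() = %q, want %q", got, "newcomponent")
	}
}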

Integrated Components

Documentation

Overview

+groupName=datasciencecluster.opendatahub.io

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Component

type Component struct {
	// Set to one of the following values:
	//
	// - "Managed" : the operator is actively managing the component and trying to keep it active.
	//               It will only upgrade the component if it is safe to do so
	//
	// - "Removed" : the operator is actively managing the component and will not install it,
	//               or if it is installed, the operator will try to remove it
	//
	// +kubebuilder:validation:Enum=Managed;Removed
	ManagementState operatorv1.ManagementState `json:"managementState,omitempty"`

	// Add developer fields
	// +optional
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=2
	DevFlags *DevFlags `json:"devFlags,omitempty"`
}

Component struct defines the basis for each OpenDataHub component configuration. +kubebuilder:object:generate=true
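As a small usage sketch (values are examples only), a component is enabled or disabled through ManagementState:

package example

import (
	operatorv1 "github.com/openshift/api/operator/v1"

	"github.com/opendatahub-io/opendatahub-operator/v2/components"
)

// enabledComponent returns a Component marked as Managed; using
// operatorv1.Removed instead tells the operator to remove the component.
func enabledComponent() components.Component {
	return components.Component{
		ManagementState: operatorv1.Managed,
	}
}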

func (*Component) Cleanup added in v2.3.0

func (*Component) ConfigComponentLogger added in v2.10.0

func (c *Component) ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger

ConfigComponentLogger extends the original ConfigLoggers to include the component name.

func (*Component) DeepCopy added in v2.7.0

func (in *Component) DeepCopy() *Component

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new Component.

func (*Component) DeepCopyInto added in v2.7.0

func (in *Component) DeepCopyInto(out *Component)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

func (*Component) GetManagementState added in v2.2.0

func (c *Component) GetManagementState() operatorv1.ManagementState

func (*Component) UpdatePrometheusConfig added in v2.7.0

func (c *Component) UpdatePrometheusConfig(_ client.Client, enable bool, component string) error

UpdatePrometheusConfig updates prometheus-configs.yaml to include or exclude <component>.rules. The enable parameter, when set to true, adds the new rules; when set to false, it removes the existing rules.
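For example, a sketch (helper name and wiring are assumptions) of toggling a component's Prometheus rules based on its management state:

package example

import (
	operatorv1 "github.com/openshift/api/operator/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"

	"github.com/opendatahub-io/opendatahub-operator/v2/components"
)

// syncPrometheusRules adds the component's <name>.rules when it is Managed
// and removes them otherwise; error handling is left to the caller.
func syncPrometheusRules(cli client.Client, c *components.Component, name string) error {
	enable := c.GetManagementState() == operatorv1.Managed
	return c.UpdatePrometheusConfig(cli, enable, name)
}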

type ComponentInterface

type ComponentInterface interface {
	ReconcileComponent(ctx context.Context, cli client.Client, logger logr.Logger,
		owner metav1.Object, DSCISpec *dsciv1.DSCInitializationSpec, currentComponentStatus bool) error
	Cleanup(cli client.Client, DSCISpec *dsciv1.DSCInitializationSpec) error
	GetComponentName() string
	GetManagementState() operatorv1.ManagementState
	OverrideManifests(platform string) error
	UpdatePrometheusConfig(cli client.Client, enable bool, component string) error
	ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger
}

type DevFlags added in v2.2.0

type DevFlags struct {
	// List of custom manifests for the given component
	// +optional
	Manifests []ManifestsConfig `json:"manifests,omitempty"`
}

DevFlags defines a list of fields that developers can use to test customizations. Using them in a production environment is not recommended. +kubebuilder:object:generate=true
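As an illustrative sketch (URI and paths are placeholders), DevFlags can point a component at custom manifests:

package example

import "github.com/opendatahub-io/opendatahub-operator/v2/components"

// devFlagsFromBranch builds DevFlags that pull a component's manifests from a
// custom branch tarball; all values here are placeholders.
func devFlagsFromBranch() *components.DevFlags {
	return &components.DevFlags{
		Manifests: []components.ManifestsConfig{
			{
				URI:        "https://github.com/org/repo/tarball/my-branch",
				ContextDir: "config",
				SourcePath: "overlays/dev",
			},
		},
	}
}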

func (*DevFlags) DeepCopy added in v2.7.0

func (in *DevFlags) DeepCopy() *DevFlags

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new DevFlags.

func (*DevFlags) DeepCopyInto added in v2.7.0

func (in *DevFlags) DeepCopyInto(out *DevFlags)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type ManifestsConfig added in v2.2.0

type ManifestsConfig struct {
	// uri is the URI pointing to a git repo with a tag/branch. e.g. https://github.com/org/repo/tarball/<tag/branch>
	// +optional
	// +kubebuilder:default:=""
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=1
	URI string `json:"uri,omitempty"`

	// contextDir is the relative path to the folder containing manifests in a repository
	// +optional
	// +kubebuilder:default:=""
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=2
	ContextDir string `json:"contextDir,omitempty"`

	// sourcePath is the subpath within contextDir where kustomize builds start. Examples include any sub-folder or path: `base`, `overlays/dev`, `default`, `odh` etc.
	// +optional
	// +kubebuilder:default:=""
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=3
	SourcePath string `json:"sourcePath,omitempty"`
}

Directories

Path Synopsis
Package codeflare provides utility functions to config CodeFlare as part of the stack which makes managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists +groupName=datasciencecluster.opendatahub.io
Package dashboard provides utility functions to config Open Data Hub Dashboard: A web dashboard that displays installed Open Data Hub components with easy access to component UIs and documentation +groupName=datasciencecluster.opendatahub.io
Package datasciencepipelines provides utility functions to config Data Science Pipelines: Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Tekton +groupName=datasciencecluster.opendatahub.io
Package kserve provides utility functions to config Kserve as the Controller for serving ML models on arbitrary frameworks +groupName=datasciencecluster.opendatahub.io
+groupName=datasciencecluster.opendatahub.io
Package modelmeshserving provides utility functions to config ModelMesh, a general-purpose model serving management/routing layer +groupName=datasciencecluster.opendatahub.io
Package modelregistry provides utility functions to config ModelRegistry, an ML Model metadata repository service +groupName=datasciencecluster.opendatahub.io
Package ray provides utility functions to config Ray as part of the stack which makes managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists +groupName=datasciencecluster.opendatahub.io
Package trainingoperator provides utility functions to config trainingoperator as part of the stack which makes managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists
Package trustyai provides utility functions to config TrustyAI, a bias/fairness and explainability toolkit +groupName=datasciencecluster.opendatahub.io
Package workbenches provides utility functions to config Workbenches to secure Jupyter Notebook in Kubernetes environments with support for OAuth +groupName=datasciencecluster.opendatahub.io
