Hey all! I am using Jupyter Notebook in a containerized environment where I can mount secrets as environment variables and use the exposed key/values in my notebook. When I want to run the same code as a pipeline, I need to define the same environment variables again in the Elyra configuration. I totally understand that the pipeline I create has a life of its own once I submit it: it becomes another Kubernetes object (Argo Workflows or Tekton takes it and runs it), which makes sense, because then I can run my pipelines without having to run my notebook. But I was wondering if there is a way to reuse the same environment variables without having to define them in the Elyra config again, e.g. by fetching the notebook's variables (see the sketch below). Or does it make sense to add such a feature? ☺️
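To make the situation concrete, here is a minimal sketch of how the notebook consumes the mounted secrets today; the variable names are hypothetical and just for illustration:

```python
import os

# These environment variables are injected into the notebook container
# from a mounted Kubernetes Secret (names are made up for this example).
db_user = os.environ["DB_USERNAME"]
db_password = os.environ["DB_PASSWORD"]

# The notebook code can use them directly while running in the container...
print(f"Connecting as {db_user}")

# ...but when the same notebook runs as an Elyra pipeline node, the container
# launched by Argo Workflows / Tekton does not see these variables, so today
# they have to be re-entered in the node's environment variable properties.
```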