Change group to "datashim.io" from "com.ie.ibm.hpsys" (#308)
* creating new branch for new group name

Signed-off-by: Srikumar Venugopal <srikumarv@ie.ibm.com>

* initial support for s3 folders

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* changing CRD group to datashim.io

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* adding older group name for debugging

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* adding support for building local component containers with podman

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* bug fixes in build bash scripts

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* docker buildx load after build

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* docker buildx load after build

Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>

* Update README.md

Update README to incorporate group name change

---------

Signed-off-by: Srikumar Venugopal <srikumarv@ie.ibm.com>
Signed-off-by: SRIKUMAR VENUGOPAL <srikumarv@ie.ibm.com>
srikumar003 committed Jan 23, 2024
1 parent 37f7783 commit f590a7f
Showing 35 changed files with 91 additions and 70 deletions.
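For anyone carrying local manifests or GitOps repositories that still reference the old group, a rough migration sketch (assumes GNU sed and a `manifests/` directory as a placeholder; review the resulting diff before applying anything to a cluster):

```bash
# Rewrite the old CRD group to the new one across local YAML files
grep -rl 'com.ie.ibm.hpsys' manifests/ \
  | xargs sed -i 's/com\.ie\.ibm\.hpsys/datashim.io/g'
```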
2 changes: 1 addition & 1 deletion .github/workflows/testing.yml
@@ -43,7 +43,7 @@ jobs:
- name: Create sample Dataset
run: |
cat <<EOF | kubectl apply -f -
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
6 changes: 6 additions & 0 deletions .gitignore
@@ -7,3 +7,9 @@ _tmp
*s3-secret.yaml
.vscode/*
bin/*
chart/Chart.lock
chart/values.yaml.orig
example-dummy.yaml
example-pod.yaml
src/dataset-operator/testing/csi_test_suite.go
src/dataset-operator/testing/csi_test_utils.go
22 changes: 18 additions & 4 deletions README.md
@@ -13,6 +13,12 @@
A Kubernetes Framework to provide easy access to S3 and NFS **Datasets** within pods. Orchestrates the provisioning of
**Persistent Volume Claims** and **ConfigMaps** needed for each **Dataset**. Find more details in our [FAQ](https://datashim-io.github.io/datashim/FAQ/)

## Alert (23 Jan 2024) - Group Name Change

__If you have an existing installation of Datashim, please DO NOT follow the instructions below to upgrade it to version `0.4.0` or later__. The group name of the Dataset and DatasetInternal CRDs (objects) is changing from `com.ie.ibm.hpsys` to `datashim.io`. An in-place upgrade will invalidate your Dataset definitions and break your installation. You can upgrade up to version `0.3.2` without any problems.

To upgrade to `0.4.0` or later: a) safely delete all Datasets; b) uninstall Datashim; and c) reinstall Datashim, either through Helm or using the manifest file, as described below.
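A minimal sketch of that sequence, assuming a Helm installation in the `dlf` namespace and that you back up your Dataset definitions first (the old resource name is taken from the previous CRD; adapt to your setup):

```bash
# Back up, then remove, Datasets registered under the old group
kubectl get datasets.com.ie.ibm.hpsys -A -o yaml > datasets-backup.yaml
kubectl delete datasets.com.ie.ibm.hpsys --all --all-namespaces

# Remove the old installation
helm uninstall -n dlf datashim

# Reinstall 0.4.0 or later (Helm route shown; the manifest route also works)
helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts
```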

## Quickstart

First, create the namespace for installing Datashim, if not present
@@ -56,7 +62,7 @@ and populate it with data._
We will create now a Dataset named `example-dataset` pointing to your S3 bucket.
```yaml
cat <<EOF | kubectl apply -f -
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
@@ -112,18 +118,26 @@ To install, search for the latest stable release
```bash
helm search repo datashim
```
which will result in:
```
NAME CHART VERSION APP VERSION DESCRIPTION
datashim/datashim-charts 0.3.2 0.3.2 Datashim chart
```

__Note:__ Version `0.3.2` still uses `com.ie.ibm.hpsys` as the apiGroup name, so please proceed with caution. It is fine for upgrading an existing Datashim installation, but from `0.4.0` onwards the apiGroup will be `datashim.io`.
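If you are not sure which group an existing installation serves, a quick check (CRD names assumed from the manifests in this commit):

```bash
# Which Dataset CRDs are registered, and under which group?
kubectl get crd | grep -E 'datashim\.io|com\.ie\.ibm\.hpsys'

# Resources served by the new group, if it is installed
kubectl api-resources --api-group=datashim.io
```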


Pass the option to create namespace, if you are installing Datashim for the first time:
```bash
-helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts
+helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --version <version_string>
```
Do not forget to label the target namespace to support pod labels, as shown in the previous section.
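For reference, the labelling step in a typical Datashim quickstart looks roughly like this (label key assumed from the standard setup; use whatever the Quickstart section of this README specifies):

```bash
# Let the Datashim webhook mount Datasets into labelled pods in this namespace
kubectl label namespace dlf monitor-pods-datasets=enabled
```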

### Uninstalling through Helm

To uninstall, use `helm uninstall` like so:
```bash
-helm uninstall -n datashim datashim
+helm uninstall -n dlf datashim
```

### Installing intermediate releases
@@ -136,7 +150,7 @@ helm search repo datashim --devel

To install an intermediate version,
```bash
-helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --devel
+helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --devel --version <version_name>
```

## Questions
2 changes: 1 addition & 1 deletion docs/Archive-based-Datasets.md
@@ -49,7 +49,7 @@ minio-7979c89d5c-khncd 0/1 Running 0 3m
Now we can create a Dataset based on a remote archive as follows:
```yaml
cat <<EOF | kubectl apply -f -
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/Ceph-Caching.md
@@ -86,7 +86,7 @@ dataset-operator-7b8f65f7d4-hg8n5 1/1 Running 0 6s
```
Create an s3 dataset by replacing the values and invoking `kubectl create -f my-dataset.yaml`
``` yaml
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/index.md
@@ -50,7 +50,7 @@ and populate it with data._
We will create now a Dataset named `example-dataset` pointing to your S3 bucket.
```yaml
cat <<EOF | kubectl apply -f -
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/kubeflow/Data-Volumes-for-Notebook-Servers.md
@@ -12,7 +12,7 @@ In this guide, we assume that your data are already stored in a remote s3 bucket
Let's assume that you will launch your notebook server on the namespace `{my-namespace}`

``` yaml
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
2 changes: 1 addition & 1 deletion docs/kubeflow/Model-Storing-and-Serving-with-DLF.md
@@ -31,7 +31,7 @@ Otherwise follow the instructions in [Configure IBM COS Storage](https://github.

Now we need to create a dataset to point to the newly created bucket. Create a file that looks like this:
``` yaml
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
4 changes: 2 additions & 2 deletions docs/kubeflow/PVCs-for-Pipelines-SDK.md
@@ -13,7 +13,7 @@ We will just how you can adopt the examples located in [contrib/volume_ops](http

First you need to create a Dataset to point to the bucket you want to use. Create a file that looks like this:
``` yaml
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
@@ -138,7 +152,7 @@ def volume_op_dag():
def get_dataset_yaml(name,accessKey,secretAccessKey,endpoint,bucket,region):
print(region)
dataset_spec = f"""
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: {name}
2 changes: 1 addition & 1 deletion examples/hive/sampleapp/bookdataset.yaml
@@ -1,5 +1,5 @@
---
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: bookds
2 changes: 1 addition & 1 deletion examples/kubeflow/volumeop.py
@@ -62,7 +62,7 @@ def volume_op_dag():
def get_dataset_yaml(name,accessKey,secretAccessKey,endpoint,bucket,region):
print(region)
dataset_spec = f"""
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: {name}
2 changes: 1 addition & 1 deletion examples/scheduling/README.md
@@ -113,7 +113,7 @@ First lets create a dataset
```
$ cat <<EOF | kubectl apply -f -
---
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-archive.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-host.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-nfs.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3-provision.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3-secrets.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3.yaml
@@ -1,4 +1,4 @@
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
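Putting the new group together with the S3 fields used across these examples, a complete Dataset manifest would look roughly like the following sketch (all credential, endpoint, and bucket values are placeholders, not the contents of the file above):

```bash
cat <<EOF | kubectl apply -f -
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
  name: example-dataset
spec:
  local:
    type: "COS"
    accessKeyID: "ACCESS_KEY_ID"
    secretAccessKey: "SECRET_ACCESS_KEY"
    endpoint: "https://s3.example.com"
    bucket: "my-bucket"
    region: ""   # optional
EOF
```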
2 changes: 1 addition & 1 deletion plugins/ceph-cache-plugin/deploy/role.yaml
@@ -67,7 +67,7 @@ rules:
verbs:
- get
- apiGroups:
-- com.ie.ibm.hpsys
+- datashim.io
resources:
- '*'
verbs:
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-ibm-k8s.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasetsinternal.com.ie.ibm.hpsys
+name: datasetsinternal.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasets.com.ie.ibm.hpsys
+name: datasets.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
-- com.ie.ibm.hpsys
+- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-ibm-oc.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasetsinternal.com.ie.ibm.hpsys
+name: datasetsinternal.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasets.com.ie.ibm.hpsys
+name: datasets.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
-- com.ie.ibm.hpsys
+- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-oc.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasetsinternal.com.ie.ibm.hpsys
+name: datasetsinternal.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasets.com.ie.ibm.hpsys
+name: datasets.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
-- com.ie.ibm.hpsys
+- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasetsinternal.com.ie.ibm.hpsys
+name: datasetsinternal.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
-name: datasets.com.ie.ibm.hpsys
+name: datasets.datashim.io
spec:
-group: com.ie.ibm.hpsys
+group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
-- com.ie.ibm.hpsys
+- datashim.io
resources:
- '*'
- datasetsinternal
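After reinstalling from the updated manifests, a quick way to confirm the cluster is serving the new group (a sketch; object names taken from the CRDs defined above):

```bash
# Both CRDs should now be registered under datashim.io
kubectl get crd datasets.datashim.io datasetsinternal.datashim.io

# Datasets should be listable under the new group
kubectl get datasets.datashim.io -A
```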

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions src/dataset-operator/README.md
@@ -7,7 +7,7 @@ to a k8s cluster and allows you to create objects like the example
in [dataset_cr](deploy/crds/com_v1alpha1_dataset_cr.yaml)

```
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
@@ -47,7 +47,7 @@ dataset-operator-644f8d854-dct95 1/1 Running 0 15s

Now you can do `kubectl create -f dataset.yaml` with a yaml which looks like this:
```
-apiVersion: com.ie.ibm.hpsys/v1alpha1
+apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset