Change group to "datashim.io" from "com.ie.ibm.hpsys" #308

Merged
Commits
23 commits
3aef31e
creating new branch for new group name
srikumar003 Jun 14, 2022
2f1df8b
initial support for s3 folders
srikumar003 Jun 25, 2023
db4b544
Merge branch 'master' into 200_support_s3_folders
srikumar003 Jun 26, 2023
012ce56
Merge branch 'master' into 178-migrate-crd-to-a-new-group
srikumar003 Aug 31, 2023
95525cb
changing CRD group to datashim.io
srikumar003 Sep 18, 2023
f3d060e
adding older group name for debugging
srikumar003 Jan 5, 2024
fe9a5eb
Merge branch 'master' into 200_support_s3_folders
srikumar003 Jan 5, 2024
a1b3303
adding support for building local component containers with podman
srikumar003 Jan 21, 2024
1f751b1
bug fixes in build bash scripts
srikumar003 Jan 21, 2024
d0826f2
bug fixes in build bash scripts
srikumar003 Jan 21, 2024
a04271c
bug fixes in build bash scripts
srikumar003 Jan 21, 2024
33b57c4
bug fixes in build bash scripts
srikumar003 Jan 21, 2024
9b44e69
bug fixes in build bash scripts
srikumar003 Jan 22, 2024
da3fc12
Merge branch 'master' into 200_support_s3_folders
srikumar003 Jan 22, 2024
9821af9
Merge branch '200_support_s3_folders' into 178-migrate-crd-to-a-new-g…
srikumar003 Jan 22, 2024
23acfc2
bug fixes in build bash scripts
srikumar003 Jan 22, 2024
0238cc6
Merge branch '200_support_s3_folders' into 178-migrate-crd-to-a-new-g…
srikumar003 Jan 22, 2024
0835c55
docker buildx load after build
srikumar003 Jan 22, 2024
8170af0
Merge branch '200_support_s3_folders' into 178-migrate-crd-to-a-new-g…
srikumar003 Jan 22, 2024
71b4089
docker buildx load after build
srikumar003 Jan 22, 2024
f7280f1
Merge branch '200_support_s3_folders' into 178-migrate-crd-to-a-new-g…
srikumar003 Jan 22, 2024
55d1ab0
Merge branch 'master' into 178-migrate-crd-to-a-new-group
srikumar003 Jan 23, 2024
8764225
Update README.md
srikumar003 Jan 23, 2024
2 changes: 1 addition & 1 deletion .github/workflows/testing.yml
Original file line number Diff line number Diff line change
@@ -43,7 +43,7 @@ jobs:
- name: Create sample Dataset
run: |
cat <<EOF | kubectl apply -f -
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
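The hunk above shows only the changed `apiVersion` line; for context, a complete Dataset manifest under the new group might look roughly like the following sketch. The endpoint, bucket, and credential values are placeholders, and the field layout follows the project's S3 examples:

```shell
# Write a full example Dataset using the new group name (all values are placeholders)
cat <<EOF > example-dataset.yaml
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
  name: example-dataset
spec:
  local:
    type: "COS"
    accessKeyID: "ACCESS_KEY_ID"
    secretAccessKey: "SECRET_ACCESS_KEY"
    endpoint: "https://s3.example.com"
    bucket: "my-bucket"
    region: ""
EOF
# On a real cluster you would then run: kubectl apply -f example-dataset.yaml
grep 'apiVersion' example-dataset.yaml
```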
6 changes: 6 additions & 0 deletions .gitignore
@@ -7,3 +7,9 @@ _tmp
*s3-secret.yaml
.vscode/*
bin/*
chart/Chart.lock
chart/values.yaml.orig
example-dummy.yaml
example-pod.yaml
src/dataset-operator/testing/csi_test_suite.go
src/dataset-operator/testing/csi_test_utils.go
22 changes: 18 additions & 4 deletions README.md
@@ -13,6 +13,12 @@
A Kubernetes Framework to provide easy access to S3 and NFS **Datasets** within pods. Orchestrates the provisioning of
**Persistent Volume Claims** and **ConfigMaps** needed for each **Dataset**. Find more details in our [FAQ](https://datashim-io.github.io/datashim/FAQ/)

## Alert (23 Jan 2024) - Group Name Change

__If you have an existing installation of Datashim, please DO NOT follow the instructions below to upgrade it to version `0.4.0` or later__. The group name of the Dataset and DatasetInternal CRDs is changing from `com.ie.ibm.hpsys` to `datashim.io`. An in-place upgrade will invalidate your existing Dataset definitions and cause problems in your installation. You can upgrade up to version `0.3.2` without any problems.

To upgrade to `0.4.0` and beyond: a) safely delete all Datasets; b) uninstall Datashim; and c) reinstall Datashim, either through Helm or using the manifest file, as described below.
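For step (a), a minimal sketch of backing up Dataset manifests and rewriting them to the new group before re-applying them after reinstall. The file names and `sed` pattern are illustrative, not part of the official procedure; the cluster-side commands are shown only as comments:

```shell
# On a real cluster you would first export and delete the Datasets:
#   kubectl get datasets --all-namespaces -o yaml > datasets-backup.yaml
#   kubectl delete datasets --all --all-namespaces
# For illustration, create a stand-in backup file containing the old group name:
printf 'apiVersion: com.ie.ibm.hpsys/v1alpha1\nkind: Dataset\n' > datasets-backup.yaml
# Rewrite the old CRD group to the new one before re-applying after reinstall:
sed 's#apiVersion: com\.ie\.ibm\.hpsys/#apiVersion: datashim.io/#' \
    datasets-backup.yaml > datasets-migrated.yaml
cat datasets-migrated.yaml
```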

## Quickstart

First, create the namespace for installing Datashim, if not present
@@ -56,7 +62,7 @@ and populate it with data._
We will now create a Dataset named `example-dataset` pointing to your S3 bucket.
```yaml
cat <<EOF | kubectl apply -f -
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
@@ -112,18 +118,26 @@ To install, search for the latest stable release
```bash
helm search repo datashim
```
which will result in:
```
NAME CHART VERSION APP VERSION DESCRIPTION
datashim/datashim-charts 0.3.2 0.3.2 Datashim chart
```

__Note:__ Version `0.3.2` still uses `com.ie.ibm.hpsys` as the apiGroup name, so please proceed with caution. It is fine for upgrading an existing Datashim installation, but going forward the apiGroup will be `datashim.io`.


Pass the option to create namespace, if you are installing Datashim for the first time:
```bash
helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts
helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --version <version_string>
```
Do not forget to label the target namespace to support pod labels, as shown in the previous section.

### Uninstalling through Helm

To uninstall, use `helm uninstall` like so:
```bash
helm uninstall -n datashim datashim
helm uninstall -n dlf datashim
```

### Installing intermediate releases
@@ -136,7 +150,7 @@ helm search repo datashim --devel

To install an intermediate version,
```bash
helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --devel
helm install --namespace=dlf --create-namespace datashim datashim/datashim-charts --devel --version <version_name>
```

## Questions
2 changes: 1 addition & 1 deletion docs/Archive-based-Datasets.md
@@ -49,7 +49,7 @@ minio-7979c89d5c-khncd 0/1 Running 0 3m
Now we can create a Dataset based on a remote archive as follows:
```yaml
cat <<EOF | kubectl apply -f -
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/Ceph-Caching.md
@@ -86,7 +86,7 @@ dataset-operator-7b8f65f7d4-hg8n5 1/1 Running 0 6s
```
Create an s3 dataset by replacing the values and invoking `kubectl create -f my-dataset.yaml`
``` yaml
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/index.md
@@ -50,7 +50,7 @@ and populate it with data._
We will now create a Dataset named `example-dataset` pointing to your S3 bucket.
```yaml
cat <<EOF | kubectl apply -f -
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion docs/kubeflow/Data-Volumes-for-Notebook-Servers.md
@@ -12,7 +12,7 @@ In this guide, we assume that your data are already stored in a remote s3 bucket
Let's assume that you will launch your notebook server on the namespace `{my-namespace}`

``` yaml
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
2 changes: 1 addition & 1 deletion docs/kubeflow/Model-Storing-and-Serving-with-DLF.md
@@ -31,7 +31,7 @@ Otherwise follow the instructions in [Configure IBM COS Storage](https://github.

Now we need to create a dataset to point to the newly created bucket. Create a file that looks like this:
``` yaml
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
4 changes: 2 additions & 2 deletions docs/kubeflow/PVCs-for-Pipelines-SDK.md
@@ -13,7 +13,7 @@ We will just show how you can adopt the examples located in [contrib/volume_ops](http

First you need to create a Dataset to point to the bucket you want to use. Create a file that looks like this:
``` yaml
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: your-dataset
@@ -138,7 +138,7 @@ def volume_op_dag():
def get_dataset_yaml(name,accessKey,secretAccessKey,endpoint,bucket,region):
print(region)
dataset_spec = f"""
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: {name}
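The `get_dataset_yaml` helper above templates a Dataset spec from Python variables; a rough shell analogue using a here-document looks like this (the names and values are illustrative only):

```shell
# Template a Dataset manifest from shell variables, mirroring get_dataset_yaml
name="your-dataset"; endpoint="https://s3.example.com"; bucket="my-bucket"; region=""
cat <<EOF > "${name}.yaml"
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
  name: ${name}
spec:
  local:
    type: "COS"
    endpoint: "${endpoint}"
    bucket: "${bucket}"
    region: "${region}"
EOF
# Show the rendered metadata for a quick sanity check
grep 'name:' "${name}.yaml"
```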
2 changes: 1 addition & 1 deletion examples/hive/sampleapp/bookdataset.yaml
@@ -1,5 +1,5 @@
---
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: bookds
2 changes: 1 addition & 1 deletion examples/kubeflow/volumeop.py
@@ -62,7 +62,7 @@ def volume_op_dag():
def get_dataset_yaml(name,accessKey,secretAccessKey,endpoint,bucket,region):
print(region)
dataset_spec = f"""
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: {name}
2 changes: 1 addition & 1 deletion examples/scheduling/README.md
@@ -113,7 +113,7 @@ First, let's create a dataset
```
$ cat <<EOF | kubectl apply -f -
---
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-archive.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-host.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-nfs.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3-provision.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3-secrets.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion examples/templates/example-dataset-s3.yaml
@@ -1,4 +1,4 @@
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
2 changes: 1 addition & 1 deletion plugins/ceph-cache-plugin/deploy/role.yaml
@@ -67,7 +67,7 @@ rules:
verbs:
- get
- apiGroups:
- com.ie.ibm.hpsys
- datashim.io
resources:
- '*'
verbs:
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-ibm-k8s.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasetsinternal.com.ie.ibm.hpsys
name: datasetsinternal.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasets.com.ie.ibm.hpsys
name: datasets.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
- com.ie.ibm.hpsys
- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-ibm-oc.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasetsinternal.com.ie.ibm.hpsys
name: datasetsinternal.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasets.com.ie.ibm.hpsys
name: datasets.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
- com.ie.ibm.hpsys
- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf-oc.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasetsinternal.com.ie.ibm.hpsys
name: datasetsinternal.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasets.com.ie.ibm.hpsys
name: datasets.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
- com.ie.ibm.hpsys
- datashim.io
resources:
- '*'
- datasetsinternal
10 changes: 5 additions & 5 deletions release-tools/manifests/dlf.yaml
@@ -160,9 +160,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasetsinternal.com.ie.ibm.hpsys
name: datasetsinternal.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: DatasetInternal
listKind: DatasetInternalList
@@ -257,9 +257,9 @@ metadata:
annotations:
controller-gen.kubebuilder.io/version: v0.8.0
creationTimestamp: null
name: datasets.com.ie.ibm.hpsys
name: datasets.datashim.io
spec:
group: com.ie.ibm.hpsys
group: datashim.io
names:
kind: Dataset
listKind: DatasetList
@@ -565,7 +565,7 @@ rules:
verbs:
- get
- apiGroups:
- com.ie.ibm.hpsys
- datashim.io
resources:
- '*'
- datasetsinternal

Some generated files are not rendered by default.
4 changes: 2 additions & 2 deletions src/dataset-operator/README.md
@@ -7,7 +7,7 @@ to a k8s cluster and allows you to create objects like the example
in [dataset_cr](deploy/crds/com_v1alpha1_dataset_cr.yaml)

```
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset
@@ -47,7 +47,7 @@ dataset-operator-644f8d854-dct95 1/1 Running 0 15s

Now you can do `kubectl create -f dataset.yaml` with a yaml which looks like this:
```
apiVersion: com.ie.ibm.hpsys/v1alpha1
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
name: example-dataset