Discussion - 0.12.0 Change to Column handling and the downstream impacts #259

Closed
tdarwin opened this issue Jan 5, 2023 · 1 comment

tdarwin (Contributor) commented Jan 5, 2023

Change being discussed

Prior to version 0.12.0, the Terraform provider handled columns that already existed in a dataset by importing them under the covers and then operating on them. In 0.12.0 the process changed: the provider now receives a 409 error from the Honeycomb API and fails because of that error.

Issue caused by change

This has impacted downstream Honeycomb modules, as well as the Honeycomb tools developed to help customers migrate from Honeycomb Classic to Honeycomb Environments & Services. We're seeing breakage when these tools are used in environments where data already exists, which changes how the tools have to be used.

This isn't a bug - Why I wanted a discussion

This change aligns the provider with Terraform provider best practices: you're not supposed to overwrite, modify, or create something that already exists but isn't in the state file. Since this is a standard safety precaution and the accepted best practice, it shouldn't necessarily be reverted.

Discussion

How do we align going forward to ensure that customers hit 409 errors as infrequently as possible? Some of the options I've thought of:

Modules should default to not creating columns

We can update our modules and tools so that they are able to create new columns, but with creation disabled by default; users would only enable it in environments where they know data doesn't exist yet. This would require some updates to the downstream modules and some reworking of the migration tool.
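A minimal sketch of what that gating could look like in a module (the variable name and column are illustrative, not taken from the actual downstream modules):

```hcl
# Hypothetical module variable gating column creation.
variable "create_columns" {
  description = "Create honeycombio_column resources; leave false for environments that already contain data"
  type        = bool
  default     = false
}

resource "honeycombio_column" "duration_ms" {
  # Only create the column when the caller has opted in.
  count = var.create_columns ? 1 : 0

  dataset = var.dataset
  name    = "duration_ms"
  type    = "float"
}
```

Callers on a fresh environment would set `create_columns = true`; everyone else gets the safe default and never hits the 409.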

Add an overwrite function/option to the column resource in the provider

We could add the old functionality back but gate it behind a flag on the resource. While Terraform best practice argues against this, Honeycomb objects aren't exactly infrastructure either. We'd also have to enable it by default to keep downstream code working without modification.
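For concreteness, the flag might look something like this (hypothetical: `overwrite_existing` does not exist in the provider today; it only illustrates the shape of the proposal):

```hcl
resource "honeycombio_column" "duration_ms" {
  dataset = var.dataset
  name    = "duration_ms"
  type    = "float"

  # Hypothetical attribute: if the column already exists, import and manage
  # it (the pre-0.12.0 behavior) instead of failing on the API's 409.
  # Proposed default: true, for backwards compatibility.
  overwrite_existing = true
}
```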

Both options?

Maybe we should do both: rewrite the downstream dependencies and update the resource.

Should we have data sources for columns and datasets?

If we had data sources for columns and datasets, downstream code could use them to determine whether things should be created, or we could modify terraformer to emit data sources instead of standard column/dataset resources when exporting a current environment. That would most likely resolve the issue for the migration utility, but the downstream modules would still need some reworking.
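A sketch of how the data-source approach could be consumed downstream (hypothetical: the provider did not ship a columns data source with this shape at the time of this discussion):

```hcl
# Hypothetical data source listing the column names in a dataset.
data "honeycombio_columns" "existing" {
  dataset = var.dataset
}

locals {
  # Only plan the column if it isn't already present in the dataset.
  column_missing = !contains(data.honeycombio_columns.existing.names, "duration_ms")
}

resource "honeycombio_column" "duration_ms" {
  count = local.column_missing ? 1 : 0

  dataset = var.dataset
  name    = "duration_ms"
  type    = "float"
}
```

The trade-off is that the existence check happens at plan time, so columns created outside Terraform between plan and apply could still produce a 409.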

Conclusion

I wanted to bring this up and discuss it with a broader team so that, before I start doing a bunch of PRs to push things in a certain direction, a larger group of us agrees on the direction we should be going.


jharley (Collaborator) commented Jan 11, 2023

Public update after internal discussions:

This is potentially more complex to solve well than it seems at face value. More time is needed to determine a solid solution prior to the "1.0 milestone" planned for early this year, but in support of improving the experience today:

@jharley jharley closed this as completed Jan 11, 2023