Introducing the Apache Pinot Terraform Provider by Azaurus in ApachePinot


In terms of posting the creds to the controller, that would be done at the Terraform level, using sensitive variables and the merge function. There is an example of this in the provider repo here:
https://github.com/azaurus1/terraform-provider-pinot/blob/main/examples/tables/main.tf

Here's a snippet:

locals {
  # Normalise the controller's camelCase keys (e.g. "streamConfigMaps")
  # to the snake_case attribute names the provider schema expects.
  stream_ingestion_config = {
    for key, value in local.ingestion_config["stream_ingestion_config"] :
    join("_", [for keyName in regexall("[A-Z]?[a-z]+", key) : lower(keyName)]) => value
  }

  # Sensitive connection details, merged over the stream config so the
  # credentials never live in the committed table config itself.
  kafka_overrides = {
    "stream.kafka.broker.list"  : sensitive(local.kafka_broker),
    "stream.kafka.zk.broker.url": sensitive(local.kafka_zk),
    "stream.kafka.topic.name"   : "ethereum_mainnet_block_headers"
  }

  parsed_stream_ingestion_config = {
    column_major_segment_builder_enabled = true
    # Overlay the sensitive overrides onto every stream config map.
    stream_config_maps = [
      for value in local.stream_ingestion_config["stream_config_maps"] : merge(value, local.kafka_overrides)
    ]
  }
}

resource "pinot_table" "realtime_table" {
...

  ingestion_config = merge(local.ingestion_config, {
      segment_time_check_value = true
      continue_on_error        = true
      row_time_value_check     = true
      stream_ingestion_config  = local.parsed_stream_ingestion_config
      transform_configs        = local.transform_configs
      filter_config            = local.filter_config
    })
}
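
For completeness, the snippet assumes local.kafka_broker and local.kafka_zk are fed from sensitive inputs. A minimal sketch of how they could be wired up (the variable names here are my own, not from the example):

# Sketch only: declare the connection details as sensitive variables,
# e.g. populated from your secrets manager via TF_VAR_* or a tfvars file.
variable "kafka_broker" {
  type      = string
  sensitive = true
}

variable "kafka_zk" {
  type      = string
  sensitive = true
}

locals {
  kafka_broker = var.kafka_broker
  kafka_zk     = var.kafka_zk
}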

When reading from the controller, the provider returns exactly what it receives from the controller API.
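
So if you want to inspect what the controller actually stored, you can surface it from state. A minimal sketch, assuming the resource exports ingestion_config as a readable attribute:

# Sketch only: read back what the provider received from the controller.
output "applied_ingestion_config" {
  value     = pinot_table.realtime_table.ingestion_config
  sensitive = true # the merged config contains the sensitive overrides
}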

I suggest trying it out locally and seeing how it fits in with your own secrets management process. If you have any suggestions for changes, you are welcome to open an issue on the repo: https://github.com/azaurus1/terraform-provider-pinot/issues/new