Properly mask all sensitive fields in Consumer and Producer settings #100
Conversation
lgtm
would it be possible to add a unit test or 2?
They already exist, see and . Now, what the CVE is alluding to is the fact that there are more fields than the original implementation covered, and hence than what is being tested in those unit tests. Should I try to find another field that is considered sensitive but doesn't have a key containing password, and create a unit test for it?
Force-pushed from 51a5327 to 0c688a9
@mdedetrich the CVE relates to
@pjfanning I have updated the already existing unit tests to check for an additional sensitive config key called
I can add a unit test for this if you really want, but then I have to manually construct the value so that it satisfies the config parser, and with the modified unit test I did it's really not necessary.
lgtm
All tests pass, merging PR.
Resolves: #85
This PR reuses the upstream Kafka mechanism that is used to filter/mask passwords, which means it's 100% correct (the `.properties` fields being printed in `.toString` refer to configuration that is passed through to Kafka proper). In more detail, Kafka has its own `org.apache.kafka.common.config.AbstractConfig`, which actually maintains types for keys. One of those types is `kafka.common.config.ConfigDef.Type.PASSWORD`, which is how upstream Kafka determines whether any sensitive field needs to be masked (see https://github.com/apache/kafka/blob/51bc41031b91b16ffcbd95832e8a20a00d3b6f81/clients/src/main/java/org/apache/kafka/common/config/types/Password.java#L54-L57 for more info).

Also note that this project already has tests for determining whether the filtering is working (see https://github.com/mdedetrich/incubator-pekko-connectors-kafka/blob/0a55b158d2d7fd8c050ab147d80d259a79e3ffe2/tests/src/test/scala/org/apache/pekko/kafka/ConsumerSettingsSpec.scala#L107-L117 and https://github.com/mdedetrich/incubator-pekko-connectors-kafka/blob/0a55b158d2d7fd8c050ab147d80d259a79e3ffe2/tests/src/test/scala/org/apache/pekko/kafka/ProducerSettingsSpec.scala#L91-L101), so I can confirm that it works (i.e. there is no regression).
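To illustrate the mechanism the PR relies on: Kafka's `Password` type (linked above) overrides `toString()` so the raw secret never appears when a config is printed. The snippet below is a minimal sketch of that pattern, not the actual Kafka source; the `PasswordDemo` class name and the `"hunter2"` value are made up for this example.

```java
public class PasswordDemo {
    // Sketch of the pattern used by org.apache.kafka.common.config.types.Password:
    // wrap the secret and mask it in toString() so it cannot leak via logging.
    static final class Password {
        private final String value;

        Password(String value) {
            this.value = value;
        }

        // The real value is still retrievable by code that needs it.
        String value() {
            return value;
        }

        // Any accidental string conversion (logs, debug output) shows a placeholder.
        @Override
        public String toString() {
            return "[hidden]";
        }
    }

    public static void main(String[] args) {
        Password secret = new Password("hunter2");
        // Printing the wrapper masks the secret.
        System.out.println("sasl.jaas.config=" + secret);
        // Direct access still returns the underlying value.
        System.out.println(secret.value());
    }
}
```

This is why typing a key as `PASSWORD` in a `ConfigDef` is enough for masking: `AbstractConfig` stores such values wrapped in `Password`, so every `toString`-based rendering of the settings shows the placeholder instead of the secret.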