As you can see, the resulting data still has duplicates because `del` is used inside an active loop (the list is mutated while it is being iterated).
The suggestion is to use another approach, which also filters out empty elements (and other exclusions can be added later):
```python
DD_CUSTOM_TAGS = "ddtags"
DD_SOURCE = "ddsource"


def get_service_from_tags_and_remove_duplicates(metadata):
    service = ""
    tagsplit = metadata[DD_CUSTOM_TAGS].split(",")
    services = [tag for tag in tagsplit if tag.startswith("service:")]
    if services:
        # assumes all services are duplicates
        # service name is after `service:`
        service = services[0].split(":")[1]
        print(f"SERVICES: {services}")
        print(f"THE SERVICE: {service}")
        tagsplit = [services[0]] + [item for item in tagsplit if item not in services and item != ""]
    # the old part:
    # for i, tag in enumerate(tagsplit):
    #     if tag.startswith("service:"):
    #         if service:
    #             print(f"DEL: #{i} == {tagsplit[i]}")
    #             # remove duplicate entry from the tags
    #             del tagsplit[i]
    #         else:
    #             service = tag[8:]
    #             print(f"SERVICE: {service}")
    metadata[DD_CUSTOM_TAGS] = ",".join(tagsplit)
    print(f"{metadata}")
    print(f"{tagsplit}")
    # Default service to source value
    return service if service else metadata[DD_SOURCE]


print("{}".format(
    get_service_from_tags_and_remove_duplicates({
        DD_CUSTOM_TAGS: "a:b,service:bubernetes,service:shumbernetes,service:cucumbernetes,service:dumbernetes,c:d,e,f,,,",
        DD_SOURCE: "test_source"
    })))
```
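For reference, here is a minimal standalone repro of the `del`-in-a-loop problem (the tag values are made up for illustration): deleting an element shifts everything after it one index to the left, while `enumerate` still advances, so the element that slid into the freed slot is never examined.

```python
# Minimal repro of the bug: mutating a list while iterating it.
tags = ["service:a", "service:b", "service:c", "x:y"]
service = ""
for i, tag in enumerate(tags):
    if tag.startswith("service:"):
        if service:
            # shifts the remaining items left; the loop index
            # still advances, so the next item is skipped
            del tags[i]
        else:
            service = tag[8:]  # len("service:") == 8

# "service:c" survives: it slid into the index the loop skipped
print(tags)     # → ['service:a', 'service:c', 'x:y']
print(service)  # → a
```

This is why the issue recommends building a new list with a comprehension instead of deleting in place.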
Describe what happened:
This is how the function works now:
aws/logs_monitoring/parsing.py
The output:
Describe what you expected:
Working duplicate filtering: only one `service:` tag remains in `ddtags`.
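As a hedged sketch of the expected behavior (the helper name `dedupe_service_tags` is mine, not from `parsing.py`), correct filtering keeps a single service tag and drops empty entries:

```python
# Illustrative only: condensed version of the suggested fix,
# keeping the first service tag and discarding empty items.
def dedupe_service_tags(ddtags):
    tagsplit = ddtags.split(",")
    services = [t for t in tagsplit if t.startswith("service:")]
    rest = [t for t in tagsplit if t not in services and t != ""]
    return ",".join(services[:1] + rest)

print(dedupe_service_tags("a:b,service:x,service:y,c:d,,"))
# → service:x,a:b,c:d
```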
Steps to reproduce the issue:
See above.