I just ran table_tagger.py against a few hundred DynamoDB tables. The console output made me feel like a real DevOps champion: it logged that every single one of my tables had been tagged.
However, when I reviewed the CloudTrail logs I found that my work had not gone as faithfully as I had thought. Almost half of the calls logged some sort of error...
Drilling down into the errors, they are all the same:
The script needs to be enhanced so that:

- it does not log a success for API calls that did not actually succeed
- it does not exceed the AWS API rate limits (a rough sketch of both fixes follows this list)
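Here is a minimal sketch of what that could look like, assuming the script tags tables through boto3's `tag_resource` call. The tag set and logger name are placeholders, not taken from table_tagger.py itself. boto3's built-in `adaptive` retry mode handles throttling backoff and client-side rate limiting, and catching `ClientError` means a success is only logged when the call actually went through:

```python
import logging

import boto3
from botocore.config import Config
from botocore.exceptions import ClientError

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("table_tagger")

# "adaptive" retry mode makes boto3 back off and rate-limit itself
# client-side when AWS starts throttling the caller.
dynamodb = boto3.client(
    "dynamodb",
    config=Config(retries={"max_attempts": 10, "mode": "adaptive"}),
)

TAGS = [{"Key": "team", "Value": "data-platform"}]  # placeholder tag set


def tag_table(arn: str) -> bool:
    """Tag one table; report success only if the API call succeeded."""
    try:
        dynamodb.tag_resource(ResourceArn=arn, Tags=TAGS)
    except ClientError as err:
        # Surface the real outcome instead of unconditionally logging success.
        log.error("failed to tag %s: %s", arn, err.response["Error"]["Code"])
        return False
    log.info("tagged %s", arn)
    return True


if __name__ == "__main__":
    # Collect every table ARN via the paginated ListTables API.
    paginator = dynamodb.get_paginator("list_tables")
    names = [n for page in paginator.paginate() for n in page["TableNames"]]
    arns = [dynamodb.describe_table(TableName=n)["Table"]["TableArn"] for n in names]

    results = [tag_table(arn) for arn in arns]
    log.info("tagged %d of %d tables", sum(results), len(results))
```

With this shape, the final summary line would have exposed the problem immediately instead of requiring a trip through CloudTrail.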
Is your requirement for tagging the tables a granular cost breakdown in Cost Explorer? If so, there is no longer a need to do this. I'll update the README later this week.
Hey @LeeroyHannigan, thanks for taking a look.
You might be thinking of the resource-level breakdown available in Cost Explorer, but that data is only retained for two weeks...