One very useful feature of Google Analytics is the ability to compare metrics across date ranges. You can compare the number of sessions, bounce rate, E-commerce transactions, AdWords cost, and so on between adjacent weeks, months, or years. One shortcoming of this comparison feature is that the percentage change between date ranges comes with no indication of statistical significance. If a metric moves 5% relative to the previous period, does that validate the success of a marketing campaign or a site change, or is it just randomness?
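One way to answer that question is a standard two-proportion z-test on a rate metric such as bounce rate. The sketch below uses only the Python standard library and made-up session counts (the numbers are purely illustrative, not from Google Analytics):

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions.

    x1/n1: successes and trials in period 1 (e.g. bounces / sessions)
    x2/n2: successes and trials in period 2
    Returns (z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 520 bounces out of 1,000 sessions last month vs.
# 546 out of 1,000 this month -- a roughly 5% relative increase in bounce rate.
z, p = two_proportion_z_test(520, 1000, 546, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these invented numbers the p-value comes out well above 0.05, so a 5% relative swing in bounce rate on a thousand sessions is entirely consistent with random noise; it would take either a larger change or far more sessions before the comparison says anything about your campaign.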
If you do a lot of DevOps deployments - say with Terraform - or train machine learning models, one small but persistent drag on productivity is that these tasks can take a while to run and you don't know exactly when they finish. Any single instance of this is a minor annoyance, but the waiting and checking adds up to a real chunk of time.
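A minimal way to claw that time back is a shell wrapper that runs any command and announces when it exits. This is just a sketch: `notify_done` is a hypothetical helper name, and it assumes `notify-send` may or may not be installed, falling back to a terminal bell:

```shell
# Hypothetical helper: run any command, then report that it finished.
notify_done() {
  "$@"
  status=$?
  msg="done: $* (exit $status)"
  # Use a desktop notification if notify-send exists (Linux);
  # otherwise fall back to a terminal bell plus a printed message.
  if command -v notify-send >/dev/null 2>&1; then
    notify-send "Task finished" "$msg"
  else
    printf '\a%s\n' "$msg"
  fi
  return $status
}

# Demo with a short stand-in task:
notify_done sleep 2
```

In practice you would wrap the slow command itself, e.g. `notify_done terraform apply`, and go do something else until the notification fires; the wrapper also preserves the command's exit status, so it still composes with `&&` chains.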