Performance Clinic: Why Devs Love Dynatrace – Episode 3 – Automated Release Comparison


Thursday, June 24th
4pm BST / 5pm CEST / 11am ET

In the previous episodes you learned how Dynatrace automates observability from Dev to Ops environments, how to analyze distributed traces in real time (we call them PurePaths) to detect code or performance hotspots, and how to automate code evaluation as part of your development process and delivery pipelines. You also learned about automated performance validation and release comparison through intelligent quality gates, and were introduced to the concept of SLOs (Service Level Objectives).

In Episode 3, Sergio Hinojosa focuses on Automated Release Comparison. We will learn the foundations of load test analysis with Dynatrace. We will compare failed releases that did not pass the quality gate due to performance degradations. We will show how easy it is to compare functional performance tests with each other and learn how Dynatrace can pinpoint degradations down to code level across the application stack. We will even decompile the code to understand the root cause of a bad implementation, whether it is a bad algorithm or a thread synchronization issue.

Register now
Speakers
Sergio Hinojosa
Sales Engineering Manager EMEA at Dynatrace
Andreas Grabner
Global Technology Lead at Dynatrace