
The State of Developer Productivity






Slide 0

DEVELOPER PRODUCTIVITY REPORT 2015 (PREVIEW)
JAVA PERFORMANCE SURVEY RESULTS
Make your app do the Kessel Run in less than twelve parsecs!


Slide 1



Slide 2

Figure 1.16: Answers were multiple choice, so the numbers don't add up to 100%. Deal with it :)


Slide 3

Another important metric to recognize is that there isn't a single killer tool that does everything. In fact, if we look at the number of tools used by our respondents, almost half of those who took the survey claimed to use more than one performance profiler for their application. On average, 1.7 tools were picked by each respondent. (Figure 1.13)
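As an aside on how a figure like that is computed: with a multiple-choice question you simply count each respondent's selections and average the counts. The short Java sketch below does exactly that; the three respondents and their answers are made up purely to illustrate the calculation, they are not survey data.

    import java.util.List;

    public class AverageToolsPerRespondent {
        public static void main(String[] args) {
            // Hypothetical multiple-choice answers: each inner list is one
            // respondent's selected profilers (not real survey responses).
            List<List<String>> answers = List.of(
                    List.of("VisualVM"),
                    List.of("VisualVM", "JProfiler"),
                    List.of("Java Mission Control", "XRebel", "custom in-house tool"));

            // Average number of tools per respondent: (1 + 2 + 3) / 3 = 2.0 here;
            // across the real survey the same calculation gave 1.7.
            double average = answers.stream()
                    .mapToInt(List::size)
                    .average()
                    .orElse(0);

            System.out.printf("Average tools per respondent: %.1f%n", average);
        }
    }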


Slide 4

The respondent, aka you!

As we'd expect, the vast majority of those who responded to our survey labeled themselves software developers, as shown in figure 1.1. A further 27% sit in the Architect/Team Lead/Project Manager category. This constitutes well over 90% of those who responded, so we can rest assured that we're talking to the techies! Interestingly, only 1.54% of those who responded were dedicated performance engineers, which either means there are very few performance engineers out there, or perhaps that we didn't reach that specific market with the survey. It's often difficult to get a survey out to an audience without introducing bias, and this could be a result of bias in our market reach.


Slide 5

The Application

As you'd expect, the vast majority of applications are web applications, shown in figure 1.2 as taking over 70% of the audience. Desktop applications come in at a little over 11%, with batch and mobile at just over 6% and 3.5% respectively. In the "Other" column, respondents mentioned applications such as middleware or application servers. So we can now picture our typical respondent as a software developer who develops a web application. Should we name him/her? How about Sam? Actually, it's probably better in the long run that we don't get too attached, so we'll continue calling them the respondent.


Slide 6

The Organization

Let's find out where our average respondent works. We can start to understand the average organization a little better by understanding how big the teams that develop, test and support the application are. Figure 1.4 shows the distribution of people working on their application from design and development through to deployment and support. The average is 21.27 people. Remember, this is the arithmetic mean. There isn't 0.27 of a person trying their best to develop an application using two of their available fingers and three toes on their one good leg! With over half of our respondents (55%) stating they work in a team of fewer than 9 people, and 83% stating they work in a team of fewer than 25, it's safe to say that teams are mostly small. There will always be outliers which pull the average up as organizations turn enterprise, but those are the minority cases in this survey.


Slide 7

Next, we turn to responsibility. We can see straight away from figure 1.5 that as far as responsibility goes, the clear winner is the development team. The data shows that at just over 55%, the person most likely to be responsible for performance testing is whoever wrote that piece of code. This could be for a couple of reasons. Perhaps we really do live in an agile world and the engineer who develops the code fully tests it, functionally as well as for performance. Another possibility is that there isn't any process at all, in which case performance testing is just dumped on the poor developer who has little support or knowledge about performance testing, and so ultimately ignores it. While we all wish that weren't the case, it does creep into mind. Operations, performance teams and QA take up similar slices, as does the answer "Nobody", at around 11-14%.


Slide 8

Figure 1.7 shows that 37% of respondents test their application for performance while it's being developed. That's a great statistic and shows how important people believe it is to test early. The most common phase to run performance tests was during system/integration testing. Overall, there is a reasonable spread throughout the application lifecycle, which is reassuring. As a multiple choice question, the average number of options selected by a respondent was 1.87. While there is obvious value in performance testing against a production or staging environment, if we can shift as much as possible to the left, putting in effort earlier in the cycle, we should expect better applications that are cheaper to develop.
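For anyone wanting to act on that shift-left idea, one lightweight option is a developer-side microbenchmark that runs alongside the normal build. The sketch below uses JMH, the OpenJDK benchmarking harness; JMH is not named in the survey, and the class, data and method here are hypothetical, just to show the shape of such a test.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.TimeUnit;

    import org.openjdk.jmh.annotations.Benchmark;
    import org.openjdk.jmh.annotations.BenchmarkMode;
    import org.openjdk.jmh.annotations.Mode;
    import org.openjdk.jmh.annotations.OutputTimeUnit;
    import org.openjdk.jmh.annotations.Scope;
    import org.openjdk.jmh.annotations.Setup;
    import org.openjdk.jmh.annotations.State;

    // Hypothetical developer-side benchmark for a lookup that might sit on a hot path.
    @State(Scope.Benchmark)
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    public class OrderLookupBenchmark {

        private List<String> orderIds;

        @Setup
        public void prepare() {
            orderIds = new ArrayList<>();
            for (int i = 0; i < 10_000; i++) {
                orderIds.add("order-" + i);
            }
        }

        @Benchmark
        public boolean linearLookup() {
            // A linear scan over 10,000 entries: the kind of cost that is cheap
            // to catch during development and expensive to find in production.
            return orderIds.contains("order-9999");
        }
    }

Run as part of the regular build, a benchmark like this turns "we test while developing" from figure 1.7 into a repeatable measurement rather than a good intention.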


Slide 9

Irrespective of who finds the performance problems, it's clear from figure 1.8 who fixes the issues that emerge from performance testing. With almost 94%, it's the developers who apply the fix. Yeah, the same people who wrote the code in the first place. As a result, it makes even more sense to performance test your code as you develop it, as the fix might as well be applied while the bug is being written!


Slide 10

Now we move to the question of test frequency. Profiling is an important part of performance testing, as it lets us really understand execution paths, bottlenecks and so forth. However, figure 1.9 shows this to be an extremely reactive activity. Over 40% of respondents state they profile their code only when issues arise, rather than as something done on a regular basis. Beyond that answer, the results are largely spread across the other options, with almost one in ten stating they never profile their code at all.


Slide 11

Sadly, our next data shows that one in four respondents (26%) spend no time on performance testing during a release. Perhaps this says something about the types or size of changes that tend to go into releases. Over 55% of respondents state they spend 1-5 days performance testing each release, which is much more encouraging. The remaining respondents test for one or more weeks each release.

The Toolset

The proverb suggests that a good worker never blames their tools. We won't be looking into how effective each tool is just yet, but we will look at what is being used. VisualVM was an extremely popular choice, as were JProfiler and Java Mission Control, the performance tool bundled with Oracle's distribution of Java since version 7u40. Interestingly, 20% of respondents state they have their own custom in-house tools. Developers love to develop, right? XRebel is also worth mentioning with over 3% of the votes, particularly because it's barely a year old! (Figure 1.11)
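To give a sense of what the "custom in-house tools" answer often means in practice, here is a minimal sketch of a home-grown timing helper; the class name and usage are hypothetical, and real in-house tooling is usually far richer (sampling, aggregation, reporting).

    import java.util.function.Supplier;

    // Hypothetical home-grown timing utility: wraps a piece of work and prints how long it took.
    public final class SimpleTimer {

        private SimpleTimer() {
        }

        public static <T> T time(String label, Supplier<T> work) {
            long start = System.nanoTime();
            try {
                return work.get();
            } finally {
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                System.out.println(label + " took " + elapsedMs + " ms");
            }
        }
    }

A typical call would look like SimpleTimer.time("loadOrders", orderService::loadAll), where orderService stands in for whatever your application happens to call its slow bit.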


Slide 12

I have Issues!

Let's now look at the issues themselves: the bugs behind all the complaints! First of all, how are they caught? Figure 1.14 shows the variety of paths a bug can take to get noticed by you. The most common way (31%) is via user reports and user feedback. In short, we as an industry are failing to test properly. Also, 20% of respondents say that some of their system faults or crashes are a result of performance issues. The vast majority of the remaining half of issues (46%) are found through tooling, including performance monitoring tools such as APMs, profilers and home-grown software.


Slide 13

The Fix

We know more about the issue now, so let's try to understand the fix a little more. Different people will take different amounts of time to fix different issues in different ways. I guess I'm trying to say we're all different, so let's look at how long we tend to take to find, fix, and test issues. Figure 1.18 shows that the most common answer was 0-1 days. That can be considered pretty fast, but of course it all depends on the complexity of the issue. Over half of respondents (52%) take less than half a week to fix an issue, but as you can see from the graphic, there's a pretty long tail which pulls the average out to just under a week, at 4.38 days. Just over one in four respondents (27%) claimed their fixes took a full week or more to diagnose, fix and test. It will be interesting to see later who takes longer to fix the bugs. *Spoiler* it is indeed interesting!
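If the 4.38-day average sounds high next to "over half take less than half a week", that's just the long tail at work. The sketch below uses made-up fix times (not survey data) to show how a few multi-week fixes drag the mean well above the median.

    import java.util.Arrays;

    public class FixTimeDistribution {
        public static void main(String[] args) {
            // Hypothetical fix times in days: most fixes are quick, a few drag on for weeks.
            double[] fixDays = {0.5, 0.5, 1, 1, 1, 2, 2, 3, 10, 15, 20};

            double mean = Arrays.stream(fixDays).average().orElse(0);

            double[] sorted = fixDays.clone();
            Arrays.sort(sorted);
            double median = sorted[sorted.length / 2];

            // The handful of slow fixes pulls the mean (about 5.1 days) well above
            // the median (2 days); the survey's 4.38-day average behaves the same way.
            System.out.printf("mean = %.1f days, median = %.1f days%n", mean, median);
        }
    }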


Slide 14

Contact us

Twitter: @RebelLabs
Web: http://zeroturnaround.com/rebellabs
Email: labs@zeroturnaround.com

Estonia: Ülikooli 2, 4th floor, Tartu, Estonia, 51003. Phone: +372 653 6099
USA: 399 Boylston Street, Suite 300, Boston, MA, USA, 02116. Phone: 1 (857) 277-1199
Czech Republic: Jankovcova 1037/49, Building C, 5th floor, 170 00 Prague 7, Czech Republic. Phone: +420 227 020 130

Report Author: Simon Maple (@sjmaple)
Report Contributors: Oleg Shelajev (@shelajev), Debbie Moynihan (@debdeb), Mart Redi (@martredi)
Report Designer: Ladislava Bohacova (@ladislava)


Slide 15

