Complexity and us

A common marketing pitch for big data is that the human mind is unable to analyze more than seven variables at once. That seems an overstretch when I see so many discussions struggling with just one variable.

The topic of complexity is of great interest to me because I see it as a main driver of human and technical evolution. Furthermore, in my profession of performance engineering it is on abundant display.

Complexity is the invariable outcome of wide and conflicting demands, and of trying to solve everything in one product or service. At a philosophical level, the complexity of products and services is a reflection of the complexity and diversity of human beings and nature.


Real-life complexity


At the more mundane level at which I operate, there is a love-hate relationship between complexity and simplicity. Those who master complexity feel vastly entitled, while those challenged by complexity loathe it and profess the superiority of simplicity, aka common sense. I feel I would be prejudiced and anti-evolutionary if I were to take sides here. What transpires in real life is far more complex: an inter-mesh occurs between complexity and simplicity to produce wholesome products and services that stand the test of time.

On one level this complexity-simplicity interface creates many jobs, with the vast majority of lower and middle management serving the prime purpose of simplified event reporting to facilitate decision making. It creates a market for software and consultants who come to aid in grappling with complexity. Then there are special categories of folks who traverse domains and technologies to keep it all linked and cohesive.

On another level this brings in innovation, whereby complexity is progressively abstracted to present a simple interface. In computer engineering, for instance, we see the chain: hardware > OS > hypervisor > JVM > fundamental APIs > application APIs > server interface > user interface.
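
To make this chain concrete, here is a minimal Java sketch; the file name config.txt is purely illustrative. A single application-level call rides every layer below it:

    // One line of application code hides the entire abstraction chain beneath it.
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class AbstractionChain {
        public static void main(String[] args) throws Exception {
            // Application API: trivially simple at this layer.
            String content = Files.readString(Path.of("config.txt"));

            // What that single line rides on:
            //  - fundamental APIs: java.nio streams and charset decoding
            //  - JVM: JIT-compiled native code, heap allocation for the String
            //  - hypervisor (if virtualized): trapped and translated I/O
            //  - OS: open/read/close system calls, page cache, file system driver
            //  - hardware: disk controller, DMA, CPU caches
            System.out.println(content.length() + " characters read");
        }
    }

A slowdown at the bottom of this stack surfaces at the top only as "the read is slow", which is precisely why someone has to understand the whole chain.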


For 'progressive abstraction' to work, vendors or groups must operate at every level and have a commercial case to continue their activities. I have personally never been able to decipher how some of the intermediate groups stay in the black unless they implement 'lock-in' features.


The amazing part of this whole complexity-simplicity tango is that it creates more complexity in its bid to simplify. One reason is that for any solution to be successful it has to cater to large swathes of users, which again brings in complexity. Progressive abstraction stretches the whole arrangement to the verge of being unmanageable. For an entire stack to succeed, there has to be at least someone who understands the big picture. Yet progressive abstraction promotes staying in one level, reducing the number of folks who have experienced the big picture and can connect all the layers to make the solution work in totality. I have seen this very commonly: earlier-generation developers would be ready to fix any defect, be it on the application side or the database side; now a defect just bounces between groups.

The issue with this stretch is that as long as it works, the arrangement is beautiful. When it unravels, however, it is unmanageable. Performance defects involving multi-layer issues are one example. Another difficult unravelling comes from the subprime mortgage crisis of 2008: mortgages were sold off as mortgage-backed securities, which were then supported by CDOs, which were backed by credit default swaps, all further amplified by 'netting' transactions.


Performance Engineering and complexity

Linking back to my profession of performance engineering, the complexity of options is befuddling. Consider the scenario where a probable performance issue is reported: there is a large number of options to evaluate, and the evaluation is as challenging as the final act of resolution. Here are some of the things we end up grappling with when an issue is reported, given that it is our responsibility to address them all and keep the approach aligned toward an acceptable solution in the minimum time-frame. Say the issue is a drop in throughput at a certain load; you could end up confronting the following challenges.

1. The test is too futuristic (who knows the future?)
2. The code we took to test is not right or still undergoing change (is it ever stable?)
3. How did it work in an earlier test? (misusing fear, uncertainty, and doubt to change the narrative to your advantage)
4. Why are two test results not similar? (are our systems designed to work predictably? OS allocations are based on probabilities and priorities)
5. Business risk not clearly articulated (are the quantified business benefits of good performance available?)
6. The test environment is limited and we can't extrapolate (is it ever exactly the same?)
7. Run it again (lack of confidence in unspecified factors, or buying more time)
8. Test data not right or realistic (easy pickings, just like vendors)
9. The tester lacks skill (so scripts, setup, etc. may not be correct)
10. Architecture/design not right
11. Requirement beyond reason (since the system does tend to behave well in patches)
12. Reality mismatch (never saw such behavior before; never seeing this outside performance test reports or logs)
13. Throw more hardware at it right away
14. Diagnose (more database time, but ouch, more queries; but ouch, more calls, thus more I/O and not enough connections. Anything can be blamed: design, connection pool settings, JDBC settings, database settings, inadequate caching. A minimal instrumentation sketch follows this list.)
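
For the diagnosis step above, the first fight is usually over whether time is going into the database itself or into the number of round trips made to it. Here is a minimal instrumentation sketch, assuming a JDBC connection is available; the class and method names are hypothetical, not from any particular tool.

    // A wrapper that separates time spent in the database from the number of
    // round trips made to it.
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class QueryProfiler {
        private long totalDbNanos = 0;
        private long callCount = 0;

        // Run a query, consume its rows, and record both elapsed time and the round trip.
        public int timedQuery(Connection conn, String sql) throws SQLException {
            long start = System.nanoTime();
            int rows = 0;
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                while (rs.next()) {
                    rows++;
                }
            } finally {
                totalDbNanos += System.nanoTime() - start;
                callCount++;
            }
            return rows;
        }

        public void report() {
            double totalMs = totalDbNanos / 1e6;
            System.out.printf("db calls=%d, total db time=%.1f ms, avg=%.2f ms%n",
                    callCount, totalMs, callCount == 0 ? 0.0 : totalMs / callCount);
        }
    }

If total database time is small but the call count is huge, chattiness of the design and connection pool sizing are better suspects than the queries themselves.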

What's Next?

Complexity cannot be wished away, though we can hide it to give the illusion of simplicity. I will put down some more thoughts on the art and science of 'progressive abstraction' at a later point in time.
