In reading over the noted section on multi-stage decision making and the formulas researchers use, I have to admit I'm impressed that they were able to work out decision making to the point where it isn't just a recommended series of steps, or a thought process that can be broken down into an acronym, but a numerical, data-driven formula. Impressive as that may be, I am very far from that level of detail and would say I more often operate on a very basic set of heuristics, as the text might call them.
Most of my day-to-day decision making involves recognizing potential problems in projects or solving problems as they come up (and back-briefing my leadership accordingly, only pulling guidance if it's a very hot issue), but more often I mainly need to decide whom to task with working a problem and how to keep my supervisors in the loop. As is inherent to the military profession, much of the technical expertise on the particulars of how a problem gets solved rests with more experienced enlisted personnel, to whom I give the broader objectives and any necessary authorizations to execute their solutions. Given that flow of information and action, I'm inclined to think implementing decision formulas would be unfeasible, if not outright problematic. Thus, my decision making usually boils down to finding out what gadget needs fixing and delegating the job to the appropriate shop, or consulting with my flight chief or other senior enlisted on more complex matters such as appropriate disciplinary measures (which are also often dictated for us by the Commander or Air Force regulations). As noted in the text on page 43, I'm generally using heuristics that weigh my options, usually between action and inaction within different contexts.
Upon further examination, the text seems to focus on examples of decision making in the context of events with repeating occurrences or decision points, often with statistical elements, which in my opinion don't translate particularly well to leading people and managing projects. That said, there are still good points I believe have universal application in avoiding "myopia" and hurdles such as cognitive limitation and concreteness bias. I will admit that one of my greatest difficulties in IT management is that we're often dealing with computer programs and processes, something I can't easily see or touch beyond perhaps drawing out a diagram. Even the limited physical tasks of moving systems and plugging them in are things I rarely touch and can only deal with conceptually, given what is expected of my role as a manager vice that of a technician.
Still, being conscious of another potential pitfall is, generally speaking, a good thing. Although maintaining a long-term view doesn't come naturally in day-to-day decision making, particularly at my level, I think it would be beneficial to consider second- and third-order effects the next time I'm facing a larger-scale decision. Further, knowing that there really is such a thing as concreteness bias provides added impetus to ask the right questions and try to solidify concepts that at first glance have no inherent solidity. Although it would be difficult to apply the text's process sensibly to my duties, considering these potential obstacles to thinking may well improve my ability to anticipate and mitigate issues, and perhaps help my workcenter create better deliverables the first time around, without a lot of back-and-forth between our level and senior leadership.