
Draft:DAWM Methodology Workshop

From Wikipedia, the free encyclopedia


Integrated Deliverable Approach Workshop Methodology with Idea Evaluation


Workshops are recognised as a strategic tool for planning and development, frequently employed in scenario planning or for envisioning future growth strategies.[1] Their objectives include providing insight into future scenarios, encouraging strategic planning, identifying growth opportunities, improving team collaboration, enhancing skill development, sharing research insights, and resolving conflicts.[2]

The Deliverable Approach Workshop Methodology (DAWM) is a structured framework designed to support business processes by guiding teams through complex challenges and promoting collaborative effort towards a unified vision within a dynamic business environment.

Workshop Overview


DAWM workshops bring together technical staff, stakeholders, and management to address tasks associated with business growth, project management, planning, requirements, and user experience design. Central to DAWM is the facilitator, who orchestrates communication and collaboration, enhancing both productivity and effectiveness.

Three cycles from DAWM Methodology

Success of Facilitated Workshops


The DAWM workshop methodology comprises three cycles, each integral to business strategy formulation:[3]

1. "What it is": Evaluating current circumstances.

2. "What it might be": Exploring potential future scenarios.

3. "Strategy": Focussing on the overall corporate approach and risk management.

DAWM Workshops


The Deliverable Approach Workshop Methodology (DAWM) is a structured workshop format that integrates both qualitative and quantitative methods and typically accommodates up to twelve participants, including one facilitator and an assistant.[4]

The DAWM workshop utilises the BRIDGE and CRUX approaches to support a verbal presentation with a one-page handout and a forum for questions.[5] The facilitator is required to allocate time for participants to share their thoughts and suggestions, prompt passive attendees, and disseminate key points to colleagues. Following the workshop, the facilitator instructs the group to document additional input in a survey report and utilises the box-to-box methodology for analysis.[4]

Workshop Crux Analysis Approach


The CRUX methodology uses a structured framework to ensure that the most critical issues are central to strategic decision-making. A systematic approach to identifying, prioritising, and addressing challenges is essential for navigating the complexities of modern business environments,[6] and enables companies to develop strategies aligned with their core values and long-term objectives.[7] The CRUX methodology involves several key steps:[8]

1. Identifying Challenges - Compiling input from stakeholders to create a challenge list.

2. Decomposing Challenges - Breaking down issues into manageable components.

3. Clustering Related Challenges - Grouping similar challenges to uncover common themes.

4. Filtering Challenges - Removing non-critical issues.

5. Prioritizing Challenges - Ranking challenges by urgency and impact.

6. Rating Challenge Importance - Assessing the potential effect of each challenge.

7. Identifying Crux Challenges - Focusing on the most significant, complex issues.
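The seven steps above can be sketched as a simple pipeline. Everything below is illustrative: the challenge records, the `theme`, `urgency`, and `impact` fields, and the filtering threshold are assumptions made for the example, not part of the published CRUX methodology.

```python
# Illustrative sketch of CRUX steps 3-7: cluster related challenges,
# filter out non-critical ones, rank by urgency and impact, and keep
# the top items as the "crux" challenges.

def crux_pipeline(challenges, min_impact=0.5, top_n=3):
    # Step 3: cluster related challenges by their declared theme.
    clusters = {}
    for c in challenges:
        clusters.setdefault(c["theme"], []).append(c)

    # Step 4: filter out non-critical issues (impact below threshold).
    critical = [c for group in clusters.values() for c in group
                if c["impact"] >= min_impact]

    # Steps 5-6: prioritise by a combined urgency x impact rating.
    ranked = sorted(critical, key=lambda c: c["urgency"] * c["impact"],
                    reverse=True)

    # Step 7: the top-ranked items are treated as the crux challenges.
    return ranked[:top_n]

challenges = [
    {"name": "customer churn", "theme": "customers", "urgency": 0.9, "impact": 0.8},
    {"name": "legacy stack", "theme": "technology", "urgency": 0.4, "impact": 0.9},
    {"name": "logo refresh", "theme": "brand", "urgency": 0.3, "impact": 0.2},
]
print([c["name"] for c in crux_pipeline(challenges, top_n=2)])
# prints ['customer churn', 'legacy stack']
```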

CRUX Results for Deliverables


The CRUX analytical approach assists organizations in converting diverse challenges into a prioritized set of actionable items. By focusing on critical issues—those posing substantial risks or opportunities—organizations can more effectively align their strategic initiatives. This methodology facilitates prompt attention to immediate concerns while also reinforcing long-term strategic objectives.[9]

Crux Matrix analyses

Evaluation Key


MCDA Processing


The Multi-Criteria Decision Analysis (MCDA) framework supports business prioritization by helping organizations allocate resources effectively and make timely decisions. The Consensual Assessment Technique (CAT)[10] is used to evaluate product creativity, despite challenges related to time constraints and subjective interpretation. Through contextualization within creative workshops, MCDA aids decision-makers in identifying optimal compromises when a single, ideal solution is not possible.[11]

The evaluation process encompasses defining context, assessing ideas, applying MCDA, processing, and debating outcomes, taking into account various contextual elements like originality and feasibility. For an idea to be successful, it generally must satisfy criteria such as:

  1. Applicability.
  2. Value.
  3. Feasibility.
  4. Innovation.

Decision-makers or industry specialists may tailor or refine these criteria as needed.[12]
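As a minimal sketch of how the four criteria might be combined, the weighted-sum scoring below is one simple MCDA aggregation; the weights and per-idea scores are invented for illustration and would in practice be tailored by decision-makers, as noted above.

```python
# Weighted-sum evaluation of ideas against the four criteria listed above.
# Weights and scores (all on a 0-1 scale) are illustrative assumptions.

CRITERIA = ["applicability", "value", "feasibility", "innovation"]
WEIGHTS = {"applicability": 0.3, "value": 0.3, "feasibility": 0.25, "innovation": 0.15}

def score_idea(scores):
    """Aggregate per-criterion scores into a single weighted score."""
    return sum(WEIGHTS[c] * scores[c] for c in CRITERIA)

idea_a = {"applicability": 0.8, "value": 0.7, "feasibility": 0.9, "innovation": 0.4}
idea_b = {"applicability": 0.6, "value": 0.9, "feasibility": 0.5, "innovation": 0.9}
print(round(score_idea(idea_a), 3), round(score_idea(idea_b), 3))
# prints 0.735 0.71
```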

Different approaches to weight assignment enable balanced evaluations, including individual assessment, expertise-based criteria evaluation, and consensus-based scoring.[13]

MCDA allows for thorough assessment by utilizing individual, aggregated, or consensus-driven evaluations, which ultimately support prioritizing urgent issues and organizing related challenges into clusters.[13]

In MCDA, decision-making relies on a multi-criteria structure involving a set of m alternatives to be ranked and a set of n criteria to optimise. This process is represented in a decision matrix in which each element g_j(a_i) reflects the performance of alternative a_i on criterion j.[14]

The approach includes pairwise comparison of alternatives based on each criterion, with preference indicated on a scale from 0 to 1—where 0 represents indifference and 1 reflects strong preference.[14]

The decision-maker determines the generalised criterion, which quantifies a difference in performance into a degree of preference. For each criterion j, P_j(a, b) represents a generalised preference function of the performance difference d_j(a, b) = g_j(a) − g_j(b), where:

  • A weighting vector w = (w_1, …, w_n), with w_j ≥ 0 and Σ_j w_j = 1, quantifies the relative significance of each criterion.

  • For all pairs of alternatives (a, b), the preference relation π is as follows: π(a, b) = Σ_{j=1}^{n} w_j P_j(a, b).

  • To calculate the flows, analyse the valued outranking graph and determine the flow leaving each node α, as described by: φ⁺(α) = (1/(m − 1)) Σ_{x∈A} π(α, x).

  • And the flow that is coming in is: φ⁻(α) = (1/(m − 1)) Σ_{x∈A} π(x, α).

  • Calculation of the net (business) flow used to rank the alternatives: φ(α) = φ⁺(α) − φ⁻(α).
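The flow calculations described above follow a PROMETHEE-style outranking scheme. The sketch below assumes the simplest ("usual") preference function, P_j(a, b) = 1 when a outperforms b on criterion j and 0 otherwise; the performance matrix and weights are invented for illustration.

```python
# Compute leaving, entering, and net flows for a small decision matrix,
# assuming a "usual" (0/1) preference function on each criterion.

def net_flows(perf, weights):
    """perf: m x n decision matrix (rows = alternatives, columns = criteria)."""
    m = len(perf)

    # Multicriteria preference pi(a, b) = sum_j w_j * P_j(a, b).
    def pi(a, b):
        return sum(w for w, pa, pb in zip(weights, perf[a], perf[b]) if pa > pb)

    # Leaving flow phi+(a) and entering flow phi-(a) over the outranking graph.
    phi_plus = [sum(pi(a, b) for b in range(m) if b != a) / (m - 1) for a in range(m)]
    phi_minus = [sum(pi(b, a) for b in range(m) if b != a) / (m - 1) for a in range(m)]

    # Net flow phi(a) = phi+(a) - phi-(a), used to rank the alternatives.
    return [p - q for p, q in zip(phi_plus, phi_minus)]

perf = [[0.7, 0.4, 0.9],   # alternative A
        [0.5, 0.8, 0.6],   # alternative B
        [0.3, 0.6, 0.2]]   # alternative C
weights = [0.5, 0.3, 0.2]  # sums to 1
print([round(f, 2) for f in net_flows(perf, weights)])
# prints [0.4, 0.3, -0.7]; A ranks first, C last
```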

Integrating MCDA with DAWM


The Multi-Criteria Decision Analysis (MCDA) approach seeks to guide decisions that reflect the decision-maker's expectations by using deductive reasoning and mathematical computations within a consultation-oriented matrix template.[11] The ultimate choice of which ideas and actions to pursue remains the responsibility of the decision-maker, who can leverage these results as a basis for collaborative discussions in the Deliverable Approach Workshop Methodology (DAWM). This method encourages balanced integration of both rational analysis and intuitive judgment, incorporating validated business risks and competitor insights.[12]

In one startup case study, a flexible risk matrix enabled the risk management team to adjust ratings in alignment with the organization’s structure and strategic goals, providing a custom approach to address specific needs and priorities.[15]

Risk analyses

  Rating | Probability  | Impact scale
  0.2    | Low          | No real impact
  0.4    | Medium       | Small to medium reduction in cost or time
  0.6    | Medium-High  | Medium to large reduction in cost or time
  0.8    | High         | Unacceptably over budget or behind schedule (<20%)
  1.0    | Fact         | Unacceptably over budget or behind schedule (≥20%)
  • The risk impact weight is then inserted as a criterion weight in the MCDA.

  • Performing the DAWM:
    1. The Deliverable Approach Workshop Methodology (DAWM) is completed through a matrix-based competitor analysis that scores alternatives from low to high, or 0% to 100%.
    2. The alternative with the highest score is prioritised, even if the differences between options are slight.
    3. Rankings are established by evaluating the proposed ideas within each participant group, calculating averages, and determining net flows.
    4. Combined scores are then computed from the specified preferences and weights, resulting in the overall flows.
    5. Business growth within this methodology is linked to effective risk management, helping teams focus on key areas while minimising attention to less critical growth opportunities.
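One way the risk ratings from the table above could be folded into the MCDA scoring is sketched below. Discounting each alternative's score by its risk rating is an assumption made for illustration, as are the named growth options and their base scores.

```python
# Discount MCDA-style scores (0-1) by the risk ratings from the table,
# so riskier growth options rank lower. All values are illustrative.

RISK_RATINGS = {"low": 0.2, "medium": 0.4, "medium-high": 0.6,
                "high": 0.8, "fact": 1.0}

def risk_adjusted_score(base_score, risk_level):
    """Scale a base score down by the probability-style risk rating."""
    return base_score * (1.0 - RISK_RATINGS[risk_level])

options = {
    "new market": (0.9, "high"),       # attractive but high risk
    "product line": (0.8, "medium"),   # solid score, moderate risk
    "rebrand": (0.5, "low"),           # modest score, low risk
}
ranked = sorted(options, key=lambda o: risk_adjusted_score(*options[o]),
                reverse=True)
print(ranked)
# prints ['product line', 'rebrand', 'new market']
```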

References

  1. ^ Hodgkinson, Gerard P.; Whittington, Richard; Johnson, Gerry; Schwarz, Mirela (October 2006). "The Role of Strategy Workshops in Strategy Development Processes: Formality, Communication, Co-ordination and Inclusion". Long Range Planning. 39 (5): 479–496. doi:10.1016/j.lrp.2006.07.003.
  2. ^ Gandrita, Daniel Mandel (2023-09-22). "Improving Strategic Planning: The Crucial Role of Enhancing Relationships between Management Levels". Administrative Sciences. 13 (10): 211. doi:10.3390/admsci13100211. ISSN 2076-3387.
  3. ^ Schalken, J. (2004). "Assessing the effects of facilitated workshops in requirements engineering". "8th International Conference on Empirical Assessment in Software Engineering (EASE 2004)" Workshop - 26th International Conference on Software Engineering. Vol. 2004. IEE. pp. 135–143. doi:10.1049/ic:20040406. ISBN 978-0-86341-435-0.
  4. ^ a b Marriott, Brigid R.; Rodriguez, Allison L.; Landes, Sara J.; Lewis, Cara C.; Comtois, Katherine A. (December 2015). "A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops". Implementation Science. 11 (1): 62. doi:10.1186/s13012-016-0429-z. ISSN 1748-5908. PMC 4859972. PMID 27154000.
  5. ^ Faggion Jr, Clovis Mariano (2023-09-22). "The need for clear criteria for the selection of participants in scientific workshops". British Dental Journal. 235 (6): 379–382. doi:10.1038/s41415-023-6325-4. ISSN 0007-0610. PMID 37737403.
  6. ^ Govier, Eloise (December 2020). "Power and all its guises. Environmental determinism and locating 'the crux of the matter'". Archaeological Dialogues. 27 (2): 173–176. doi:10.1017/S1380203820000215. ISSN 1380-2038.
  7. ^ MacDonald, Erin Faith; Gonzalez, Richard; Papalambros, Panos (December 2009). "The construction of preferences for crux and sentinel product attributes". Journal of Engineering Design. 20 (6): 609–626. doi:10.1080/09544820802132428. ISSN 0954-4828.
  8. ^ Rekatsinas, Theodoros; Deshpande, Amol; Parameswaran, Aditya (2019-11-03). "CRUX: Adaptive Querying for Efficient Crowdsourced Data Extraction". Proceedings of the 28th ACM International Conference on Information and Knowledge Management. ACM. pp. 841–850. doi:10.1145/3357384.3357976. ISBN 978-1-4503-6976-3.
  9. ^ Dikert, Kim; Paasivaara, Maria; Lassenius, Casper (September 2016). "Challenges and success factors for large-scale agile transformations: A systematic literature review". Journal of Systems and Software. 119: 87–108. doi:10.1016/j.jss.2016.06.013.
  10. ^ Baer, John (2017-01-01), Karwowski, Maciej; Kaufman, James C. (eds.), "Chapter 14 - Why You are Probably More Creative (and Less Creative) Than You Think", The Creative Self, Explorations in Creativity Research, San Diego: Academic Press, pp. 259–273, doi:10.1016/b978-0-12-809790-8.00014-5, ISBN 978-0-12-809790-8, retrieved 2024-06-27
  11. ^ a b Shaw, D (July 2003). "Evaluating electronic workshops through analysing the 'brainstormed' ideas". Journal of the Operational Research Society. 54 (7): 692–705. doi:10.1057/palgrave.jors.2601568. ISSN 0160-5682.
  12. ^ a b Shaw, D (July 2006). "Journey Making group workshops as a research tool". Journal of the Operational Research Society. 57 (7): 830–841. doi:10.1057/palgrave.jors.2602155. ISSN 0160-5682.
  13. ^ a b Schulte, Jesko; Hallstedt, Sophie I. (2018). "Workshop Method for Early Sustainable Product Development". Proceedings of the DESIGN 2018 15th International Design Conference. Vol. 15. pp. 2751–2762. doi:10.21278/idc.2018.0209. ISBN 978-953-7738-59-4.
  14. ^ a b Gabriel, A.; Camargo, M.; Monticolo, D.; Boly, V.; Bourgault, M. (November 2016). "Improving the idea selection process in creative workshops through contextualisation". Journal of Cleaner Production. 135: 1503–1513. Bibcode:2016JCPro.135.1503G. doi:10.1016/j.jclepro.2016.05.039.
  15. ^ "Risk Assessment Matrix: Overview and Guide". AuditBoard. Retrieved 2024-08-08.