When perfect planning kills wildlife: Why conservation teams must "do something" despite uncertainty

    Quick answer

    The "Do something" principle is a conservation project management approach that encourages teams to begin addressing threats to biodiversity despite information gaps, rather than delaying action until perfect knowledge is achieved. While teams gather complete information, biodiversity targets continue to degrade.

    This principle helps teams balance evidence-based decision-making with the urgency of conservation by prioritizing work packages with highest confidence, conducting research during implementation, and managing uncertainty through structured change processes. Applied correctly, it prevents analysis paralysis while maintaining scientific rigor.

    The "Do something" principle is one of four core principles in the Project Management for Wildlife Conservation best practice framework — the standard, peer-reviewed methodology used by conservation professionals worldwide to deliver measurable biodiversity impact.

    UNLOCK OUR FULL BEST PRACTICES AND GET CERTIFIED CONSERVATION SKILLS

    Ready to go deeper? Build practical skills for wildlife conservation by exploring our expert-led courses designed to help you apply what you’ve learned in real-world contexts. From career development to technical conservation tools, our training is built to support your next step.

      Why standard project management fails conservation

      Conservation teams face an impossible dilemma: they need solid evidence to design effective interventions, but gathering that evidence takes time. Time during which poaching continues. Habitat degrades. Populations decline. Species edge closer to extinction.

      Conservation projects may take 18-36 months from initial concept to project start. During this planning period, teams conduct baseline assessments, carry out threat analyses, study human behaviour drivers, pilot intervention approaches, and develop comprehensive project plans.

      These activities are valuable. Evidence-based conservation is essential.

      But here's the uncomfortable truth: while you're perfecting your project plan, the biodiversity you're trying to save continues degrading. Every month spent researching optimal anti-poaching patrol routes is another month poachers kill wildlife. Every week spent surveying community attitudes toward habitat destruction is another week that habitat gets destroyed.

      The "Do something" principle addresses this dilemma by rejecting the false choice between "perfect knowledge before action" and "reckless action without knowledge." Instead, it offers a third path: strategic action despite imperfect knowledge, combined with adaptive management as you learn.

      This doesn't mean abandoning evidence-based conservation. It means accepting that conservation happens in complex, uncertain contexts where perfect information is impossible, and that thoughtful action with 70% confidence beats delayed action with 95% confidence when biodiversity is actively degrading.

        The hidden cost of analysis paralysis in conservation

        Analysis paralysis occurs when conservation teams delay starting projects because they're still gathering information, refining strategies, or waiting for research results. This feels responsible — after all, shouldn't conservation be based on solid evidence?

        But analysis paralysis has severe costs:

        Biodiversity degradation continues: The most obvious cost. If local villagers are poisoning lions now, waiting 18 months to fully understand all threats facing lions means 18 more months of lion poisoning. The lions you could have saved are dead. No amount of perfect planning brings them back.

        Opportunity costs accumulate: The staff time, funding, and organizational capacity dedicated to research and planning could have been directed toward reducing active threats. Even partially effective action generates some conservation benefit; pure planning generates none.

        Stakeholder relationships deteriorate: Communities whose cooperation you need see you conducting endless surveys and assessments but delivering no tangible action. Their patience wears thin. Their willingness to collaborate erodes. By the time your perfect plan is ready, the relationships you need to implement it may be damaged.

        Funding windows close: Donors don't wait indefinitely. That grant opportunity requiring action this year disappears. That foundation interested in your work moves to other priorities. Perfect plans are worthless if you have no resources to implement them.

        Team motivation declines: Conservation professionals are driven by impact, not process. Teams stuck in perpetual planning cycles lose energy, talent leaves for organizations actually doing conservation, and the organizational culture shifts from action-oriented to bureaucratic.

        The irony is that analysis paralysis feels like responsible project management. It looks like due diligence, scientific rigor, and careful stewardship of conservation resources. But when biodiversity is actively degrading, perfectionism is a form of negligence.

          What "do something" actually means (and what it doesn't)

          The "Do something" principle is frequently misunderstood. Let's be clear about what it does and doesn't mean:

          It DOES mean:

          - Starting work to address threats with the best available information, accepting some uncertainty
          - Prioritizing work packages you're most confident will achieve impact, even if other work packages remain uncertain
          - Beginning action on high-confidence interventions while conducting research to inform lower-confidence ones
          - Using structured change management processes to adapt your approach as you learn
          - Making explicit decisions about acceptable levels of uncertainty before starting work

          It DOES NOT mean:

          - Random action without any evidence or planning
          - Ignoring available information because "we'll figure it out as we go"
          - Reckless experimentation that might harm biodiversity or communities
          - Refusing to conduct research or monitoring
          - Dismissing evidence-based conservation principles

          The principle requires judgment. Every project team must decide what level of uncertainty they're willing to accept before starting. That decision should be influenced by:

          Urgency of threat: How quickly is biodiversity degrading? Acute threats justify higher uncertainty tolerance. Slow-moving threats allow more planning time.

          Reversibility of action: Can your intervention be stopped or reversed if it proves ineffective or harmful? High reversibility justifies higher uncertainty tolerance.

          Severity of potential harm: What's the worst-case outcome if your intervention fails? Low harm risk justifies higher uncertainty tolerance.

          Opportunity costs: What conservation benefit could be achieved during the time spent gathering perfect information? High opportunity costs justify higher uncertainty tolerance.

          Information gathering feasibility: How long will it take to reduce uncertainty to acceptable levels? If perfect information requires 3 years and your intervention could start in 3 months, the trade-off may not be worth it.

          Example: A project team working to save Sumatran tigers discovers that local villagers are poisoning tigers in retaliation for livestock attacks. The team has high confidence (verified through interviews) that poisoning occurs, but lower confidence about whether poisoning is the primary threat or whether other threats (habitat loss, prey depletion) are equally significant.

          Applying "Do something": Start addressing the poisoning immediately (community engagement, livestock protection, rapid response programs) because you have high confidence it's a real threat causing active tiger deaths. Simultaneously conduct research on other potential threats to inform work packages scheduled for later project phases. Don't delay all action until you've assessed every possible threat with perfect confidence.

            3 strategies for applying "do something" while maintaining rigor

            Strategy 1: Prioritize work packages by confidence level

            Rather than waiting until all work packages are equally evidence-based, sequence them by your confidence that they'll achieve planned impact:

            Immediate start (high confidence): Work packages where strong evidence shows the intervention will reduce threats. This might be based on verified studies from similar contexts, clear cause-effect relationships with high-quality data, or pilot projects proving effectiveness. Start these immediately.

            Phased start (medium confidence): Work packages where reasonable evidence suggests effectiveness but significant uncertainties remain. These might be based on less specific studies, expert opinion, or logical inference with some data gaps. Start these in later phases, giving time to reduce uncertainty through research.

            Contingent start (low confidence): Work packages where evidence is weak or contradictory. These might be based on assumptions, limited information, or untested approaches. Hold these as contingencies, only implementing if research confirms their necessity and likely effectiveness.

            This approach ensures you're always taking the most defensible action available while continuously improving the evidence base for future work. You're "doing something" (the high-confidence work) while responsibly managing uncertainty (researching medium and low-confidence work).

            Example: A marine conservation project plans to reduce dynamite fishing. High confidence work: alternative fishing gear provision (verified effective in similar contexts) — start immediately. Medium confidence work: alternative livelihood development (unclear if alternative livelihoods alone change behaviour) — start in phase 2 after baseline economic assessment. Low confidence work: awareness campaigns (limited evidence these change behaviour for economically motivated threats) — hold as contingency, only implement if research shows attitudes are indeed a primary driver.
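            The confidence-to-phase mapping above can be captured in a simple planning aid. This is a minimal illustrative sketch, not part of any official framework: the `sequence_work_packages` function, the package names, and the confidence labels are hypothetical examples mirroring the marine project described here.

```python
# Illustrative sketch: grouping work packages into start categories
# by confidence rating. Names and ratings are hypothetical.
CONFIDENCE_TO_PHASE = {
    "high": "immediate start",
    "medium": "phased start",
    "low": "contingent start",
}

def sequence_work_packages(work_packages):
    """Group (name, confidence) pairs into start categories."""
    plan = {phase: [] for phase in CONFIDENCE_TO_PHASE.values()}
    for name, confidence in work_packages:
        plan[CONFIDENCE_TO_PHASE[confidence]].append(name)
    return plan

packages = [
    ("alternative fishing gear provision", "high"),
    ("alternative livelihood development", "medium"),
    ("awareness campaigns", "low"),
]
plan = sequence_work_packages(packages)
```

            Even as a spreadsheet rather than code, the point is the same: make the confidence rating explicit for every work package, then let the rating — not team enthusiasm — determine the start category.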

            Strategy 2: Embed research into implementation

            Rather than completing all research before starting any action, integrate research activities into your implementation schedule. This creates an action-learning cycle where implementation generates data that informs subsequent work.

            Implementation generates information:
            Carrying out work packages reveals information you couldn't have discovered through pre-project research. How do communities actually respond to your intervention? What unexpected challenges emerge? Which behaviour change mechanisms work in practice versus theory? Implementation is itself a form of research.

            Monitoring fills information gaps:
            Your monitoring and evaluation activities should be explicitly designed to answer the specific uncertainties identified during planning. If you're uncertain whether alternative livelihoods will reduce poaching, your monitoring framework should track both livelihood adoption AND poaching behaviour changes, testing your theory of change in real-time.

            Adaptive management responds to learning:
            As implementation and monitoring generate new information, your structured change management processes (risks, issues, opportunities, lessons learned) enable adapting your approach. This isn't "making it up as you go." It's systematic adaptation based on emerging evidence.

            This approach transforms uncertainty from a barrier to action into a reason for structured learning. You acknowledge information gaps explicitly, plan to fill them through implementation and monitoring, and create processes to adapt based on what you learn.

            Example: A forest conservation project believes (medium confidence) that illegal logging is driven by farmers needing income during agricultural off-season. Rather than conducting a 12-month economic study before starting, the project: (1) begins a pilot alternative livelihood programme for 20 farmers, (2) monitors both livelihood participation AND illegal logging behaviour changes, (3) conducts simple economic surveys with participants to understand what actually drives their logging decisions, and (4) adapts the programme design based on findings before scaling up.

            Strategy 3: Use structured change management for uncertainty

            Rather than eliminating uncertainty before starting (impossible), manage it actively during implementation using the managing change control process.

            Document information gaps as risks: When you identify significant uncertainties during planning, document them as project risks with clear ratings based on probability and impact. This makes uncertainty explicit rather than hidden, enables monitoring whether the risk materializes, and triggers appropriate responses if it does.

            Track emerging issues: When implementation reveals your assumptions were wrong, document this as an issue (something that has happened) rather than a risk (something that might happen). Issues trigger immediate response planning to adapt your approach.

            Capture lessons learned: As you learn what works and what doesn't through implementation, document lessons learned that can inform future work package design, project decisions, or subsequent projects. This transforms implementation experience into institutional knowledge.

            Identify opportunities: Sometimes uncertainty resolves favorably — a risk doesn't materialize, a partner offers unexpected support, or an intervention proves more effective than expected. Document these as opportunities and adjust your project plan to exploit them.

            This approach acknowledges that conservation happens in uncertain, changing contexts. Rather than waiting for certainty that will never come, you create robust processes to detect when your assumptions prove wrong and adapt accordingly.

            Example: A wetland restoration project has low confidence whether restored wetlands will attract the target migratory bird species (risk: "restored habitat remains unused by target species, probability: medium, impact: high"). The project proceeds with restoration (better restored unused habitat than continued degradation) but implements intensive bird monitoring to detect if the risk materializes. Six months in, monitoring shows birds are using the habitat but less than expected (risk becomes issue: "lower than expected bird usage"). This triggers an adaptation plan investigating why usage is low (perhaps adjacent threats remain) and adjusting work packages accordingly.
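            The wetland example's risk-becomes-issue transition can be sketched as a minimal risk register entry. This is an illustrative sketch only: the `Risk` class, its field names, and the rating values are assumptions for demonstration, not a schema prescribed by the framework.

```python
# Illustrative sketch of one risk register entry and its promotion
# to an issue once monitoring shows the risk has materialized.
from dataclasses import dataclass, field

@dataclass
class Risk:
    description: str
    probability: str  # e.g. "low", "medium", "high"
    impact: str
    status: str = "open risk"
    log: list = field(default_factory=list)

    def promote_to_issue(self, evidence: str) -> None:
        """Record monitoring evidence and reclassify the risk as an issue."""
        self.status = "issue"
        self.log.append(evidence)

risk = Risk(
    description="restored habitat remains unused by target species",
    probability="medium",
    impact="high",
)

# Six months in, monitoring detects lower-than-expected bird usage:
risk.promote_to_issue("monitoring: bird usage below expected levels")
```

            The design point is that uncertainty is written down with a rating the moment it's identified, so there is a defined trigger (the monitoring evidence) and a defined response (adaptation planning) when reality diverges from assumptions.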

              How "do something" fits into the broader framework

              The "Do something" principle works in concert with the other three project management principles:

              Focus on impact ensures that "doing something" means addressing threats, not just staying busy with activities. Teams that apply "do something" without "focus on impact" generate lots of action but no measurable conservation results. The two principles together create purposeful urgency. Start work that will measurably reduce threats and improve biodiversity status, don't wait for perfect planning.

              Take responsibility clarifies who decides when to start work despite uncertainty and who's accountable for managing that uncertainty during implementation. Without clear decision-making authority, "do something" debates become endless. Every team member has a different opinion about acceptable uncertainty levels. Designated decision-makers can cut through debate and authorize action.

              Embrace change provides the mechanism for adapting when "do something" reveals your assumptions were wrong. Starting work despite uncertainty requires robust change management processes. If you're taking action based on imperfect information, you need structured ways to detect when that information proves incorrect and adapt accordingly.

              Together, these four principles create a management approach that is simultaneously:

              - Urgent (do something)
              - Purposeful (focus on impact)
              - Clear (take responsibility)
              - Adaptive (embrace change)

              This enables teams to deliver measurable conservation impact in complex, uncertain, rapidly changing contexts — which describes every real-world conservation project.

                Getting started: applying "do something" in your next project

                For your next conservation project, try this minimum viable approach:

                1. Complete enough planning to start confidently: Conduct threat assessment to identify major threats. Trace back to understand key behaviour drivers. Develop your current situation and planned change diagrams. This provides sufficient evidence to start work. It doesn't provide perfect knowledge of every threat, behaviour, influence, and intervention option. That's acceptable.

                2. Rate your confidence explicitly: For each planned work package, assign a confidence rating (very high, high, medium, low) showing how confident you are it will achieve the planned impact. This forces honest assessment of where you're on solid ground versus making educated guesses.

                3. Sequence work by confidence: Start very high and high confidence work packages immediately. Schedule medium confidence packages for later phases, giving time to conduct supporting research. Hold low confidence packages as contingencies, only implementing if monitoring confirms they're necessary.

                4. Document information gaps as risks: For each medium or low confidence work package, document the specific uncertainty as a risk: "Work package X may fail to achieve planned result Y because [information gap Z]." This makes uncertainty explicit and trackable.

                5. Embed research in your work plan: Schedule research activities (surveys, monitoring, pilot tests) to run alongside implementation, explicitly targeting the information gaps you identified. These activities should be designed to test your assumptions and reduce uncertainty for later work packages.

                6. Create honest monitoring indicators: Don't just monitor activity completion (did we deliver training?) or basic outputs (how many people attended?). Monitor actual results (did knowledge increase? did behaviour change? did threats reduce?). Honest monitoring reveals when your approach isn't working, enabling adaptation.

                7. Use change processes religiously: Hold regular status meetings to review risks, issues, opportunities, and lessons learned. Don't let these become tick-box exercises. Actually use the information to adapt your project plan when evidence shows your assumptions were wrong.

                8. Communicate uncertainty to stakeholders: Be transparent with funders, partners, and team members about what you know with high confidence and what you're uncertain about. Explain your approach for managing uncertainty. This builds trust and manages expectations far better than pretending certainty you don't have.

                This approach enables starting meaningful conservation work within 3-6 months rather than 18-36 months, while maintaining scientific rigor and stakeholder trust. You're not abandoning evidence-based conservation — you're applying it appropriately in uncertain contexts.

                  Advanced application: when NOT to "do something"

                  The "Do something" principle has limits. Sometimes delay is appropriate:

                  When potential harm is severe and irreversible: If your intervention could cause significant harm to biodiversity or human communities, and that harm cannot be easily reversed, then higher confidence is required before starting. For example, introducing a new species to control a pest requires extensive research because introductions cannot be undone.

                  When existing action is already effective: If other organizations or programmes are already effectively addressing the key threats, adding your efforts may be redundant or counterproductive. Better to support existing work or focus on unaddressed threats than duplicate efforts.

                  When time is on your side: Some conservation contexts are relatively stable; biodiversity isn't in acute danger, threats are slow-moving or seasonal, stakeholder relationships are strong. In these rare situations, taking additional time to gather information may be appropriate. But be honest about whether time is truly on your side.

                  When capacity doesn't exist: If your organization lacks the skills, resources, or permissions to implement even high-confidence work packages, starting prematurely creates failure and damages credibility. In this case, delay action while building capacity through partnerships, training, or fundraising.

                  When the intervention is experimental: Some conservation approaches are genuinely novel with no evidence base from similar contexts. Deploying these at scale without piloting is inappropriate. But note: pilot-scale deployment IS "doing something"; the principle would encourage small-scale trials rather than full-scale rollout, not waiting indefinitely before any trial.

                  Applying the principle wisely means recognizing these boundary conditions while avoiding using them as excuses for unnecessary delay.

                    FAQ

                    How much information is "enough" to start a project? 

                    Enough information to confidently identify at least one major threat, understand the behaviours driving it, and have reasonable confidence (supported by evidence) that your planned work packages will address those behaviours. You don't need perfect understanding of all threats, all behaviours, all influences, and all possible work packages. Start with what you can defend, learn through implementation.

                    What if our organization requires 18-month baseline assessments before starting?

                    Challenge that requirement. Ask: What is the conservation cost of the 18-month delay? Could we implement high-confidence work while conducting baseline assessment? Could we phase the baseline assessment, starting with critical data and gathering supplementary data during implementation? Organizations should design policies that serve conservation outcomes, not create bureaucratic barriers.

                    How do we convince donors to fund projects with acknowledged uncertainty?

                    By demonstrating you understand and manage uncertainty professionally. Show evidence for your high-confidence work packages. Explain your phased approach for medium-confidence ones. Document your risk management and adaptive management processes. Donors respect teams that honestly acknowledge uncertainty and have robust processes to manage it far more than teams that pretend false certainty.

                    Won't starting too early waste resources if we have to change direction? 

                    Resources are also wasted by delaying too long. The question is: Which waste is larger — the resources spent adapting when you learn versus the conservation opportunity lost during delay? Usually, adaptation costs are far smaller than delay costs. Plus, resources spent on implementation (even if later adapted) generate some conservation benefit and learning. Resources spent only on planning generate neither.

                    How do we balance "do something" with "do no harm"?

                    Apply both principles together. Before starting work packages, assess potential negative effects on other biodiversity or human communities (do no harm). If negative effects are unacceptably high even with mitigation, don't implement that work package (appropriate application of caution). If negative effects are absent, low, or mitigable, proceed despite information gaps (do something). The principles aren't contradictory — they're complementary filters for appropriate action.

                    What if team members disagree about acceptable uncertainty levels? 

                    This is why "take responsibility" matters. Designate a decision-maker (usually the project manager) who has authority to decide when confidence is sufficient to start. Gather input from the team, but don't let it become endless debate. Once the decision is made, everyone moves forward together. Teams that argue endlessly about "enough information" never start.

                    How do we know if we've delayed too long? 

                    Warning signs: Your planning phase exceeds 12 months. Stakeholders ask repeatedly when you'll start. Team morale drops because people joined to do conservation, not planning. Biodiversity indicators show continued degradation. Donors lose interest. Other organizations start working on the same threats. Funding opportunities pass. If any of these apply, you've likely delayed too long.

                    Can "do something" apply to research projects? 

                    Yes, but differently. For research projects, "doing something" means starting data collection despite incomplete research protocols, using preliminary findings to inform final methodology. It doesn't mean abandoning research rigor. Even research can suffer from analysis paralysis — endlessly perfecting the study design while the conservation questions remain unanswered.

                    How does this principle apply to long-term conservation strategies?

                    Long-term strategies still need to start. The principle encourages beginning the first phase of long-term work rather than planning all phases completely before starting anything. Implement phase 1 work packages, learn, adapt phases 2-3 based on what phase 1 reveals. This creates adaptive long-term strategies rather than rigid multi-decade plans that ignore inevitable uncertainty.

                    What's the relationship between "do something" and adaptive management?

                    They're two sides of the same coin. "Do something" gets you started despite uncertainty. Adaptive management provides the processes to learn and adapt once you've started. You need both: starting without adaptive management leads to rigid implementation that fails when assumptions prove wrong. Having adaptive management processes without starting means those processes never get used. Together, they create action-learning cycles that drive conservation success.

                      Related articles

                      • What is conservation project planning?