A Basel pharmaceuticals SME hired a strategy consultant in 2024 based on a polished pitch deck and a confident handshake. Six months and CHF 90,000 later, the deliverables amounted to recycled templates with the previous client’s logo still visible on slide 14. The company had no structured evaluation process, no reference checks, and no objective scoring. They are not alone: subjective impressions drive advisor selection at most Swiss firms, and the results are predictable. This article provides a structured framework for objectively evaluating advisors, regardless of industry or project type.

Why Objective Evaluation Matters

Subjective selection criteria lead to suboptimal outcomes:

  • Charismatic presentation masks lack of substance
  • Personal rapport is confused with professional competence
  • Low prices supposedly compensate for lack of quality
  • Famous names automatically suggest higher quality

Objective evaluation means:

Define criteria before soliciting proposals, not after. Weight measurable factors higher than soft impressions. Examine systematically instead of relying on gut feeling. Establish comparability between providers.

A structured evaluation framework creates transparency, traceability, and increases the likelihood of choosing the right partner.

The Seven Evaluation Dimensions

A complete evaluation covers seven dimensions. Each dimension is assessed against specific, measurable criteria.

1. Professional Competence and Expertise

What to examine:

Qualifications:

  • Relevant education, certificates, memberships
  • Industry-specific credentials (e.g., FMH title, ISO certifications)
  • Currency of qualifications (continuing education, recertifications)

Demonstrable Expertise:

  • Portfolio with comparable projects
  • Publications, presentations, professional contributions
  • Degree of specialization (generalist vs. specialist)
  • Technological or methodological depth

Market Experience:

  • How long has the provider been active in the specific niche?
  • What typical challenges does the provider know from experience?
  • Does the provider have industry knowledge (regulatory, cultural, market-specific)?

Examination Methods:

Request documentation (certificates, degrees). Analyse portfolio for relevance and complexity. Ask targeted technical questions in initial conversations. Check whether the provider knows industry trends and developments.

Red Flags:

Missing or outdated certificates. Portfolio with exclusively unspecific or non-comparable projects. Vague answers to concrete technical questions. Generalist approach without recognizable specialization.

2. Methodology and Process Quality

What to examine:

Approach Model:

  • Is there a structured methodology (e.g., PRINCE2, Agile, Design Thinking)?
  • How is the project phased (kickoff, analysis, concept, implementation, completion)?
  • Are milestones and deliverables clearly defined?

Quality Assurance:

  • What internal QA processes exist (reviews, tests, approvals)?
  • Who checks results before delivery?
  • Are there documented standards or checklists?

Project Management:

  • How is progress tracked and communicated?
  • What tools are used (project management software, reporting)?
  • How are changes and scope expansions handled?

Examination Methods:

Have the methodology explained in detail. Request examples of project plans, status reports, or QA checklists. Ask specifically about the last project that went wrong and how processes were adjusted.

Red Flags:

No recognizable methodology (“We do it situationally”). Unclear milestones or deliverables. Missing quality assurance (“The project manager checks it himself”). No documentation of processes or standards.

3. References and Client Satisfaction

What to examine:

Quality of References:

  • Are the reference projects comparable (size, complexity, industry)?
  • How current are the references (not older than 2-3 years)?
  • Are these complete projects or only partial services?

Client Satisfaction:

  • Were project goals achieved?
  • How was the collaboration (communication, reliability)?
  • Were time and cost frameworks adhered to?
  • Would the client engage the provider again?

Long-term Relationships:

  • Does the provider work long-term with clients (indicator of satisfaction)?
  • Are there follow-up projects or maintenance contracts?

Examination Methods (see the reference-checking section below):

Contact at least three reference clients directly (not just written testimonials). Conduct structured interviews. Ask specifically about problems and how they were solved. Check whether references are selectively presented (only the best) or representative.

Red Flags:

No references or only very old ones. References are not comparable to your project. Provider refuses direct reference contacts (“For data protection reasons”). References are exclusively positive without any criticism (unrealistic).

4. Communication and Collaboration

What to examine:

Communication Culture:

  • How quickly and completely are inquiries answered?
  • Is communication clear and understandable (no unnecessary jargon)?
  • Are expectations made explicit or assumed?

Availability:

  • How reachable is the provider (response times, availability)?
  • Are there fixed contact persons or changing contacts?
  • How is communication handled outside regular working hours?

Proactivity:

  • Are potential problems addressed early?
  • Are there regular updates without prompting?
  • Are improvement suggestions proactively contributed?

Examination Methods:

Observe communication already in the selection process. How quickly did the first response come? How completely were your questions answered? Ask reference clients specifically about communication quality. Test availability (e.g., through short-notice inquiry).

Red Flags:

Slow or incomplete answers already in the selection process. Frequently changing contact persons. Vague formulations or evasive answers. Lack of proactivity (reporting only upon request).

5. Cultural Fit

What to examine:

Corporate Culture:

  • Does the provider’s working method fit your culture (e.g., formal vs. agile)?
  • What values does the provider represent (transparency, speed, perfection)?
  • How hierarchical or flat is the organisation?

Working Style:

  • Remote or on-site? Does that match your expectations?
  • How independently does the provider work (much coordination vs. autonomous)?
  • Flexibility with changes (rigid vs. adaptable)?

Team Composition:

  • Who specifically works on the project (not just the salesperson)?
  • How experienced are the actual project members?
  • Does team size fit the task (not over- or understaffed)?

Examination Methods:

Conduct conversations not only with the salesperson but also with project staff. Visit facilities or meet the team personally. Ask about typical workdays or project situations. Observe whether the provider understands your culture and responds to it.

Red Flags:

Strong discrepancy between sales (eloquent) and project team (uncertain). Working style obviously doesn’t fit your organisation. Lack of flexibility (“That’s how we always do it”). No willingness to adapt to your processes.

6. Pricing and Transparency

What to examine:

Price Structure:

  • Is pricing transparent and traceable?
  • Are services itemized in detail or quoted only as a lump sum?
  • How are changes handled (hourly rates, change requests)?

Price-Performance Ratio:

  • Is the price market-standard or significantly above/below?
  • What is included in the price, what not (e.g., travel costs, licenses)?
  • Are there hidden costs?

Payment Terms:

  • How are payment milestones structured?
  • Are there guarantees or warranties?
  • What happens in case of premature project termination?

Examination Methods:

Request detailed cost breakdowns, not lump sum offers. Compare multiple proposals based on the same service description. Ask what happens when scope grows. Check payment terms (e.g., advance payment vs. milestones).

Red Flags:

Opaque pricing (total sum only, no details). Extremely low prices without apparent reason. High advance payments without service delivery. Unclear or one-sided payment terms. Missing information on change requests.

7. Integrity and Responsibility

What to examine:

Honesty:

  • Are risks and limitations also communicated?
  • Are there unrealistic promises (“Ready in 4 weeks”)?
  • Is there transparent information about service limitations?

Error Culture:

  • How does the provider handle their own mistakes?
  • Are errors admitted and corrected or covered up?
  • Are there examples where projects went wrong and what was learned?

Ethics:

  • Is the provider willing to advise against unsuitable projects (even if it costs revenue)?
  • Are there conflicts of interest (e.g., commissions from third-party providers)?
  • Are clients warned about unrealistic expectations?

Examination Methods:

Explicitly ask about risks and challenges of the project. Deliberately state an unrealistic expectation and observe the reaction. Ask about failed projects and what went wrong. Check whether conflicts of interest are disclosed.

Red Flags:

Only positive presentation without any risks. Unrealistic timelines or budgets without reservations. Lack of willingness to discuss errors or problems. Sales-oriented communication without critical reflection. Conflicts of interest are concealed.

Reference Checking: How to Proceed

References are the strongest indicator of actual performance. Yet many companies check references only superficially or rely on written testimonials. Here’s how to conduct systematic reference checking:

Step 1: Request Reference List

Request at least five references, ideally from the last two years. References should be comparable to your project (size, complexity, industry). Have contact details provided (name, function, phone or email).

Step 2: Select References

Choose three to four references you want to contact. Prioritise references that are particularly relevant. Deliberately also select older references (to check consistency).

Step 3: Conduct Structured Interview

Contact reference clients by phone or video call (not just email). Conduct a structured interview with the following questions:

Project Context:

  • What project was conducted?
  • What was the scope and duration?
  • Who was involved (team, stakeholders)?

Collaboration:

  • How was communication (frequency, clarity, proactivity)?
  • How reliable was the provider (commitments, deadlines, quality)?
  • How were problems or changes handled?

Result:

  • Were project goals achieved?
  • Were time and cost frameworks adhered to?
  • How satisfied are you with the result (scale 1-10)?

Critical Questions:

  • Were there difficulties? If yes, which?
  • What would you do differently next time?
  • Is there anything you view critically in hindsight?

Recommendation:

  • Would you engage the provider again?
  • Would you recommend the provider unreservedly?
  • For what types of projects is the provider particularly suitable, and for which less so?

Step 4: Evaluate References

Document answers and evaluate them based on the following criteria:

  • Consistency (do all references say the same thing?)
  • Relevance (are the projects comparable?)
  • Critical capacity (are weaknesses also mentioned?)
  • Recommendation willingness (would be engaged again?)

Warning Signs:

All references are exclusively positive (unrealistic). Reference clients are vague or evasive. References contradict each other. Reference clients are unreachable or don’t respond.
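One way to make the consistency check concrete is to record each structured interview and flag the warning signs automatically. A minimal sketch, assuming hypothetical interview records (the client names, scores, and the 2.0 spread threshold are illustrative, not prescribed by the framework):

```python
from statistics import mean, pstdev

# Hypothetical interview records; "satisfaction" uses the 1-10 scale
# from the structured interview above.
interviews = [
    {"client": "Reference 1", "satisfaction": 9, "would_rehire": True},
    {"client": "Reference 2", "satisfaction": 8, "would_rehire": True},
    {"client": "Reference 3", "satisfaction": 4, "would_rehire": False},
]

scores = [i["satisfaction"] for i in interviews]
spread = pstdev(scores)  # standard deviation as a rough consistency measure

# Flag the warning signs named above: inconsistent feedback, or a reference
# that would not engage the provider again.
flags = []
if spread > 2.0:
    flags.append(f"inconsistent satisfaction (std dev {spread:.1f})")
if not all(i["would_rehire"] for i in interviews):
    flags.append("at least one reference would not rehire")

print(f"mean satisfaction: {mean(scores):.1f}, flags: {flags}")
```

A flagged provider is not automatically disqualified; the flags tell you where to ask follow-up questions before scoring.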

Evaluation Matrix: Systematic Assessment

An evaluation matrix helps you assess providers systematically and comparably. Before the tender, define which criteria you will use and how strongly each is weighted.

Example Evaluation Matrix

| Criterion | Weight | Provider A | Provider B | Provider C |
|---|---|---|---|---|
| Professional Competence | 25% | | | |
| – Qualifications | 10% | 8/10 | 9/10 | 7/10 |
| – Portfolio Relevance | 10% | 9/10 | 7/10 | 8/10 |
| – Industry Experience | 5% | 7/10 | 8/10 | 6/10 |
| Methodology & Processes | 20% | | | |
| – Structured Approach | 10% | 8/10 | 9/10 | 6/10 |
| – Quality Assurance | 10% | 7/10 | 8/10 | 7/10 |
| References | 25% | | | |
| – Reference Quality | 15% | 9/10 | 8/10 | 7/10 |
| – Client Satisfaction | 10% | 8/10 | 9/10 | 8/10 |
| Communication | 10% | | | |
| – Clarity & Transparency | 5% | 8/10 | 7/10 | 9/10 |
| – Response Time | 5% | 9/10 | 8/10 | 7/10 |
| Cultural Fit | 10% | | | |
| – Working Style | 5% | 8/10 | 6/10 | 9/10 |
| – Team Compatibility | 5% | 7/10 | 7/10 | 8/10 |
| Price-Performance | 5% | | | |
| – Transparency | 3% | 8/10 | 9/10 | 7/10 |
| – Market Standard | 2% | 7/10 | 6/10 | 8/10 |
| Integrity | 5% | | | |
| – Honesty | 3% | 9/10 | 8/10 | 7/10 |
| – Error Culture | 2% | 8/10 | 7/10 | 8/10 |
| Total Score | 100% | 8.1 | 8.0 | 7.3 |
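The total score is simply the weight-adjusted sum of the sub-criterion scores. A minimal sketch in Python, using the illustrative weights and Provider A's sub-scores from the example matrix (criterion names are made up for readability):

```python
# Weights per sub-criterion, expressed as fractions summing to 1.0 (100%).
WEIGHTS = {
    "qualifications": 0.10, "portfolio_relevance": 0.10, "industry_experience": 0.05,
    "structured_approach": 0.10, "quality_assurance": 0.10,
    "reference_quality": 0.15, "client_satisfaction": 0.10,
    "clarity_transparency": 0.05, "response_time": 0.05,
    "working_style": 0.05, "team_compatibility": 0.05,
    "price_transparency": 0.03, "market_standard": 0.02,
    "honesty": 0.03, "error_culture": 0.02,
}

def total_score(scores: dict) -> float:
    """Weight-adjusted sum of sub-criterion scores (each on a 1-10 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return round(sum(w * scores[name] for name, w in WEIGHTS.items()), 1)

# Provider A's sub-scores from the example matrix.
provider_a = {
    "qualifications": 8, "portfolio_relevance": 9, "industry_experience": 7,
    "structured_approach": 8, "quality_assurance": 7,
    "reference_quality": 9, "client_satisfaction": 8,
    "clarity_transparency": 8, "response_time": 9,
    "working_style": 8, "team_compatibility": 7,
    "price_transparency": 8, "market_standard": 7,
    "honesty": 9, "error_culture": 8,
}

print(total_score(provider_a))  # → 8.1
```

The assertion on the weights is the important guard: if you adjust one weighting for a project (e.g., more on methodology), the others must be reduced so the total stays at 100%.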

How to Use the Matrix

1. Define Weighting in Advance: Determine which criteria are particularly important for your project before soliciting proposals. Adjust weightings accordingly (e.g., higher weighting on methodology for complex projects).

2. Make Criteria Measurable: Define what a 10/10 means (e.g., “Portfolio contains at least 3 comparable projects”). Establish how you evaluate individual criteria (e.g., through reference conversations, document review).

3. Evaluate Independently: Multiple people should evaluate independently. Compare evaluations and discuss deviations.

4. Document Evaluation: Record why you gave a particular rating. Document strengths and weaknesses of each provider.

5. Make Decision: The highest total score is a strong indicator but not necessarily the final decision. Check for knockout criteria (e.g., missing certificates). Also weigh qualitative factors (e.g., your impression of how the collaboration would work).

Proposal Evaluation: Properly Reviewing Offers

Besides evaluating the provider itself, the concrete offer (proposal) must also be systematically reviewed.

Checklist for Proposal Review

Completeness:

  • Are all required contents included?
  • Was the task fully understood?
  • Are all deliverables listed?

Clarity:

  • Is the proposal understandably formulated?
  • Are services clearly described (not “support with” but “creation of”)?
  • Are assumptions and prerequisites explicitly stated?

Level of Detail:

  • Are work packages described in detail?
  • Is it recognizable who does what when?
  • Is there a concrete timeline with milestones?

Feasibility:

  • Is the timeline realistic?
  • Are sufficient resources planned?
  • Were dependencies and risks considered?

Transparency:

  • Are costs itemized in detail?
  • Is it clear what’s included in the price and what’s not?
  • How are changes handled (change management)?

Differentiation:

  • Does the proposal stand out from others (or is it generic)?
  • Does it show specific understanding of your situation?
  • Are there original solution approaches or only standard procedures?

Red Flags in Proposals

Generic Content: Proposal could apply to any company (no specific adaptation). Copy-paste errors (wrong company name, wrong project description).

Unrealistic Promises: Extremely short timelines without apparent justification. Guarantees that cannot be kept (“100% success”). Exaggerated impact promises (“revenue doubles”).

Missing Details: Vague service descriptions (“support”, “consulting”). No concrete deliverables or milestones. Missing information on responsibilities or team composition.

Opaque Costs: Only a total sum, no breakdown. Unclear billing modalities. Missing information on additional costs (travel, licenses).

Red Flags: Recognizing Warning Signs

Some warning signs can only be recognised through direct contact or over time. Watch for these red flags:

In the Selection Process

  • Excessive sales orientation (pressure, “limited offer”, emotional manipulation)
  • Lack of willingness to answer questions or submit documents
  • Frequent delays or short-notice cancellations already in selection process
  • Inconsistent statements (contradictory information in different conversations)
  • No time for thorough analysis (“We can start immediately”)

With References

  • References are unreachable or don’t respond
  • References are vague or evasive
  • Only very old references (older than 3 years)
  • No willingness to name references

In Communication

  • Slow or incomplete answers to concrete questions
  • Evasive formulations instead of clear answers
  • Lack of proactivity (information only upon request)
  • Frequently changing contact persons

In Pricing

  • Extremely low prices without apparent reason
  • Opaque or unclear cost structure
  • High advance payments before service delivery
  • Hidden costs that appear later

In Methodology

  • No recognizable structure or methodology
  • “We do it situationally” (lack of standardization)
  • No quality assurance or reviews
  • Missing documentation

Properly Evaluating Cultural Fit

Cultural fit is difficult to measure but crucial for project success. Here’s how to examine systematically:

Dimensions of Cultural Fit

Working Style:

  • Structured vs. flexible
  • Formal vs. informal
  • Process-oriented vs. results-oriented
  • Hierarchical vs. flat

Communication Culture:

  • Direct vs. diplomatic
  • Detailed vs. overview-oriented
  • Written vs. verbal
  • Synchronous (meetings) vs. asynchronous (email)

Decision-Making:

  • Consensus-oriented vs. top-down
  • Data-driven vs. intuitive
  • Fast vs. thorough
  • Participative vs. delegative

Error Culture:

  • Open vs. defensive
  • Learning-oriented vs. blame-assigning
  • Transparent vs. concealing

Examination Methods

Observation: How does the provider behave in the selection process? How do they communicate (tone, formality)? How structured are documents and presentations?

Targeted Questions: “How do you make decisions in projects?” “How do you handle errors or problems?” “What does a typical workday look like for you?” “How much coordination do you expect from us?”

Team Meeting: Organise a joint meeting with both teams. Observe interaction, dynamics, and compatibility.

Reference Feedback: Ask reference clients explicitly about cultural fit. “How would you describe the working style?” “How well did the collaboration work culturally?”

When Fit Doesn’t Work

Cultural incompatibility leads to friction, misunderstandings, and frustration. Even if the provider is technically excellent, poor cultural fit can jeopardize project success.

Decision Options:

If fit isn’t perfect but tolerable, define explicit ground rules (e.g., communication frequency, decision processes). If fit fundamentally doesn’t work, choose another provider even if technically weaker.

Let the System Decide

Objective evaluation of advisors and service providers is not chance but the result of a structured process:

1. Define Evaluation Dimensions: Establish before the tender which criteria are relevant and how strongly they are weighted.

2. Use Measurable Criteria: Define how you can objectively evaluate individual criteria (e.g., through reference conversations, portfolio analysis).

3. Systematically Check References: Contact at least three reference clients and conduct structured interviews.

4. Use an Evaluation Matrix: Evaluate all providers based on the same criteria and document your assessments.

5. Review Proposals in Detail: Evaluate not only the provider but also the concrete offer for completeness, clarity, and feasibility.

6. Take Red Flags Seriously: Warning signs already in the selection process are indicators of later problems.

7. Consider Cultural Fit: Even the technically best provider is unsuitable if the collaboration doesn't work culturally.

A structured evaluation framework increases the likelihood of finding the right partner. It creates transparency, comparability, and traceability. And it minimises the risk that subjective impressions or charismatic presentations distort the decision.

Invest time in selection, it pays off in project implementation.


Transparency Note: This article was created with the support of AI technology (Claude, Anthropic) and editorially reviewed. The content is based on established evaluation methods and best practices in advisor and service provider selection.