Research that provides direction

For us, research is not a tool, but a mindset. It reveals where needs lie and where friction arises, and it makes impact measurable. Not as an end in itself, but as a foundation for good decisions.

Why Research

Because assumptions aren't enough

Good products start with the right questions. What motivates people to use or avoid a product? What makes an experience truly relevant? Research helps to find answers. So that decisions are based on evidence instead of assumptions.

Our research mindset

Understanding means questioning your own perspectives

We observe before we interpret. We test before we evaluate. We learn before we design. This is how we approach every task: with the method that fits the goal, not the one that is convenient, new, or loud.

Empowering teams

Research as a shared foundation

Good research creates a common orientation. It reveals user reality before design and development begin.

At UseTree, research is developed collaboratively, coordinated with stakeholders, and documented in a way that is understandable to all teams. These efforts result in a shared knowledge base that accelerates decision making and reduces risks.

What research can achieve

Understanding people

Identify needs, sharpen personas and derive requirements from them

Differentiate brands

Analyze target groups, strengthen brand profile and thus ensure relevance

Reduce costs

Avoid undesirable developments through evidence-based decisions

Increase satisfaction

Create experiences that meet expectations and build trust

Guide decisions

Validate ideas, set priorities, and invest where it counts

Reduce risks

Identify user problems early, before they become expensive

"I believe in innovation and that the way you get innovation is you fund research and you learn the basic facts."

Bill Gates

Our process: Research across the entire product life cycle

Research is not a single phase. It accompanies digital products from the initial idea to ongoing operation and beyond.

Depending on the phase your product is in, research takes on different tasks. Here's how we apply research – targeted, evidence-based, and impact-focused:

Create understanding

It all starts with understanding. We explore usage contexts, needs, and expectations before developing solutions.

Goal: A well-founded basis for strategic and conceptual decisions.

Requirements Catalog

Description:

A product can only be successful if it is clear what it is intended to achieve. In the Requirements Catalog, we document which functions, content, and conditions are relevant for use—from the perspective of both the user and the system. These requirements are systematically gathered, prioritized, and evaluated based on their current status of fulfillment. This creates a reliable foundation upon which the information architecture, functional scope, and subsequent development steps can be purposefully built.

Your input:

  • Insight into user groups and their typical tasks
  • Relevant existing insights such as personas or journey maps

Our result:

  • Structured and prioritized catalog of requirements
  • Visualization of gaps and degree of fulfillment
  • Basis for downstream information architecture or design decisions
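
As an illustration of how such a catalog can be kept structured and queryable, here is a minimal sketch in Python. The fields, priority scale, and example entries are hypothetical; they only show the idea of recording each requirement with its source, priority, and current degree of fulfillment.

  from dataclasses import dataclass
  from enum import Enum

  class Fulfillment(Enum):
      NOT_MET = "not met"
      PARTIAL = "partially met"
      MET = "met"

  @dataclass
  class Requirement:
      identifier: str
      description: str
      source: str          # e.g. "user interview" or "stakeholder workshop"
      priority: int        # 1 = must have ... 3 = nice to have
      fulfillment: Fulfillment

  # Hypothetical entries; a real catalog is compiled from research findings.
  catalog = [
      Requirement("REQ-01", "Users can filter results by date",
                  "user interview", 1, Fulfillment.PARTIAL),
      Requirement("REQ-02", "Reports can be exported as PDF",
                  "stakeholder workshop", 2, Fulfillment.NOT_MET),
  ]

  # Gap view: high-priority requirements that are not yet fully met.
  gaps = [r.identifier for r in catalog
          if r.priority == 1 and r.fulfillment is not Fulfillment.MET]
  print(gaps)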

User Journey Mapping

Description:

A good product is created when we make the reality of use visible. User journey mapping shows how people navigate through a digital offering, where they progress, and where they drop out. We view each point of contact not as a step in a process, but as a moment with an impact. This makes it clear what expectations exist and where potential lies. The journey becomes the basis for decisions on setting priorities and using resources effectively.

Your input:

  • Insight into product context and relevant touchpoints
  • Access to target group or user contacts

Our result:

  • Visualized journey map with pain points and opportunities
  • Conducting and evaluating user interviews
  • Recommendations for product optimization

Netnography

Description:

People speak openly when they are among themselves. Netnography uses exactly this dynamic: We analyze contributions and discussions in digital spaces to make real user behavior visible. We are not interested in what is being said about products, but in why people speak and act that way. This shows needs, emotions and expectations before they can be measured in figures. The insights gained create a realistic understanding of target groups — unfiltered and close to the world around them.

Your input:

  • Existing insights into user groups and their context
  • Topic focus or product context
  • Relevant platforms or communities

Our result:

  • Evaluation of qualitative user data from digital sources
  • Deriving key needs, emotions and motivations
  • Product and communication strategy recommendations

Competitive Analysis

Description:

Those who design products should understand the expectations shaped by the market. In our Competitive Analysis, we examine relevant competitor products from a user perspective, analyzing the solutions they offer, where they excel, and where gaps become apparent. This perspective is not for the sake of imitation, but for clarity: identifying which expectations are already established, which opportunities remain untapped, and where your product can intentionally take a different approach to create genuine added value.

Your input:

  • Selection of relevant competitor products or comparable offerings
  • Insight into target groups or market segments

Our result:

  • Evaluation of competitor offerings from a usage perspective
  • Visible gaps and differentiation potential
  • Derivation of strategic starting points for positioning and product development

Deriving hypotheses

Findings become assumptions that can be tested.
We structure observations, identify patterns and formulate hypotheses.

Goal: Clarity about what is relevant and needs to be tested.

Kano Ranking

Description:

Products become relevant when they meet real needs. In our needs analysis, we examine which features are truly meaningful to users and which only offer assumed value. To do this, we systematically capture expectations, mandatory criteria, and potential delighters, revealing how significantly they influence perceived quality. This provides clarity on which functions take priority—ensuring that development creates impact rather than just volume.

Your input:

  • Overview of existing and planned features

Our result:

  • Structured evaluation of needs and functional performance
  • Prioritization according to potential benefit and impact
  • Recommendations for focused product development
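
To make the ranking step more tangible, here is a minimal sketch in Python of how answer pairs from a standard Kano questionnaire (functional and dysfunctional answers coded 1 = like to 5 = dislike) can be mapped to Kano categories and aggregated per feature. The response data is invented for illustration, not taken from a real study.

  from collections import Counter

  # Standard Kano evaluation table: (functional, dysfunctional) -> category.
  KANO_TABLE = {
      (1, 1): "Questionable", (1, 2): "Attractive",  (1, 3): "Attractive",
      (1, 4): "Attractive",   (1, 5): "Performance",
      (2, 1): "Reverse",      (2, 2): "Indifferent", (2, 3): "Indifferent",
      (2, 4): "Indifferent",  (2, 5): "Must-be",
      (3, 1): "Reverse",      (3, 2): "Indifferent", (3, 3): "Indifferent",
      (3, 4): "Indifferent",  (3, 5): "Must-be",
      (4, 1): "Reverse",      (4, 2): "Indifferent", (4, 3): "Indifferent",
      (4, 4): "Indifferent",  (4, 5): "Must-be",
      (5, 1): "Reverse",      (5, 2): "Reverse",     (5, 3): "Reverse",
      (5, 4): "Reverse",      (5, 5): "Questionable",
  }

  def classify_feature(answer_pairs):
      """Return the dominant Kano category for one feature across all respondents."""
      counts = Counter(KANO_TABLE[pair] for pair in answer_pairs)
      return counts.most_common(1)[0][0], counts

  # Hypothetical answers for one feature: (functional, dysfunctional) per respondent.
  responses = [(1, 4), (1, 5), (2, 5), (1, 4), (1, 3)]
  category, distribution = classify_feature(responses)
  print(category, dict(distribution))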

Online Research Communities

Description:

Online research communities enable long-term exchange with real and prospective users. On a digital platform, selected people regularly provide insights, respond to prompts, and develop their views further in dialogue. This continuous exchange makes clear how needs are changing, which topics are gaining importance, and where concrete starting points for product development emerge.

Your input:

  • Clear thematic focus or research interest
  • Access or recruitment of suitable participants

Our result:

  • Ongoing collection of user-driven perspectives and feedback
  • Elaborated patterns and relevant topic clusters
  • Derived insights that gain substance over the course of the exchange

Customer Experience Workshop

Description:

Good decisions are made when everyone shares the same vision of the user reality. In the Customer Experience Workshop, we visualize how users interact with the product, the expectations they bring, and where friction occurs. Together with stakeholders, we structure experiences, pain points, and opportunities across key touchpoints. This creates a shared understanding of where the product must make an impact—and the requirements that follow.

Your input:

  • Knowledge about relevant user groups
  • Participation of key stakeholders in the workshop

Our result:

  • Joint customer experience map with touchpoints, pain points and opportunities
  • Derived requirements and focus areas for product development or service design
  • Clarity about relevant user experiences as a basis for decision-making

Requirements Catalog

Description:

A product can only be successful if it is clear what it is intended to achieve. In the Requirements Catalog, we document which functions, content, and conditions are relevant for use—from the perspective of both the user and the system. These requirements are systematically gathered, prioritized, and evaluated based on their current status of fulfillment. This creates a reliable foundation upon which the information architecture, functional scope, and subsequent development steps can be purposefully built.

Your input:

  • Insight into user groups and their typical tasks
  • Relevant existing insights such as personas or journey maps

Our result:

  • Structured and prioritized catalog of requirements
  • Visualization of gaps and degree of fulfillment
  • Basis for downstream information architecture or design decisions

Check quality

Concepts and prototypes are reviewed before they go live.
Research reduces risks before they become expensive.

Goal: Ensuring quality and avoiding undesirable developments.

Usability/UX Testing

Description:

Good user experiences are not created at a desk, but in direct contact with real users. In the UX test, we observe how people handle the product, where they act intuitively, and where uncertainty occurs. Through tasks, free exploration, and targeted follow-up questions, we make clear how comprehensible the product is and where support is needed. In this way, usability risks are identified early, before they become expensive.

Your input:

  • Access to potential users and, if necessary, incentives
  • Joint coordination of test priorities
  • Provision of the product or prototype to be tested

Our result:

  • Weighted list of key usability weaknesses
  • Specific recommendations to improve the user experience
  • Clear focus on areas with the highest risk or biggest impact

Unmoderated usability/UX test

Description:

When many perspectives count, the unmoderated UX test offers a quick and efficient way to capture user experiences in the field. Users work through defined tasks independently and give their impressions via questionnaires and open answers. Meanwhile, their interactions are automatically recorded, enabling evaluations such as click-path analysis or drop-off behavior. This creates a broad picture of how comprehensible the product is: without moderation, but with real depth of insight.

Your input:

  • Access to potential users and, if necessary, incentives
  • Coordination of test priorities and task definition
  • Provision of the prototype or product to be tested

Our result:

  • Weighted list of key usability weaknesses
  • Derivation of recurring usage patterns and risks
  • Specific recommendations for optimizing user navigation
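
As a small illustration of the kind of evaluation such tool data allows, here is a minimal sketch in Python that derives a completion rate and the most frequent abort points from logged task steps. The session data and step names are invented placeholders; real testing tools record far richer interaction data.

  from collections import Counter

  # Hypothetical session logs: the ordered task steps each participant reached.
  sessions = [
      ["start", "search", "product", "checkout", "done"],
      ["start", "search", "product"],                      # aborted at "product"
      ["start", "search", "product", "checkout", "done"],
      ["start", "search"],                                 # aborted at "search"
  ]

  completed = sum(1 for steps in sessions if steps[-1] == "done")
  completion_rate = completed / len(sessions)

  # For sessions that did not finish, count the last step reached (abort point).
  abort_points = Counter(steps[-1] for steps in sessions if steps[-1] != "done")

  print(f"completion rate: {completion_rate:.0%}")
  print("abort points:", abort_points.most_common())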

Benchmark Analysis

Description:

A benchmark analysis shows how your own website is perceived in comparison to other offerings. We examine competitor sites from the user's perspective and evaluate which solutions provide orientation, build trust, or clarify functionality. This is not about imitation, but about insight: where does your own solution need action, and which best practices have demonstrably proven themselves? The analysis provides clarity on where optimization is worthwhile and which approaches hold real substance.

Your input:

  • Insight into the market environment and relevant objectives
  • Joint clarification of comparative criteria

Our result:

  • Evaluated comparison of your own solution and standard market approaches
  • Visible need for action in user experience
  • Inspiration from working solutions from other providers

Expert Review

Description:

An expert review shows how well a product works from the perspective of experienced UX professionals. We check whether user guidance, language, and interface logic provide orientation or create confusion. In doing so, we are guided by established usability and design principles without applying them dogmatically. Instead of working through checklists, we look at what the user experience feels like and where friction arises. The result: clear assessments, quick wins, and targeted recommendations that significantly increase product quality.

Your input:

  • Joint coordination of key usage scenarios
  • Access to the product or prototype
  • Functional descriptions or documentation, if available

Our result:

  • Analysis of key strengths and weaknesses in the user experience
  • Identification of quick wins and structural improvement potential
  • Specific recommendations to increase clarity, consistency, and usability

Cognitive Walkthrough

Description:

A system is intuitive when people take action without guidance. In a Cognitive Walkthrough, we test exactly that: we adopt the perspective of inexperienced users and follow their potential paths through the interface. Step by step, we evaluate whether actions are understandable, labels provide orientation, and feedback guides the user effectively. This reveals where mental models diverge—and where user guidance can become more precise, clear, and easy to learn.

Your input:

  • Central tasks or typical usage scenarios
  • Access to the system or prototype

Our result:

  • Documented evaluation of interaction steps
  • Identified friction points in the user experience
  • Recommendations to improve comprehensibility and learnability

Accessibility consulting

Description:

Accessible products are created when design, language, and technology are conceived inclusively from the start. In our accessibility consulting, we support the development of digital offerings to ensure they are barrier-free—for all target groups and in every usage situation. We examine design, content, and code for potential obstacles and demonstrate what inclusive design looks like in practice. The goal is an experience that excludes no one and remains sustainable over time.

Your input:

  • Insight into product status and existing design or code base
  • Joint clarification of target groups and accessibility requirements

Our result:

  • Identified barriers in design, interaction, and language
  • Specific recommendations to improve accessibility
  • Basis for inclusive development in design and implementation
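
One concrete, automatable example of the obstacles we look for is insufficient text contrast. The sketch below computes the contrast ratio defined by WCAG 2.x for a hypothetical color pair; it illustrates a single check, not the full scope of the consulting.

  def linearize(channel):
      """Convert one 8-bit sRGB channel to its linear value as defined by WCAG 2.x."""
      c = channel / 255
      return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

  def relative_luminance(rgb):
      r, g, b = (linearize(c) for c in rgb)
      return 0.2126 * r + 0.7152 * g + 0.0722 * b

  def contrast_ratio(foreground, background):
      l1, l2 = relative_luminance(foreground), relative_luminance(background)
      lighter, darker = max(l1, l2), min(l1, l2)
      return (lighter + 0.05) / (darker + 0.05)

  # Example: mid-grey text (#777777) on white narrowly misses the 4.5:1 AA threshold.
  ratio = contrast_ratio((119, 119, 119), (255, 255, 255))
  print(f"{ratio:.2f}:1 ->", "passes" if ratio >= 4.5 else "fails", "WCAG AA for normal text")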

Measuring impact

What really happens becomes apparent during operation.
We combine qualitative findings with quantitative data.

Goal: Understand real user behavior and improve it in a targeted manner.

UX Curve

Description:

The UX Curve shows how the perception of a product changes over time. We look at usage from a retrospective perspective and identify moments when the product was successful or caused irritation. This does not result in a static judgment, but rather a progression that shows how trust, frustration, or satisfaction develop over prolonged use. The method highlights which aspects are successful in the long term and where recurring points of failure lie.

Your input:

  • Access to people with experience using the product
  • Joint coordination of the observation period

Our result:

  • Making user experience visible over time
  • Identified high and low points from a user's perspective
  • Conclusions for sustainable product improvement

Potential Workshops or Interviews

Description:

In potential workshops or interviews, we consciously look ahead and talk to people who use or will use the product. This is not about current problems, but about opportunities: what support would be desirable, which new functions could offer real added value, and which expectations remain unspoken? The method makes the future potential of the product visible, seen from a real usage perspective.

Your input:

  • Clarifying the thematic or future framework
  • Insight into typical usage scenarios or requirements

Our result:

  • Collection of future-oriented needs and ideas from a user's perspective
  • Identified areas of potential for further development
  • Orientation as to which opportunities can be pursued strategically

Design Thinking Workshop

Description:

A Design Thinking Workshop brings together different perspectives in order to develop solutions close to real needs rather than in the abstract. Together, we understand the problem from the user's point of view, make potential visible, and translate ideas into initial tangible approaches. The focus is not on creativity as an end in itself, but on clarity: which solution idea has substance, has an impact, and is worth pursuing.

Your input:

  • Common understanding of challenge or subject area
  • Involving relevant project participants

Our result:

  • Shared view of problem space and user perspective
  • First solutions with a discernible focus on benefits
  • Basis for deciding which ideas should be further developed

Retrospective Analysis

Description:

Good processes are created when experience is reflected upon. In the Retrospective Analysis, we look back at the course of the project together and make visible what worked and where friction arose. The goal is not evaluation, but learning: which decisions helped or slowed things down, and what is needed so that future projects can run more clearly, with more focus and greater effect. In this way, experience becomes creative potential.

Your input:

  • Willingness to reflect openly
  • Involving relevant project participants

Our result:

  • Common view of effective and hindering factors in the course of the project
  • Specific conclusions for future collaboration and processes
  • Documented learnings as a basis for continuous improvement

Securing learning

Needs are changing. Products too.
We help organizations to anchor research sustainably.

Goal: Learning teams and continuous development.

Usability/UX Testing

Description:

Good user experiences are not created at a desk, but in direct contact with real users. In the UX test, we observe how people handle the product, where they act intuitively, and where uncertainty occurs. Through tasks, free exploration, and targeted follow-up questions, we make clear how comprehensible the product is and where support is needed. In this way, usability risks are identified early, before they become expensive.

Your input:

  • Access to potential users and, if necessary, incentives
  • Joint coordination of test priorities
  • Provision of the product or prototype to be tested

Our result:

  • Weighted list of key usability weaknesses
  • Specific recommendations to improve the user experience
  • Clear focus on areas with the highest risk or biggest impact

Unmoderated usability/UX test

Description:

When many perspectives count, the unmoderated UX test offers a quick and efficient way to capture user experiences in the field. Users work through defined tasks independently and give their impressions via questionnaires and open answers. Meanwhile, their interactions are automatically recorded, enabling evaluations such as click-path analysis or drop-off behavior. This creates a broad picture of how comprehensible the product is: without moderation, but with real depth of insight.

Your input:

  • Access to potential users and, if necessary, incentives
  • Coordination of test priorities and task definition
  • Provision of the prototype or product to be tested

Our result:

  • Weighted list of key usability weaknesses
  • Derivation of recurring usage patterns and risks
  • Specific recommendations for optimizing user navigation

Accessibility testing

Description:

Accessibility testing shows whether barrier-free design is not just planned, but can actually be experienced. Together with people who rely on assistive technologies or low-barrier design, we examine how effectively the product can be used. This is not merely about technical compliance, but about genuine accessibility within the user experience. These tests reveal where people encounter obstacles and which adjustments are necessary so that use is possible without hurdles.

Your input:

  • Access to the product or prototype
  • Insight into target groups with specific accessibility needs

Our result:

  • Observations from tests in real usage situations
  • Obstacles in interaction and comprehensibility made visible
  • Specific recommendations for improving accessibility in everyday product use

A/B Testing

Description:

A/B Testing reveals which variant performs better within the actual context of use. Two versions—featuring different wording or layouts, for example—are deployed simultaneously during live operation, making their impact on behavior directly comparable. We don't view these results as isolated figures, but rather in relation to the overall user experience. This provides a clear basis for deciding which variant creates clarity and offers the most reliable support.

Your input:

  • Coordination of the variants or questions to be tested
  • Access to the product or corresponding interface

Our result:

  • Data-based comparison of variants in a real context of use
  • Visualized effects on behavior or interaction
  • Recommendation as to which variant should be continued or optimized
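
One common way to judge whether an observed difference between two variants is more than noise is a two-proportion z-test. The sketch below shows the calculation on hypothetical conversion counts; it is a simplified illustration, not a complete testing setup (no handling of multiple metrics, sequential peeking, or segment effects).

  from math import sqrt, erf

  def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
      """Compare two conversion rates; returns the z score and two-sided p-value."""
      p_pool = (conv_a + conv_b) / (n_a + n_b)
      se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
      z = (conv_b / n_b - conv_a / n_a) / se
      p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
      return z, p_value

  # Hypothetical live-traffic results for two variants of a call-to-action.
  z, p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
  print(f"z = {z:.2f}, p = {p:.3f}")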

User Cohort Analysis

Description:

Not all users experience a product in the same way—nor at the same time. In user cohort analysis, we look at different user groups over a defined period of time and reveal how behavior and interaction develop. This shows which features are successful in the long term, where interest wanes, and where new patterns emerge. Looking at data over time creates a basis for understanding developments not just at specific points in time, but over the course of time.

Your input:

  • Common understanding of the relevant user groups or time periods
  • Access to the product or prototype for analysis

Our result:

  • Identification of usage trends within defined groups
  • Visible changes in behavior over time
  • Indications of which areas show stability and where there is a need for adjustment
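
The underlying calculation can be illustrated with a minimal retention sketch in Python: users are grouped by the month they started, and for each cohort we compute the share still active in later months. The user data is invented; in practice this runs on product analytics data.

  from collections import defaultdict

  # Hypothetical data: each user's starting cohort and the months they were active in.
  users = {
      "u1": {"cohort": "2024-01", "active": {"2024-01", "2024-02", "2024-03"}},
      "u2": {"cohort": "2024-01", "active": {"2024-01"}},
      "u3": {"cohort": "2024-02", "active": {"2024-02", "2024-03"}},
      "u4": {"cohort": "2024-02", "active": {"2024-02"}},
  }
  months = ["2024-01", "2024-02", "2024-03"]

  cohorts = defaultdict(list)
  for data in users.values():
      cohorts[data["cohort"]].append(data["active"])

  # Retention: share of each cohort that is still active n months after the start.
  for cohort, activity in sorted(cohorts.items()):
      start = months.index(cohort)
      cells = []
      for offset, month in enumerate(months[start:]):
          active = sum(1 for a in activity if month in a)
          cells.append(f"+{offset}m: {active / len(activity):.0%}")
      print(cohort, " ".join(cells))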

Expert Report

Description:

An expert report (usability report, UX report, or accessibility report) creates reliability for product decisions. We assess whether a product is usable or accessible, guided by recognized standard criteria, and evaluate how well these requirements are met in a real context of use. This shows what already holds up and where gaps remain. The result is verifiable evidence that provides guidance for further development.

Your input:

  • Common understanding of the context of use
  • Coordination of audit priorities and target groups
  • Access to the product or prototype

Our result:

  • Written, signed report with a traceable evaluation
  • Documentation of compliance with usability or accessibility requirements
  • Specific starting points for standard-compliant development

Analytics & Heatmaps

Description:

Data shows where a product guides users and where they encounter obstacles. Using analytics and heatmaps, we observe authentic user behavior directly within the application. Click paths, drop-off points, and areas of high engagement become visible, providing insights into how effectively interaction and orientation are functioning. These figures do not stand alone; instead, they help us understand where the product provides support and where design must create greater clarity.

Your input:

  • Shared understanding of the desired areas of insight
  • Access to the product or prototype for analysis

Our result:

  • Usage patterns and areas of interaction made visible
  • Identified drop-off points in click or scroll behavior
  • Derived recommendations for targeted optimization of user navigation
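
The aggregation idea behind a click heatmap is simple and can be sketched in a few lines of Python: click coordinates (here invented and normalized to the page) are binned into a grid, and denser cells mark the hot areas. Real tools work at much finer resolution and overlay the result on a screenshot of the page.

  # Hypothetical click coordinates, normalized to the page (x, y between 0 and 1).
  clicks = [(0.12, 0.08), (0.15, 0.10), (0.14, 0.09), (0.80, 0.85),
            (0.13, 0.11), (0.82, 0.83), (0.50, 0.50)]

  GRID = 4  # 4 x 4 bins for the sketch; real heatmaps use far finer grids
  heat = [[0] * GRID for _ in range(GRID)]
  for x, y in clicks:
      col = min(int(x * GRID), GRID - 1)
      row = min(int(y * GRID), GRID - 1)
      heat[row][col] += 1

  # Plain-text rendering: higher counts indicate hot areas of the page.
  for row in heat:
      print(" ".join(f"{count:2d}" for count in row))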

Customer Support Analysis

Description:

Support inquiries reveal a great deal about where a product provides clarity—and where it does not. Through Customer Support Analysis, we evaluate real user requests to identify patterns, recurring friction points, and latent needs. We do not view support as a source of problems, but as a gateway to the authentic user experience. This makes it clear where people seek assistance and how the product must be designed so that help becomes unnecessary in the first place.

Your input:

  • Access to support requests or ticket system
  • Classification of the product context and relevant target groups

Our result:

  • Structured evaluation of recurring problems
  • Identifiable causes behind the inquiries
  • Specific starting points to reduce support requirements through better product management
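
As a deliberately rough first signal, recurring wording across tickets can already point to friction. The sketch below counts frequent terms in a handful of invented ticket excerpts; a real analysis goes well beyond word counts and categorizes underlying causes, not just vocabulary.

  import re
  from collections import Counter

  # Hypothetical support tickets (shortened free-text excerpts).
  tickets = [
      "Cannot reset my password, the email never arrives",
      "Password reset link is expired immediately",
      "Invoice download button does nothing",
      "Where can I download my invoice as PDF?",
      "The reset email lands in spam",
  ]

  stopwords = {"the", "my", "is", "can", "i", "as", "in", "a", "does", "where", "never"}
  terms = Counter(
      word
      for ticket in tickets
      for word in re.findall(r"[a-z]+", ticket.lower())
      if word not in stopwords
  )
  print(terms.most_common(5))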

Why UseTree when it comes to research

Good decisions require understanding.
For us, research is not an end in itself, but the basis of responsible product development.

We do research to test assumptions and make real use visible.
So that decisions become reliable.

Do's

What we do

  • We combine empathy with evidence.
  • We make user reality visible — before development begins.
  • We provide insights that enable decisions.
  • We make research usable and manageable for teams.
  • We don't just measure, we understand.

Don'ts

What we don't do

  • No methodology bingo
  • No research without a clear question
  • No interviews without evaluation
  • No data without context
  • No results without consequences

Our Conclusion

Good research replaces opinions with knowledge.

And makes decisions sustainable.

Do you have any questions?

Let's discuss your research project and work together to create clarity, direction, and impact.

Dr. Anna Trukenbrod

research@usetree.com
+49 30 8632919-1
