The AAGI Collaborator Survey

Service & Support
Surveys
Step-by-step guide to sending collaborator surveys, generating personalised links, and collecting useful feedback.
Author

Rose

Design Principles and Purpose

Objectives and scope

The AAGI Collaborator Survey is designed to help us understand the impact our Service & Support activities have within the Australian grains research, development and extension (RD&E) sector. It also gives collaborators a straightforward way to provide feedback to support our ongoing improvement. The survey in its current form is intended for people or groups who receive experimental design or analysis outputs as part of an AAGI Service and Support collaboration.

Each time these outputs are delivered, a survey request should accompany them. A new survey link must be generated for each delivery to reflect the recipient, the AAGI node, and the specific output provided. Customising the link in this way ensures participants see only questions relevant to them, are not asked to provide information we already hold, and that their responses are linked with the correct metadata. Accurate link generation is therefore an essential step in making the survey feel considered, concise, and relevant to each participant.

Psychological foundations of survey design

A fundamental principle of effective survey design is that participants’ experience of the survey cannot be separated from the reliability and validity of the data it produces. Unlike measurements of physical phenomena, where reducing subjectivity moves observations closer to an objective signal, subjectivity in survey data is an essential feature of what surveys are designed to capture. Surveys seek to measure personal meaning, interpretation, and experience, which cannot be accessed through instrumental measurement alone. As a result, survey data quality depends not only on standard error reduction techniques, but also on preserving the authenticity of participants’ responses. Any feature of the survey that causes responses to deviate from a participant’s genuine perception or experience introduces noise into the data. Because responses are shaped by an individual’s social, cognitive, physical, and contextual conditions, including the survey itself, careful consideration of how the survey influences respondents is a central component of effective survey design.

Research shows that people interpret and respond to surveys using the same social norms and expectations that guide interpersonal interaction. Participants naturally read question wording, structure, clarity, and the overall care evident in a survey’s design as cues about the intentions, competence, and respect of those seeking their input. These cues shape responses in ways that tend to mirror the level of care, effort, and respect perceived. In this way, surveys operate simultaneously as measurement tools and interpersonal encounters. The relational qualities inferred by participants become part of the data-generating process. Combining behavioural insight with technical precision to attend to the survey experience as a whole is therefore essential.

The problem of survey burden

A key mechanism through which experience affects data is survey burden. Completing a survey requires an investment of time, attention, and cognitive effort. Participants’ evaluation of this investment relative to the value they believe their responses will produce constitutes burden. Minimising survey burden is both an ethical imperative, rooted in beneficence and respect, and a practical one: perceived burden is among the strongest determinants of response behaviour. As perceived burden increases, non-response rises, satisficing becomes more common, and measurement error grows, all of which reduce the completeness and reliability of the dataset.

Survey burden also shapes how participants view the researchers or organisations administering the survey, eroding trust and reducing willingness to cooperate. These effects can compound over repeated measures, causing both relationships and data quality to decline over time. Because negative experiences tend to generalise, these effects can extend beyond the original survey and group to other forms of engagement and the broader research community. A key component of effective survey design is therefore doing as much as possible to minimise effort while maximising value.

Minimising avoidable effort

Unclear or irrelevant questions, poor structure, excessive length, barriers to accessibility or inclusiveness, and wording or style that does not align with the target group are all examples of avoidable effort. Survey designs that impose unnecessary effort implicitly signal low regard for participants’ time, knowledge, and contributions, undermining any expectation that input will lead to meaningful improvements. Taking steps to minimise avoidable effort, including careful generation of accurate survey links, is a vital part of this process.

Maximising value

Maximising value is equally important. This includes considering how insights will be shared or applied in meaningful ways. When participants see that their input leads to clearer communication, visible improvements, or thoughtful reflection, the value of their contribution becomes evident, making participation feel worthwhile. Because this survey will be repeated with each output delivery, maintaining a positive relational experience is key to ensuring reliable data and strong professional relationships.

If you notice any issues or have suggestions that could improve the structure, wording, or workflow of the survey, or if you have ideas about how we might act on our findings and communicate them, along with our actions, to collaborators in the future, please email them to CBADA@curtin.edu.au using the subject line ‘AAGI Impact Survey’.

Who should send surveys

The AAGI staff member delivering the associated output should send the surveys. This ensures clarity on which specific output and project the survey references, and a participation request from someone the collaborator already knows can help increase engagement and response rates.

Timing for survey delivery

Feedback is most accurate when it is collected while the experience or work is still fresh in participants’ minds. One option is to include the survey link in the same email as the outputs. This allows recipients to return to the survey after they have had time to use the work, while clearly linking it to the specific outputs being evaluated. Ultimately, it is up to you and your team to decide what works best for your workflow, but survey links should be sent no more than seven days after output delivery.

When sending multiple outputs at the same time, a single survey can be used if all the outputs are of the same type and focus, such as several analyses of small plot trials for a single season. This approach can reduce survey burden and support positive working relationships. However, if different output types, analytical focuses, or design types are delivered together, separate surveys should be used to ensure questions remain relevant to each output. Surveys should not be delayed until the end of a project and combined into a single request, as this reduces the accuracy and usefulness of the data.

Installing the {AAGISurvey} package

The {AAGISurvey} package is hosted in the AAGI R-Universe and contains the create_survey_url() function, developed to streamline generating the correct URL for each output delivery.

Step 1: Enable the AAGI R-Universe

To install the package, you first need to enable the AAGI R-Universe in your R session. You only need to do this once. If you have already enabled the AAGI R-Universe for another package, you can skip this step.

The URL for the AAGI R-Universe can be found via the AAGI-Aus GitHub.

Once you have the URL, you can enable the universe using:

# Example syntax for enabling the AAGI R-Universe
options(repos = c(
  aagi_aus = "<AAGI R-Universe URL>",
  CRAN = "https://cloud.r-project.org"
))
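To confirm the repository has been registered in your session, you can inspect the repos option (standard base R, no {AAGISurvey}-specific behaviour assumed):

```r
# After setting the repos option, the AAGI entry should appear alongside CRAN
getOption("repos")
```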

Step 2: Install the package

install.packages("AAGISurvey") 

Step 3: Load the package

library(AAGISurvey)

Once installed, you only need library(AAGISurvey) in future sessions.
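If you want to check that the installation succeeded before moving on, base R can report whether the package is available and which version you have (the version shown will depend on your installation):

```r
# Check whether {AAGISurvey} is installed and, if so, report the version
if (requireNamespace("AAGISurvey", quietly = TRUE)) {
  packageVersion("AAGISurvey")
} else {
  message("AAGISurvey is not installed")
}
```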

Generating the correct survey URL

When sending a survey to request feedback from a partner, it’s important that the survey link is generated specifically for the intended recipient. If the link is not customised, participants are asked to provide information about:

  • the type of support received
  • the output type
  • which AAGI node provided the support, and
  • the type of organisation they are affiliated with

Asking collaborators to provide information that we already have imposes unnecessary burden, undermines the ethical principles of respect and beneficence, and threatens response quality, completeness, and engagement.

The create_survey_url() function is designed to make generating the correct survey link for each recipient simple and reliable. By providing the function with the details relevant to your recipient, it produces a URL with this information embedded directly in it.

Embedding these details in the survey URL achieves two key purposes:

  1. Reduces participant burden while capturing essential metadata – participants are not asked to provide information the team already knows, but the data is still recorded alongside their other responses, making it possible to analyse patterns across groups.
  2. Controls question display – the embedded information also determines which questions are shown, so that participants only see relevant items. This improves the survey experience and helps ensure that the data collected is accurate, relevant, and actionable.
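As an illustration of the general mechanism only (not the actual {AAGISurvey} implementation, whose base URL and field names are internal), embedding metadata in a survey link amounts to appending URL-encoded query parameters that the survey platform can read. The base URL and field names below are hypothetical:

```r
# Minimal sketch: append metadata as query parameters to a base survey URL.
# The base URL and parameter names here are placeholders, not AAGI's real ones.
build_survey_url <- function(base_url, fields) {
  query <- paste(
    names(fields),
    vapply(fields, utils::URLencode, character(1), reserved = TRUE),
    sep = "=", collapse = "&"
  )
  paste0(base_url, "?", query)
}

build_survey_url(
  "https://example.qualtrics.com/jfe/form/SV_xxxx",
  c(support_type = "S_D", aagi_node = "CU")
)
```

The survey platform reads fields embedded this way to record metadata alongside responses and to decide which questions to display.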

You can use the function in two ways. You can supply all required arguments directly in your script (scripted), or you can run the function interactively, in which case it presents a menu in the console that guides you through each field. Once all the necessary information has been provided, the function returns the completed survey URL, prints a short summary of the details you supplied, and copies the URL to your system clipboard for easy inclusion in an email or other communication.

Step 1. Run the function

You can run create_survey_url() in one of two ways:

  • Scripted – provide all required arguments directly in your script.
  • Interactive – run the function with no arguments and follow the prompts in the console.

Scripted example:

library(AAGISurvey)

# Create a survey URL for design support for a small plot trial,
# delivered by the AAGI CU node to a government agency

url <- create_survey_url(
  support_type = "S_D",
  design_type = "D_SP",
  aagi_node = "CU",
  organisation_type = "O_GRO"
)

Interactive example:

library(AAGISurvey)

url <- create_survey_url() 

Both approaches will:

  • Generate a Qualtrics survey URL containing the correct metadata
  • Print a summary of your selections
  • Copy the URL automatically to your clipboard for easy pasting into an email or other message

Viewing full documentation

To view full documentation for create_survey_url(), including argument descriptions and examples, place your cursor on the function name and press F1 in RStudio, or type the following in any R session:

?create_survey_url 
# or
help("create_survey_url")

Step 2. Verify information

Carefully check the printed summary to ensure all details match the collaborator and the outputs provided.

Step 3. Schedule sending (if not included with outputs)

If you are not including the survey request in the same email as the outputs, schedule the email to be sent during work hours and within seven days of output delivery.

Scheduling in Outlook:

  • Windows Desktop: Options → Delay Delivery → Do not deliver before → set date/time → Close → Send
  • Mac Desktop: Arrow next to Send → Send Later → select date/time → Send
  • Outlook Web: Arrow next to Send Later → pick date/time → Send

Step 4. Save a copy of the email for project records

Once the email is drafted or sent, save a copy in the ‘Correspondence’ folder of the associated collaboration project on the R-drive. Saving a copy ensures a complete record of the communication is kept in the project folder for future reference and accountability.