Methodological details of monitoring
Which accessibility criteria were checked and how subpages and screens were selected for the checks
Most of the checks were carried out in cooperation with agencies commissioned for this purpose. In Vienna, the simplified checks were carried out by internal experts; in Tyrol, all checks were carried out by internal experts.
Overall, a uniform audit and reporting procedure for all three types of checks was sought for the whole of Austria, and this was largely achieved.
"The transformation to the digital society needs accessibility as a basis for democracy and equality. 'Leave no one behind' must not remain a catchphrase; it needs not only laws and standards but also monitoring, quality assurance and documentation of progress. This can only be guaranteed by a professional framework such as that provided by the FFG."
In-depth checks of websites
For the in-depth checks of the websites, all requirements listed in Table A.1 of Annex A of EN 301 549 V3.2.1 (2021-03) (PDF) (external link) were checked. For documents (PDF documents or similar), the criteria set out in Chapter 10 of EN 301 549 V3.2.1 (2021-03) were checked. These largely correspond to the WCAG success criteria of version 2.1, conformance levels A and AA. A detailed correlation table can be expanded below.
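The correspondence between WCAG success criteria and EN 301 549 clauses follows a simple numbering convention: clause 9 (web pages) and clause 10 (documents) reuse the success-criterion number as a suffix. The following sketch only illustrates this convention; the helper name is made up for illustration and it ignores special cases such as the document criteria excluded below.

```typescript
// Illustrative helper (hypothetical, not part of any standard tooling):
// derives the EN 301 549 clause identifiers that correspond to a WCAG 2.1
// success criterion, following the numbering convention used in the
// correlation tables below (clause 9 = web pages, clause 10 = documents).
function enClausesFor(successCriterion: string): { web: string; document: string } {
  return {
    web: `9.${successCriterion}`,       // e.g. "9.2.4.1" for WCAG 2.4.1 Bypass Blocks
    document: `10.${successCriterion}`, // e.g. "10.1.1.1" for WCAG 1.1.1 Non-text Content
  };
}

console.log(enClausesFor('1.1.1')); // → { web: '9.1.1.1', document: '10.1.1.1' }
```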
According to EN 301 549, the following WCAG criteria are specifically excluded for the check of documents:
- 2.4.1 Bypass blocks
- 2.4.5 Multiple ways
- 3.2.3 Consistent navigation
- 3.2.4 Consistent identification
All websites selected for in-depth checks were assessed for compliance with the WCAG criteria by means of detailed manual checks. Blind and visually impaired testers were involved in the checks.
"Comprehensive accessibility is the basis for a self-determined and equal-opportunity life. Especially in the digital sector, there is great potential to improve processes for blind and visually impaired people – but unfortunately, the opposite is often the case. That is why it is important to involve experts in the planning phases in order to exclude barriers and errors from the outset. For existing websites and applications, errors can be detected and subsequently improved with the help and expertise of blind screen reader users. Integrating them into accessibility checks is therefore a safe path towards digital accessibility!"
The checks were based on the WCAG-EM (Website Accessibility Conformance Evaluation Methodology) (external link). For this purpose, the following test steps were carried out:
- Use of analysis tools for a first impression of the structure of the pages
- Visual and auditory examination (contrasts, examination of multimedia content, etc.)
- Analysis of the source code
- Verification of keyboard operability
- Verification of gesture and motion control (multipoint or path-based gestures, alternatives to control by moving the device)
- Testing with screen readers
The following tools were used to support this:
- QualWeb, W3C Markup Validation Service, Colour Contrast Analyser (CCA) (see the contrast-ratio sketch below), Text Spacing Bookmarklet, Taba11y
- Screen readers: NVDA, TalkBack and VoiceOver, JAWS (in Tyrol)
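As an illustration of the contrast checks carried out in the visual examination (and automated by tools such as the Colour Contrast Analyser), the following sketch computes the WCAG 2.1 contrast ratio for two opaque sRGB colours. It is a simplified, stand-alone example, not part of the tools listed above, and ignores transparency, gradients and text over images.

```typescript
// Minimal sketch of the WCAG 2.1 contrast-ratio calculation (SC 1.4.3).
// Colours are assumed to be plain opaque sRGB hex values.

type RGB = { r: number; g: number; b: number };

function hexToRgb(hex: string): RGB {
  const n = parseInt(hex.replace('#', ''), 16);
  return { r: (n >> 16) & 0xff, g: (n >> 8) & 0xff, b: n & 0xff };
}

// Relative luminance as defined by WCAG 2.1 for the sRGB colour space.
function relativeLuminance({ r, g, b }: RGB): number {
  const lin = (channel: number) => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colours: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(foreground: string, background: string): number {
  const l1 = relativeLuminance(hexToRgb(foreground));
  const l2 = relativeLuminance(hexToRgb(background));
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// SC 1.4.3 (level AA) requires at least 4.5:1 for normal text and 3:1 for large text.
console.log(contrastRatio('#767676', '#ffffff').toFixed(2)); // ≈ 4.54, passes for normal text
```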
For the additional criteria from EN 301 549, it was checked whether an error could be found for the respective criterion. Not every existing error was documented: only the first error found was recorded, or, if no error was found at all, the criterion was rated as "fulfilled".
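The recording rule for these additional criteria can be summarised as follows; the sketch below is purely illustrative, and the data structures are assumptions rather than the auditors' actual tooling.

```typescript
// Illustrative sketch of how the additional EN 301 549 criteria were recorded:
// testing stops at the first error found, otherwise the criterion is "fulfilled".

type CriterionResult =
  | { criterion: string; outcome: 'fulfilled' }
  | { criterion: string; outcome: 'error found'; firstError: string };

// `findings` holds the errors noted for one criterion, in the order they were
// encountered; names and structure here are assumptions for illustration.
function recordCriterion(criterion: string, findings: string[]): CriterionResult {
  return findings.length === 0
    ? { criterion, outcome: 'fulfilled' }
    : { criterion, outcome: 'error found', firstError: findings[0] };
}

recordCriterion('5.3 Biometrics', []);
// → { criterion: '5.3 Biometrics', outcome: 'fulfilled' }
```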
Sampling of the subpages to be checked
For the in-depth checks, the following subpages of each website were selected for review (if available):
- Homepage, registration, sitemap, contact, help and pages with legal information
- At least one relevant page for each type of service provided by the website and for each other main purpose including search function
- The pages with the accessibility statement or information and the pages with the feedback mechanism
- Five sample pages with a distinctly different look and feel or different types of content
- At least one relevant retrievable document, if any, for each type of service provided by the website and for each other main purpose
- Randomly selected pages amounting to at least 10% of the sample defined by the previous steps (see the sketch after this list)
If any of the subpages selected above contains a step in a procedure, all steps of the procedure are checked. This means that all the subpages necessary to go through the entire process are checked.
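The final sampling step, adding randomly selected pages amounting to at least 10% of the structured sample, could look roughly like the following sketch; function and variable names are illustrative assumptions, not the monitoring body's actual procedure.

```typescript
// Rough sketch of the random top-up step: after the structured selection
// (homepage, legal pages, service pages, documents, ...), at least 10 % of
// that sample is added again as randomly chosen further subpages.
function addRandomSample(structuredSample: string[], allPages: string[]): string[] {
  const extraCount = Math.ceil(structuredSample.length * 0.1); // "at least 10 %"
  const candidates = allPages.filter((url) => !structuredSample.includes(url));
  const shuffled = [...candidates].sort(() => Math.random() - 0.5); // simple shuffle for illustration
  return [...structuredSample, ...shuffled.slice(0, extraCount)];
}
```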
Correlation table: in-depth checks of websites
Clauses from EN 301 549 with the number 9 refer to classic websites, clauses with the number 10 to documents. In addition, further criteria from EN 301 549 were reviewed. Note: Appropriate tools were used to support the manual checks carried out.
WCAG Success Criteria | Clause in EN 301 549 | Checks |
---|---|---|
- | 5.2 Activation of accessibility features | Manual check |
- | 5.3 Biometrics | Manual check |
- | 5.4 Preservation of accessibility information during conversion | Manual check |
- | 6.1 Audio bandwidth for speech | Manual check |
- | 6.2.1.1 RTT communication | Manual check |
- | 6.2.1.2 Concurrent Voice and Text | Manual check |
- | 6.2.2.1 Visually Distinguishable Display | Manual check |
- | 6.2.2.2 Programmatically Determinable Send and Receive Direction | Manual check |
- | 6.2.2.3 Speaker Identification | Manual check |
- | 6.2.2.4 Visual Indicator of Audio with RTT | Manual check |
- | 6.2.3a Interoperability Item A | Manual check |
- | 6.2.3b Interoperability Item B | Manual check |
- | 6.2.3c Interoperability Item C | Manual check |
- | 6.2.3d Interoperability Item D | Manual check |
- | 6.2.4 RTT Responsiveness | Manual check |
- | 6.3 Caller ID | Manual check |
- | 6.4 Alternatives to Voice Based Services | Manual check |
- | 6.5.2 Resolution Item A | Manual check |
- | 6.5.3 Frame Rate Item A | Manual check |
- | 6.5.4 Synchronization between Audio and Video | Manual check |
- | 6.5.5 Visual Indicator of Audio with Video | Manual check |
- | 6.5.6 Speaker Identification with Video Communication | Manual check |
- | 7.1.1 Captioning Playback | Manual check |
- | 7.1.2 Captioning Synchronization | Manual check |
- | 7.1.3 Preservation of Captioning | Manual check |
- | 7.1.4 Captions Characteristics | Manual check |
- | 7.1.5 Spoken Subtitles | Manual check |
- | 7.2.1 Audio Description Playback | Manual check |
- | 7.2.2 Audio Description Synchronization | Manual check |
- | 7.2.3 Preservation of Audio Description | Manual check |
- | 7.3 User Controls for Captions and Audio Description | Manual check |
1.1.1 Non-text Content | | Manual check |
1.2.1 Audio-only and Video-only (Prerecorded) | | Manual check |
1.2.3 Audio Description or Media Alternatives (Prerecorded) | | Manual check |
1.2.5 Audio Description (Prerecorded) | | Manual check |
1.3.1 Info and Relationships | | Manual check |
1.3.2 Meaningful Sequence | | Manual check |
1.3.3 Sensory Characteristics | | Manual check |
1.3.4 Orientation | | Manual check |
1.3.5 Identify Input Purpose | | Manual check |
1.4.1 Use of Color | | Manual check |
1.4.2 Audio Control | | Manual check |
1.4.3 Contrast (Minimum) | | Manual check |
1.4.4 Resize text | | Manual check |
1.4.5 Images of text | | Manual check |
1.4.10 Reflow | | Manual check |
1.4.11 Non-text Contrast | | Manual check |
1.4.12 Text Spacing | | Manual check |
1.4.13 Content on Hover or Focus | | Manual check |
2.1.1 Keyboard | | Manual check |
2.1.2 No Keyboard Trap | | Manual check |
2.1.4 Character Key Shortcuts | | Manual check |
2.2.1 Timing Adjustable | | Manual check |
2.2.2 Pause, Stop, Hide | | Manual check |
2.3.1 Three Flashes or Below Threshold | | Manual check |
2.4.1 Bypass Blocks | 9.2.4.1 Bypass blocks | Manual check |
2.4.2 Page Titled | | Manual check |
2.4.3 Focus Order | | Manual check |
2.4.4 Link Purpose (In Context) | | Manual check |
2.4.5 Multiple Ways | 9.2.4.5 Multiple ways | Manual check |
2.4.6 Headings and Labels | | Manual check |
2.4.7 Focus Visible | | Manual check |
2.5.1 Pointer Gestures | | Manual check |
2.5.2 Pointer Cancellation | | Manual check |
2.5.3 Label in Name | | Manual check |
2.5.4 Motion Actuation | | Manual check |
3.1.1 Language of Page | | Manual check |
3.1.2 Language of Parts | | Manual check |
3.2.1 On Focus | | Manual check |
3.2.2 On Input | | Manual check |
3.2.3 Consistent Navigation | 9.3.2.3 Consistent navigation | Manual check |
3.2.4 Consistent Identification | 9.3.2.4 Consistent identification | Manual check |
3.3.1 Error Identification | | Manual check |
3.3.2 Labels or Instructions | | Manual check |
3.3.3 Error Suggestion | | Manual check |
3.3.4 Error Prevention (Legal, Financial, Data) | | Manual check |
4.1.1 Parsing | | Manual check |
4.1.2 Name, Role, Value | | Manual check |
4.1.3 Status Messages | | Manual check |
- | 9.6 WCAG conformance requirements | Manual check |
- | 11.7 User preferences | Manual check |
- | 11.8.1 Content technology | Manual check |
- | 11.8.2 Accessible content creation | Manual check |
- | 11.8.3 Preservation of accessibility information in transformations | Manual check |
- | 11.8.4 Repair assistance | Manual check |
- | 11.8.5 Templates | Manual check |
- | 12.1.1 Accessibility and compatibility features | Manual check |
- | 12.1.2 Accessible documentation | Manual check |
- | 12.2.2 Information on accessibility and compatibility features | Manual check |
- | 12.2.3 Effective communication | Manual check |
- | 12.2.4 Accessible documentation | Manual check |
Simplified checks of websites
For the simplified checks, a selection of the WCAG success criteria was checked. The checks were limited to 13 WCAG success criteria that can be tested at least partially automatically. They cover all four principles: perceivable, operable, understandable and robust. The success criteria were selected with regard to the accessibility needs of users. The correlation table for the simplified checks of websites, which can be expanded below, describes how the selected success criteria according to EN 301 549 relate to these needs. For the federal government, only classic web content was checked; Tyrol also checked PDF documents during the simplified checks, if available.
The simplified checks are non-conformity checks: because only a selection of WCAG success criteria is checked automatically, conformance cannot be determined. In addition, the criteria are usually not checked in full; instead, a number of individual automated checks are carried out. These are listed in the column "QualWeb Rule ID" of the correlation table for the simplified checks of websites below. The simplified checks therefore determine whether errors can be found with the individual automated checks for the selected criteria, not whether the criteria are met.
Explanations of the checks mentioned in the correlation table can be found on the website of the ACT-Rules Community (external link). The respective explanation is linked from each rule.
For the simplified checks, Tyrol also carried out partial manual checks of the accessibility statement, the legal notice (imprint) and any existing contact functionalities.
The following test tools were used during the three monitoring periods:
- Monitoring period 2022:
- QualWeb Core (version 0.7.10, 23.11.2021)
- Tyrol: QualWeb and additional HTML Validator from W3C, Koa11y, WAVE-Chrome Extension, JAWS-Fusion 2022
- Vienna: Google Lighthouse
- Monitoring period 2023:
- QualWeb Core (version 0.7.28, 21.11.2022)
- Tyrol: QualWeb and additionally the HTML Validator from W3C, WAVE Chrome Extension, JAWS-Fusion 2023
- Vienna: QualWeb and Google Lighthouse
- Monitoring period 2024:
- QualWeb Core (Version 0.7.46, 01.12.2023)
- Tyrol: QualWeb and additionally Koa11y, WAVE-Chrome Extension, JAWS-Fusion 2024
- Vienna: QualWeb and Google Lighthouse
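To illustrate how the automated results feed into the non-conformity assessment described above, the following sketch maps failed QualWeb ACT rules back to the selected WCAG success criteria from the correlation table. The result format is an assumption made for illustration and does not reproduce the actual QualWeb report structure.

```typescript
// Subset of the mapping from the correlation table (rule ID → WCAG success criterion).
const ruleToCriterion: Record<string, string> = {
  'QW-ACT-R1': '2.4.2 Page Titled',
  'QW-ACT-R7': '1.3.4 Orientation',
  'QW-ACT-R12': '2.4.4 Link Purpose (In Context)',
  'QW-ACT-R14': '1.4.4 Resize text',
  'QW-ACT-R30': '2.5.3 Label in Name',
  'QW-ACT-R37': '1.4.3 Contrast (Minimum)',
};

type Outcome = 'passed' | 'failed' | 'inapplicable';

// Simplified checks are non-conformity checks: a criterion is reported only when a
// rule fails; the absence of failures does not establish conformance.
function criteriaWithErrors(ruleOutcomes: Record<string, Outcome>): string[] {
  const flagged = new Set<string>();
  for (const [ruleId, outcome] of Object.entries(ruleOutcomes)) {
    if (outcome === 'failed' && ruleToCriterion[ruleId]) {
      flagged.add(ruleToCriterion[ruleId]);
    }
  }
  return [...flagged];
}

criteriaWithErrors({ 'QW-ACT-R1': 'passed', 'QW-ACT-R37': 'failed' });
// → ['1.4.3 Contrast (Minimum)']
```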
Sampling of the subpages to be checked
For the simplified checks, the following subpages of each website were selected for review:
- Homepage
- A number of subpages proportionate to the estimated size and complexity of the website: in addition to the homepage, 15 to 20 subpages were selected for examination. Based on the main navigation, subpages were chosen that differ in appearance or functionality from the subpages already included in the sample (e.g. forms, actionable elements, images, tables, dynamic content, structure, other layout).
In Tyrol, subpages with different functionalities or different technical implementations as well as a PDF file were also selected; there, 5 to 8 subpages were checked.
Correlation table: simplified checks of websites
WCAG success criteria | Clause from EN 301 549 | Context with accessibility needs of users | Checks performed (QualWeb Rule ID) (all links external) |
---|---|---|---|
1.1.1 Non-text Content | 9.1.1.1 Non-text content | Primary context: | |
1.3.1 Info and Relationships | 9.1.3.1 Info and relationships | Primary context: | |
1.3.4 Orientation | 9.1.3.4 Orientation | Primary context: | QW-ACT-R7 "Orientation of the page is not restricted using CSS transform property" |
1.4.3 Contrast (Minimum) | 9.1.4.3 Contrast (minimum) | Primary context: | QW-ACT-R37 "Text has minimum contrast" |
1.4.4 Resize text | 9.1.4.4 Resize text | Primary context: | QW-ACT-R14 "Meta viewport allows for zoom" |
1.4.12 Text Spacing | 9.1.4.12 Text spacing | Primary context: | |
2.1.1 Keyboard | 9.2.1.1 Keyboard | Primary context: | |
2.4.2 Page Titled | 9.2.4.2 Page titled | Primary context: | QW-ACT-R1 "HTML page has non-empty title" |
2.4.4 Link Purpose (In Context) | 9.2.4.4 Link purpose (in context) | Primary context: | QW-ACT-R12 "Link has non-empty accessible name" |
2.5.3 Label in Name | 9.2.5.3 Label in name | Primary context: | QW-ACT-R30 "Visible label is part of accessible name" |
3.1.1 Language of Page | 9.3.1.1 Language of page | Primary context: | |
4.1.1 Parsing | 9.4.1.1 Parsing | Primary context: | 2022 & 2023 (thereafter set to "passed" by default due to the corresponding update in WCAG 2.1): QW-ACT-R18 "Id attribute value is unique" |
4.1.2 Name, Role, Value | 9.4.1.2 Name, role, value | Primary context: | |
In-depth checks of apps
For the in-depth checks of the apps, all requirements listed in Table A.2 of Annex A of EN 301 549 V3.2.1 (2021-03) were checked. For documents (PDF documents or similar), the criteria set out in Chapter 10 of EN 301 549 V3.2.1 (2021-03) were checked. These largely correspond to the WCAG success criteria of version 2.1, conformance levels A and AA. A detailed correlation table can be expanded below.
As with the in-depth checks of the websites, the following WCAG criteria are excluded from the check of documents according to EN 301 549:
- 2.4.1 Bypass blocks
- 2.4.5 Multiple ways
- 3.2.3 Consistent navigation
- 3.2.4 Consistent identification
In addition, according to EN 301 549, the following WCAG criteria are excluded when checking apps:
- 2.4.1 Bypass blocks
- 2.4.2 Page titled
- 2.4.5 Multiple ways
- 3.1.2 Language of Parts
- 3.2.3 Consistent navigation
- 3.2.4 Consistent identification
All apps checked in depth were checked for compliance with the criteria by means of detailed manual checks. Blind and visually impaired testers were involved in the checks. The checks were based on the WCAG-EM (Website Accessibility Conformance Evaluation Methodology) (external link). For this purpose, the following test steps were carried out:
- Use of analysis tools (where possible) for a first impression of the apps
- Visual and auditory examination (contrasts, examination of multimedia content, etc.)
- Analysis of the source code was usually not possible; only for Progressive Web Apps could a code analysis be carried out
- Verification of gesture and motion control (multipoint or path-based gestures, alternatives to control by moving the device)
- Testing with screen readers
The following tools were used to support this:
- Android
- Accessibility Scanner
- Color Picker and WebAIM Contrast Checker
- Screen reader: TalkBack
- iOS
- Color contrast
- Screen reader: VoiceOver
For the additional criteria from EN 301 549, it was checked whether an error could be found for the respective criterion. As with the websites, not every existing error was documented: only the first error found was recorded, or, if no error was found at all, the criterion was rated as "fulfilled".
Sampling of screens to be checked
For the in-depth checks of apps, the following screens of the individual apps (if available) were selected for review; the selection was analogous to the in-depth checks of the websites:
- Start screen, screens with login area, screen overview (sitemap), contact, help and screens with legal information
- At least one relevant screen for each type of service provided by the app and for each other main purpose including search function
- The screens with the accessibility statement or information and the screens with the feedback mechanism
- Five sample screens with a distinctly different look and feel or different types of content
- At least one relevant retrievable document, if any, for each type of service provided by the app and for each other main purpose
- Randomly selected screens of at least 10% of the sample defined by the previous steps
If any of the screens selected above contains a step in a procedure, all steps of the procedure are checked. This means that all screens necessary to go through the entire process are checked.
Correlation table: in-depth checks of apps
Clauses from EN 301 549 with the number 11 mostly refer to classic apps, clauses with the number 10 to documents. In addition, further criteria from EN 301 549 were reviewed. Note: Appropriate tools were used to support the manual checks carried out.
WCAG Success Criteria | Clause in EN 301 549 | Checks |
---|---|---|
- | 5.2 Activation of accessibility features | Manual check |
- | 5.3 Biometrics | Manual check |
- | 5.4 Preservation of accessibility information during conversion | Manual check |
- | 5.5.1 Means of operation | Manual check |
- | 5.5.2 Operable parts discernibility | Manual check |
- | 5.6.1 Tactile or auditory status | Manual check |
- | 5.6.2 Visual status | Manual check |
- | 5.7 Key repeat | Manual check |
- | 5.8 Double-strike key acceptance | Manual check |
- | 5.9 Simultaneous user actions | Manual check |
- | 6.1 Audio bandwidth for speech | Manual check |
- | 6.2.1.1 RTT communication | Manual check |
- | 6.2.1.2 Concurrent Voice and Text | Manual check |
- | 6.2.2.1 Visually Distinguishable Display | Manual check |
- | 6.2.2.2 Programmatically Determinable Send and Receive Direction | Manual check |
- | 6.2.2.3 Speaker Identification | Manual check |
- | 6.2.2.4 Visual Indicator of Audio with RTT | Manual check |
- | 6.2.3a Interoperability Item A | Manual check |
- | 6.2.3b Interoperability Item B | Manual check |
- | 6.2.3c Interoperability Item C | Manual check |
- | 6.2.3d Interoperability Item D | Manual check |
- | 6.2.4 RTT Responsiveness | Manual check |
- | 6.3 Caller ID | Manual check |
- | 6.4 Alternatives to Voice Based Services | Manual check |
- | 6.5.2 Resolution Item A | Manual check |
- | 6.5.3 Frame Rate Item A | Manual check |
- | 6.5.4 Synchronization between Audio and Video | Manual check |
- | 6.5.5 Visual Indicator of Audio with Video | Manual check |
- | 6.5.6 Speaker Identification with Video Communication | Manual check |
- | 7.1.1 Captioning Playback | Manual check |
- | 7.1.2 Captioning Synchronization | Manual check |
- | 7.1.3 Preservation of Captioning | Manual check |
- | 7.1.4 Captions Characteristics | Manual check |
- | 7.1.5 Spoken Subtitles | Manual check |
- | 7.2.1 Audio Description Playback | Manual check |
- | 7.2.2 Audio Description Synchronization | Manual check |
- | 7.2.3 Preservation of Audio Description | Manual check |
- | 7.3 User Controls for Captions and Audio Description | Manual check |
1.1.1 Non-text Content | | Manual check |
- | 11.1.1.1.2 Non-text content (closed functionality) | Manual check |
1.2.1 Audio-only and Video-only (Prerecorded) | | Manual check |
- | 11.1.2.1.2 Audio-only and video-only (pre-recorded - closed functionality) | Manual check |
1.2.2 Captions (Prerecorded) | | Manual check |
1.2.3 Audio Description or Media Alternatives (Prerecorded) | | Manual check |
- | 11.1.2.3.2 Audio description or media alternative (pre-recorded - closed functionality) | Manual check |
1.2.5 Audio Description (Prerecorded) | | Manual check |
1.3.1 Info and Relationships | | Manual check |
1.3.2 Meaningful Sequence | | Manual check |
1.3.3 Sensory Characteristics | | Manual check |
1.3.4 Orientation | | Manual check |
1.3.5 Identify Input Purpose | | Manual check |
- | 11.1.3.5.2 Identify input purpose (closed functionality) | Manual check |
1.4.1 Use of Color | | Manual check |
1.4.2 Audio Control | | Manual check |
1.4.3 Contrast (Minimum) | | Manual check |
1.4.4 Resize text | | Manual check |
- | 11.1.4.4.2 Resize text (closed functionality) | Manual check |
1.4.5 Images of text | | Manual check |
- | 11.1.4.5.2 Images of text (closed functionality) | Manual check |
1.4.10 Reflow | | Manual check |
1.4.11 Non-text Contrast | | Manual check |
1.4.12 Text Spacing | | Manual check |
1.4.13 Content on Hover or Focus | | Manual check |
2.1.1 Keyboard | | Manual check |
- | 11.2.1.1.2 Keyboard (closed functionality) | Manual check |
2.1.2 No Keyboard Trap | | Manual check |
2.1.4 Character Key Shortcuts | | Manual check |
- | 11.2.1.4.2 Character key shortcuts (closed functionality) | Manual check |
2.2.1 Timing Adjustable | | Manual check |
2.2.2 Pause, Stop, Hide | | Manual check |
2.3.1 Three Flashes or Below Threshold | | Manual check |
2.4.2 Page Titled | 10.2.4.2 Document titled | Manual check |
2.4.3 Focus Order | | Manual check |
2.4.4 Link Purpose (In Context) | | Manual check |
2.4.6 Headings and Labels | | Manual check |
2.4.7 Focus Visible | | Manual check |
2.5.1 Pointer Gestures | | Manual check |
2.5.2 Pointer Cancellation | | Manual check |
2.5.3 Label in Name | | Manual check |
2.5.4 Motion Actuation | | Manual check |
3.1.1 Language of Page | | Manual check |
- | 11.3.1.1.2 Language of software (closed functionality) | Manual check |
3.1.2 Language of Parts | 10.3.1.2 Language of parts | Manual check |
3.2.1 On Focus | | Manual check |
3.2.2 On Input | | Manual check |
3.3.1 Error Identification | | Manual check |
- | 11.3.3.1.2 Error Identification (closed functionality) | Manual check |
3.3.2 Labels or Instructions | | Manual check |
3.3.3 Error Suggestion | | Manual check |
3.3.4 Error Prevention (Legal, Financial, Data) | | Manual check |
4.1.1 Parsing | | Manual check |
4.1.2 Name, Role, Value | | Manual check |
4.1.3 Status Messages | | Manual check |
- | 11.5.2.3 Use of accessibility services | Manual check |
- | 11.5.2.5 Object information | Manual check |
- | 11.5.2.6 Row, column, and headers | Manual check |
- | 11.5.2.7 Values | Manual check |
- | 11.5.2.8 Label relationships | Manual check |
- | 11.5.2.9 Parent-child relationships | Manual check |
- | 11.5.2.10 Text | Manual check |
- | 11.5.2.11 List of available actions | Manual check |
- | 11.5.2.12 Execution of available actions | Manual check |
- | 11.5.2.13 Tracking of focus and selection attributes | Manual check |
- | 11.5.2.14 Modification of focus and selection attributes | Manual check |
- | 11.5.2.15 Change notification | Manual check |
- | 11.5.2.16 Modifications of states and properties | Manual check |
- | 11.5.2.17 Modifications of values and text | Manual check |
- | 11.6.2 No disruption of accessibility features | Manual check |
- | 11.7 User preferences | Manual check |
- | 11.8.1 Content technology | Manual check |
- | 11.8.2 Accessible content creation | Manual check |
- | 11.8.3 Preservation of accessibility information in transformations | Manual check |
- | 11.8.4 Repair assistance | Manual check |
- | 11.8.5 Templates | Manual check |
- | 12.1.1 Accessibility and compatibility features | Manual check |
- | 12.1.2 Accessible documentation | Manual check |
- | 12.2.2 Information on accessibility and compatibility features | Manual check |
- | 12.2.3 Effective communication | Manual check |
- | 12.2.4 Accessible documentation | Manual check |
Communication with sampled entities
For all three types of checks (simplified checks of websites, in-depth checks of websites and in-depth checks of apps), the results were documented in the form of a report using the WAD Report Tool (external link). The result of the audit was sent to the audited entity after a quality check.
Tyrol provided the entities covered by simplified checks with a more detailed report: an error report with background information that also contains recommendations for action and drafting suggestions for the accessibility statement.
To ensure that there were no delays during the audit process, the entities selected for in-depth checks were contacted immediately after sampling, both to identify the responsible contact person and, where necessary, to obtain access credentials.
In particular, the entities covered by in-depth checks were offered consultations, which the public institutions gladly accepted. These appointments lasted about 90 minutes; parts of the report were discussed and specific questions from the institutions or the supervising web agencies were answered.