Hey,
today I noticed that our app doesn’t build the semantics nodes on web.
The screen reader can only access input fields; nothing else gets read.
When I try to log the semantics tree with debugDumpSemanticsTree(), it prints the following even though VoiceOver is activated:
“For performance reasons, the framework only generates semantics when asked to do so by the platform.
Usually, platforms only ask for semantics when assistive technologies (like screen readers) are running.
To generate semantics, try turning on an assistive technology (like VoiceOver or TalkBack) on your device.”
I already tried calling SemanticsBinding.instance.ensureSemantics() in main.dart, but that didn’t make any difference.
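In case it helps, here is roughly how I call it (a minimal sketch; MyApp stands in for our real root widget). As far as I understand, ensureSemantics() returns a SemanticsHandle that has to stay alive, so I keep a reference to it:

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Keep the handle alive; if it is dropped and disposed, the framework
// may stop generating semantics again.
SemanticsHandle? _semanticsHandle;

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  _semanticsHandle = SemanticsBinding.instance.ensureSemantics();
  runApp(const MyApp()); // placeholder for our actual root widget
}
```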
I’m not sure whether we just did something wrong, because the project is quite big. For a smaller project the semantics nodes do seem to get built: I wrote a minimal test app to check whether Maestro now works with Flutter web, and there the semantics nodes can be found by Maestro.
Unfortunately I can’t share the whole code of our project since it’s closed source, but maybe someone has another idea?
I tried wrapping an ElevatedButton in a Semantics widget like this (label is our button text):

Semantics(
  label: label,
  identifier: label,
  button: true,
  container: true,
  explicitChildNodes: true,
  focusable: true,
  child: ElevatedButton(
    onPressed: () { /* ... */ },
    child: Text(label),
  ),
)
The button already contains visible text, but neither the semantics label nor the button text can be found by a screen reader.
What puzzles me even more is that once today the semantics tree was rendered completely, but the label for the button said “Continue Continue” instead of just “Continue”. When I tried to debug it and rebuilt the app, the semantics tree was gone again and I couldn’t get it back.
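My guess for the doubled label is that both my Semantics label and the button’s own Text child get announced. If that’s the cause, maybe something like this would avoid it (untested sketch, same placeholder label as above):

```dart
Semantics(
  label: label,
  button: true,
  // Drop the child's own semantics so the label isn't announced twice.
  excludeSemantics: true,
  child: ElevatedButton(
    onPressed: () { /* ... */ },
    child: Text(label),
  ),
)
```

But that obviously doesn’t explain why the tree disappears entirely.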
We currently use Flutter 3.41.4.
I would really appreciate any ideas that could help solve this, since we really need our app to be accessible.