Percentage and Probability types

Hi,

I thought I’d share a bit of code that’s making my life a tiny bit easier every day (in particular as I work on my game). It’s a pair of extension types, Percentage and Probability. Here’s the latter one:

import 'package:logging/logging.dart';

/// A value that is guaranteed to be normalized between `0.0` and `1.0`,
/// like a probability should.
///
/// Useful to avoid confusion in code that accepts something like
/// `readiness`, where we want to make it clear that the expected value
/// is a probability between `0` and `1`.
///
/// This is an extension type. The nice thing about those is that
/// they have zero runtime overhead.
extension type Probability._(double value) implements double {
  static final Logger _log = Logger('Probability');

  /// Prevents a million severe errors being reported.
  static bool _errorLoggedAlready = false;

  factory Probability(double value) {
    assert(value >= 0);
    assert(value <= 1);
    if (!_errorLoggedAlready && (value < 0 || value > 1)) {
      _log.severe(() => 'Tried to construct probability of $value.\n'
          '${StackTrace.current}');
      _errorLoggedAlready = true;
    }
    final clamped = value.clamp(0.0, 1.0);
    return Probability._(clamped);
  }

  /// Constructs a probability from a [double] value between `0.0` and `1.0`.
  /// This value is asserted to be in this range in debug mode.
  const Probability.compileTime(this.value)
      : assert(value >= 0),
        assert(value <= 1);

  /// A probability of `1.0` (or 100%).
  const Probability.sure() : value = 1.0;

  /// A probability of `0.0` (or 0%).
  const Probability.zero() : value = 0.0;
}
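At runtime, out-of-range values get caught like this (in debug builds the asserts fail right away; in release builds the value is logged once and clamped). A made-up call site, just to show the behavior:

final readiness = Probability(1.5); // Debug: the assert fails. Release: logged (once) and clamped to 1.0.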

One cool thing that might not be obvious is that the assertions in the const constructor really are checked at compile time. So, if you write:

class Hit {
  static const _defaultChanceToHit = Probability.compileTime(60);
}

Then you immediately get red squiggly lines and an error.
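Whereas an in-range value (say, 0.6 instead of 60) compiles without complaint:

class Hit {
  static const _defaultChanceToHit = Probability.compileTime(0.6);
}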

The code for Percentage is basically identical. That one helps me be clear that an option (like “music volume” or “camera shake”) is supposed to go from 0 to 100 (and 100 means “max”).
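For reference, the int-backed version looks roughly like this (logging left out; consider it a sketch rather than the exact code):

extension type Percentage._(int value) implements int {
  factory Percentage(int value) {
    assert(value >= 0);
    assert(value <= 100);
    // Clamp out-of-range values, same as Probability does.
    final clamped = value < 0 ? 0 : (value > 100 ? 100 : value);
    return Percentage._(clamped);
  }

  /// Constructs a percentage whose range check can happen at compile time.
  const Percentage.compileTime(this.value)
      : assert(value >= 0),
        assert(value <= 100);
}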

Since these are extension types, you can use them as their underlying value type. So I have things like:

class _Slider extends StatelessWidget {
  final ValueNotifier<Percentage> valueNotifier;

  const _Slider(this.valueNotifier);

  @override
  Widget build(BuildContext context) {
    return ValueListenableBuilder(
      valueListenable: valueNotifier,
      builder: (context, percentage, child) {
        return SizedBox(
          width: ...,
          height: ...,
          child: FittedBox(
            child: Slider(
              value: percentage.toDouble(),
              min: 0,
              max: 100,
              divisions: 100,
              label: percentage.toString(),
              onChanged: (value) {
                final intValue = value.round();
                valueNotifier.value = Percentage(intValue);
              },
            ),
          ),
        );
      },
    );
  }
}

And it works as if it were a ValueNotifier<int>, except it’s safer.
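Safer in the sense that a Percentage goes anywhere an int is expected, but a plain int can’t become a Percentage without passing through the validating constructor. A contrived example (setMusicVolume is just a made-up function):

import 'package:flutter/foundation.dart';

void setMusicVolume(int volume) { /* ... */ }

void example(ValueNotifier<Percentage> musicVolume) {
  setMusicVolume(musicVolume.value); // OK: a Percentage is usable as an int.
  // musicVolume.value = 150;        // Compile-time error: an int isn't a Percentage.
  musicVolume.value = Percentage(150); // Asserted in debug mode, clamped to 100 otherwise.
}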

Anyway, I hope this helps someone out there.


It reminds me a lot of F#’s ability to define a custom type over any base type that inherits its behavior but isn’t directly assignable to it, so you can create a custom integer type myInt and it will only be assignable to itself.
Although in Dart your type will be assignable to other doubles.
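For example (if I read the semantics right):

double d = Probability.sure();    // OK: Probability implements double.
// Probability p = 0.5;           // Compile-time error: a plain double isn't a Probability.
Probability p = Probability(0.5); // Has to go through the constructor instead.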

Did you run into the problem of mixing up whether something is a probability or just any old number often enough that it made sense to define such types?

Choosing sure for 100% so it matches the character length of zero is a nice touch :pinched_fingers:


Absolutely, it makes the code way more expressive.

Absolutely. When something is called readiness, and is typed as double, and you haven’t seen that part of the code for months, it’s easy to make a mistake. And I’ve made these mistakes many times.

It’s also easy (at least for me) to mistakenly assign an invalid value. For example, I remember having a bug where a piece of (wrong) math assigned something like 1.5 to something that should have been between 0.0 and 1.0. Since it used to be a double, there was of course nothing wrong with it — except the code execution went horribly wrong down the line, and I had no idea why. When I finally found the bug, I switched from double to NormalizedDouble (basically same code as Probability) and this class of errors never reappeared (because I don’t accept anything other than NormalizedDouble, and that one is asserted and clamped).


I didn’t know that!

Is this a new feature? Feels like this wasn't the case at least a few months ago.

Feels like a stepping stone towards DDD, XD

Thanks, looks beautiful.