• 0 Posts
  • 35 Comments
Joined 1 year ago
Cake day: July 5th, 2023

  • Ours would say:

    Party 1

    • Round up all minorities and immigrants into camps. Create jobs ‘digging the death pits’.
    • Replace ‘Separation of church and state’ in the Constitution with a copy of the ‘Right to bear arms’ amendment, but this time in all caps.
    • Repeal all environmental protection laws. The “free market” will ensure companies don’t pollute. Even if they did, climate change is a hoax so it doesn’t matter.
    • Eliminate all healthcare plans. If God wants you dead, and you’re poor, so be it.
    • Eliminate term limits for conservative politicians. Implement term limits for liberal supreme court judges.

    Party 2

    • Laws that aren’t just a toddler’s power fantasy


  • I’m no expert in this subject either, but a theoretical limit could be beyond 200x, depending on the data.

    For example, a basic compression approach is to use a lookup table that maps large values to small lookup IDs. Suppose the possible data contains only two values: one is 10,000 letter ‘a’s and the other is 10,000 letter ‘b’s. We can map the first to the digit 1 and the second to the digit 2. With that lookup in place, the compressed value “12211” (5 characters) would uncompress to 50,000 characters, a 10,000x compression ratio. Extrapolate that example out and there is no theoretical maximum to the compression ratio.
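    To make that concrete, here is a minimal sketch of the lookup-table idea in Python. The table, block size, and helper names are made up for this illustration; a real codec would build its dictionary from the data rather than hard-coding it:

    ```python
    # Hypothetical lookup table for the two-value example above:
    # digit -> expanded 10,000-character value.
    TABLE = {
        "1": "a" * 10_000,
        "2": "b" * 10_000,
    }

    def compress(text: str) -> str:
        """Replace each known 10,000-character block with its 1-digit ID."""
        ids = []
        for i in range(0, len(text), 10_000):
            block = text[i:i + 10_000]
            # Find the digit whose expansion matches this block.
            ids.append(next(k for k, v in TABLE.items() if v == block))
        return "".join(ids)

    def decompress(ids: str) -> str:
        """Expand each digit back into its 10,000-character block."""
        return "".join(TABLE[d] for d in ids)

    original = ("a" * 10_000) + ("b" * 10_000) * 2 + ("a" * 10_000) * 2  # 50,000 chars
    compressed = compress(original)                 # "12211", 5 characters
    assert decompress(compressed) == original
    print(len(original) / len(compressed))          # 10000.0
    ```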

    But that only works when the set of possible values is known and small. As the complexity of the data grows, it seems logical that a practical maximum would emerge.

    So, it might be possible to achieve 200x compression, but only if the complexity of the data set is below some threshold I’m not smart enough to calculate.
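    For what it’s worth, here is a rough back-of-the-envelope under the same idealized lookup-table assumption (10,000-character blocks, decimal lookup IDs; this is illustrative, not a real information-theoretic bound): with V distinct blocks, each ID needs roughly log10(V) digits, so the achievable per-block ratio shrinks as the variety of the data grows.

    ```python
    import math

    BLOCK = 10_000  # characters per block, as in the example above (assumed)

    for v in (2, 1_000, 10**10, 10**50, 10**100):
        digits = max(1, math.ceil(math.log10(v)))  # decimal digits per lookup ID
        print(f"{v:g} distinct blocks -> ~{BLOCK / digits:,.0f}x per block")
    ```

    By this crude estimate, a 200x ratio would still be reachable as long as each block’s lookup ID fits in about 50 digits.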