TSTP Solution File: HWV105+1 by Otter---3.3

%------------------------------------------------------------------------------
% File     : Otter---3.3
% Problem  : HWV105+1 : TPTP v8.1.0. Released v6.1.0.
% Transfm  : none
% Format   : tptp:raw
% Command  : otter-tptp-script %s

% Computer : n006.cluster.edu
% Model    : x86_64 x86_64
% CPU      : Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10GHz
% Memory   : 8042.1875MB
% OS       : Linux 3.10.0-693.el7.x86_64
% CPULimit : 300s
% WCLimit  : 300s
% DateTime : Wed Jul 27 12:58:40 EDT 2022

% Result   : Unknown 131.45s 130.46s
% Output   : None
% Verified : 
% SZS Type : -

% Comments : 
%------------------------------------------------------------------------------
%----No solution output by system
%------------------------------------------------------------------------------
%----ORIGINAL SYSTEM OUTPUT
% 0.10/0.11  % Problem  : HWV105+1 : TPTP v8.1.0. Released v6.1.0.
% 0.10/0.12  % Command  : otter-tptp-script %s
% 0.12/0.33  % Computer : n006.cluster.edu
% 0.12/0.33  % Model    : x86_64 x86_64
% 0.12/0.33  % CPU      : Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10GHz
% 0.12/0.33  % Memory   : 8042.1875MB
% 0.12/0.33  % OS       : Linux 3.10.0-693.el7.x86_64
% 0.12/0.33  % CPULimit : 300
% 0.12/0.33  % WCLimit  : 300
% 0.12/0.33  % DateTime : Wed Jul 27 06:42:32 EDT 2022
% 0.12/0.33  % CPUTime  : 
% 121.24/120.26  ----- Otter 3.3f, August 2004 -----
% 121.24/120.26  The process was started by sandbox on n006.cluster.edu,
% 121.24/120.26  Wed Jul 27 06:42:32 2022
% 121.24/120.26  The command was "./otter".  The process ID is 27546.
% 121.24/120.26  
% 121.24/120.26  set(prolog_style_variables).
% 121.24/120.26  set(auto).
% 121.24/120.26     dependent: set(auto1).
% 121.24/120.26     dependent: set(process_input).
% 121.24/120.26     dependent: clear(print_kept).
% 121.24/120.26     dependent: clear(print_new_demod).
% 121.24/120.26     dependent: clear(print_back_demod).
% 121.24/120.26     dependent: clear(print_back_sub).
% 121.24/120.26     dependent: set(control_memory).
% 121.24/120.26     dependent: assign(max_mem, 12000).
% 121.24/120.26     dependent: assign(pick_given_ratio, 4).
% 121.24/120.26     dependent: assign(stats_level, 1).
% 121.24/120.26     dependent: assign(max_seconds, 10800).
% 121.24/120.26  clear(print_given).
% 121.24/120.26  
% 121.24/120.26  formula_list(usable).
% 121.24/120.26  all A (A=A).
% 121.24/120.26  nextState(constB8,constB9).
% 121.24/120.26  nextState(constB7,constB8).
% 121.24/120.26  nextState(constB6,constB7).
% 121.24/120.26  nextState(constB5,constB6).
% 121.24/120.26  nextState(constB4,constB5).
% 121.24/120.26  nextState(constB3,constB4).
% 121.24/120.26  nextState(constB2,constB3).
% 121.24/120.26  nextState(constB1,constB2).
% 121.24/120.26  nextState(constB0,constB1).
% 121.24/120.26  all VarNext VarCurr (nextState(VarCurr,VarNext)->reachableState(VarCurr)&reachableState(VarNext)).
% 121.24/120.26  all VarState (reachableState(VarState)->constB0=VarState|constB1=VarState|constB2=VarState|constB3=VarState|constB4=VarState|constB5=VarState|constB6=VarState|constB7=VarState|constB8=VarState|constB9=VarState|constB10=VarState|constB11=VarState|constB12=VarState|constB13=VarState|constB14=VarState|constB15=VarState|constB16=VarState|constB17=VarState|constB18=VarState|constB19=VarState|constB20=VarState).
% 121.24/120.26  reachableState(constB20).
% 121.24/120.26  reachableState(constB19).
% 121.24/120.26  reachableState(constB18).
% 121.24/120.26  reachableState(constB17).
% 121.24/120.26  reachableState(constB16).
% 121.24/120.26  reachableState(constB15).
% 121.24/120.26  reachableState(constB14).
% 121.24/120.26  reachableState(constB13).
% 121.24/120.26  reachableState(constB12).
% 121.24/120.26  reachableState(constB11).
% 121.24/120.26  reachableState(constB10).
% 121.24/120.26  reachableState(constB9).
% 121.24/120.26  reachableState(constB8).
% 121.24/120.26  reachableState(constB7).
% 121.24/120.26  reachableState(constB6).
% 121.24/120.26  reachableState(constB5).
% 121.24/120.26  reachableState(constB4).
% 121.24/120.26  reachableState(constB3).
% 121.24/120.26  reachableState(constB2).
% 121.24/120.26  reachableState(constB1).
% 121.24/120.26  reachableState(constB0).
% 121.24/120.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1(VarCurr)<-> -v1(VarNext))).
% 121.24/120.26  -v1(constB0).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_20,B)<->v1045(constB20,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_20).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB20,v1045_range_2_to_0_address_term_bound_20).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_19,B)<->v1045(constB19,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_19).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB19,v1045_range_2_to_0_address_term_bound_19).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_18,B)<->v1045(constB18,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_18).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB18,v1045_range_2_to_0_address_term_bound_18).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_17,B)<->v1045(constB17,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_17).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB17,v1045_range_2_to_0_address_term_bound_17).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_16,B)<->v1045(constB16,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_16).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB16,v1045_range_2_to_0_address_term_bound_16).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_15,B)<->v1045(constB15,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_15).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB15,v1045_range_2_to_0_address_term_bound_15).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_14,B)<->v1045(constB14,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_14).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB14,v1045_range_2_to_0_address_term_bound_14).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_13,B)<->v1045(constB13,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_13).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB13,v1045_range_2_to_0_address_term_bound_13).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_12,B)<->v1045(constB12,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_12).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB12,v1045_range_2_to_0_address_term_bound_12).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_11,B)<->v1045(constB11,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_11).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB11,v1045_range_2_to_0_address_term_bound_11).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_10,B)<->v1045(constB10,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_10).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB10,v1045_range_2_to_0_address_term_bound_10).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_9,B)<->v1045(constB9,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_9).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB9,v1045_range_2_to_0_address_term_bound_9).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_8,B)<->v1045(constB8,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_8).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB8,v1045_range_2_to_0_address_term_bound_8).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_7,B)<->v1045(constB7,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_7).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB7,v1045_range_2_to_0_address_term_bound_7).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_6,B)<->v1045(constB6,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_6).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB6,v1045_range_2_to_0_address_term_bound_6).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_5,B)<->v1045(constB5,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_5).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB5,v1045_range_2_to_0_address_term_bound_5).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_4,B)<->v1045(constB4,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_4).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB4,v1045_range_2_to_0_address_term_bound_4).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_3,B)<->v1045(constB3,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_3).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB3,v1045_range_2_to_0_address_term_bound_3).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_2,B)<->v1045(constB2,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_2).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB2,v1045_range_2_to_0_address_term_bound_2).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_1,B)<->v1045(constB1,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_1).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB1,v1045_range_2_to_0_address_term_bound_1).
% 121.24/120.26  all B (addressVal(v1045_range_2_to_0_address_term_bound_0,B)<->v1045(constB0,B)).
% 121.24/120.26  address(v1045_range_2_to_0_address_term_bound_0).
% 121.24/120.26  v1045_range_2_to_0_address_association(constB0,v1045_range_2_to_0_address_term_bound_0).
% 121.24/120.26  address(b110_address_term).
% 121.24/120.26  all B (addressVal(b110_address_term,B)<->b110(B)).
% 121.24/120.26  address(b101_address_term).
% 121.24/120.26  all B (addressVal(b101_address_term,B)<->b101(B)).
% 121.24/120.26  address(b100_address_term).
% 121.24/120.26  all B (addressVal(b100_address_term,B)<->b100(B)).
% 121.24/120.26  address(b011_address_term).
% 121.24/120.26  all B (addressVal(b011_address_term,B)<->b011(B)).
% 121.24/120.26  address(b010_address_term).
% 121.24/120.26  all B (addressVal(b010_address_term,B)<->b010(B)).
% 121.24/120.26  address(b001_address_term).
% 121.24/120.26  all B (addressVal(b001_address_term,B)<->b001(B)).
% 121.24/120.26  address(b111_address_term).
% 121.24/120.26  all B (addressVal(b111_address_term,B)<->b111(B)).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_20,B)<->v977(constB20,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_20).
% 121.24/120.26  v977_range_2_to_0_address_association(constB20,v977_range_2_to_0_address_term_bound_20).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_19,B)<->v977(constB19,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_19).
% 121.24/120.26  v977_range_2_to_0_address_association(constB19,v977_range_2_to_0_address_term_bound_19).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_18,B)<->v977(constB18,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_18).
% 121.24/120.26  v977_range_2_to_0_address_association(constB18,v977_range_2_to_0_address_term_bound_18).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_17,B)<->v977(constB17,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_17).
% 121.24/120.26  v977_range_2_to_0_address_association(constB17,v977_range_2_to_0_address_term_bound_17).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_16,B)<->v977(constB16,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_16).
% 121.24/120.26  v977_range_2_to_0_address_association(constB16,v977_range_2_to_0_address_term_bound_16).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_15,B)<->v977(constB15,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_15).
% 121.24/120.26  v977_range_2_to_0_address_association(constB15,v977_range_2_to_0_address_term_bound_15).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_14,B)<->v977(constB14,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_14).
% 121.24/120.26  v977_range_2_to_0_address_association(constB14,v977_range_2_to_0_address_term_bound_14).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_13,B)<->v977(constB13,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_13).
% 121.24/120.26  v977_range_2_to_0_address_association(constB13,v977_range_2_to_0_address_term_bound_13).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_12,B)<->v977(constB12,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_12).
% 121.24/120.26  v977_range_2_to_0_address_association(constB12,v977_range_2_to_0_address_term_bound_12).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_11,B)<->v977(constB11,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_11).
% 121.24/120.26  v977_range_2_to_0_address_association(constB11,v977_range_2_to_0_address_term_bound_11).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_10,B)<->v977(constB10,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_10).
% 121.24/120.26  v977_range_2_to_0_address_association(constB10,v977_range_2_to_0_address_term_bound_10).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_9,B)<->v977(constB9,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_9).
% 121.24/120.26  v977_range_2_to_0_address_association(constB9,v977_range_2_to_0_address_term_bound_9).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_8,B)<->v977(constB8,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_8).
% 121.24/120.26  v977_range_2_to_0_address_association(constB8,v977_range_2_to_0_address_term_bound_8).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_7,B)<->v977(constB7,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_7).
% 121.24/120.26  v977_range_2_to_0_address_association(constB7,v977_range_2_to_0_address_term_bound_7).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_6,B)<->v977(constB6,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_6).
% 121.24/120.26  v977_range_2_to_0_address_association(constB6,v977_range_2_to_0_address_term_bound_6).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_5,B)<->v977(constB5,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_5).
% 121.24/120.26  v977_range_2_to_0_address_association(constB5,v977_range_2_to_0_address_term_bound_5).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_4,B)<->v977(constB4,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_4).
% 121.24/120.26  v977_range_2_to_0_address_association(constB4,v977_range_2_to_0_address_term_bound_4).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_3,B)<->v977(constB3,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_3).
% 121.24/120.26  v977_range_2_to_0_address_association(constB3,v977_range_2_to_0_address_term_bound_3).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_2,B)<->v977(constB2,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_2).
% 121.24/120.26  v977_range_2_to_0_address_association(constB2,v977_range_2_to_0_address_term_bound_2).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_1,B)<->v977(constB1,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_1).
% 121.24/120.26  v977_range_2_to_0_address_association(constB1,v977_range_2_to_0_address_term_bound_1).
% 121.24/120.26  all B (addressVal(v977_range_2_to_0_address_term_bound_0,B)<->v977(constB0,B)).
% 121.24/120.26  address(v977_range_2_to_0_address_term_bound_0).
% 121.24/120.26  v977_range_2_to_0_address_association(constB0,v977_range_2_to_0_address_term_bound_0).
% 121.24/120.26  address(b000_address_term).
% 121.24/120.26  all B (addressVal(b000_address_term,B)<->b000(B)).
% 121.24/120.26  all B A2 A1 (address(A1)&address(A2)&addressDiff(A1,A2,B)->A1=A2| (addressVal(A1,B)<-> -addressVal(A2,B))).
% 121.24/120.26  all A1 A2 (addressDiff(A1,A2,bitIndex0)|addressDiff(A1,A2,bitIndex1)|addressDiff(A1,A2,bitIndex2)).
% 121.24/120.26  -(all VarCurr (reachableState(VarCurr)->v4(VarCurr))).
% 121.24/120.26  all VarCurr (-v4(VarCurr)<->v5182(VarCurr)).
% 121.24/120.27  all VarCurr (-v5182(VarCurr)<->v5183(VarCurr)).
% 121.24/120.27  all VarCurr (v5183(VarCurr)<->v5185(VarCurr)&v5207(VarCurr)).
% 121.24/120.27  all VarCurr (v5207(VarCurr)<->v5208(VarCurr)|v6(VarCurr,bitIndex4)).
% 121.24/120.27  all VarCurr (v5208(VarCurr)<->v5209(VarCurr)|v6(VarCurr,bitIndex3)).
% 121.24/120.27  all VarCurr (v5209(VarCurr)<->v5210(VarCurr)|v6(VarCurr,bitIndex2)).
% 121.24/120.27  all VarCurr (v5210(VarCurr)<->v6(VarCurr,bitIndex0)|v6(VarCurr,bitIndex1)).
% 121.24/120.27  all VarCurr (v5185(VarCurr)<->v5186(VarCurr)|v5195(VarCurr)).
% 121.24/120.27  all VarCurr (v5195(VarCurr)<->v5196(VarCurr)|v5197(VarCurr)).
% 121.24/120.27  all VarCurr (v5197(VarCurr)<->v5187(VarCurr)&v5198(VarCurr)).
% 121.24/120.27  all VarCurr (v5198(VarCurr)<->v5199(VarCurr)|v5200(VarCurr)).
% 121.24/120.27  all VarCurr (v5200(VarCurr)<->v5189(VarCurr)&v5201(VarCurr)).
% 121.24/120.27  all VarCurr (v5201(VarCurr)<->v5202(VarCurr)|v5203(VarCurr)).
% 121.24/120.27  all VarCurr (v5203(VarCurr)<->v5191(VarCurr)&v5204(VarCurr)).
% 121.24/120.27  all VarCurr (v5204(VarCurr)<->v5205(VarCurr)|v5206(VarCurr)).
% 121.24/120.27  all VarCurr (v5206(VarCurr)<->v6(VarCurr,bitIndex0)&v5194(VarCurr)).
% 121.24/120.27  all VarCurr (v5205(VarCurr)<->v5193(VarCurr)&v6(VarCurr,bitIndex1)).
% 121.24/120.27  all VarCurr (v5202(VarCurr)<->v6(VarCurr,bitIndex2)&v5192(VarCurr)).
% 121.24/120.27  all VarCurr (v5199(VarCurr)<->v6(VarCurr,bitIndex3)&v5190(VarCurr)).
% 121.24/120.27  all VarCurr (v5196(VarCurr)<->v6(VarCurr,bitIndex4)&v5188(VarCurr)).
% 121.24/120.27  all VarCurr (v5186(VarCurr)<->v5187(VarCurr)&v5188(VarCurr)).
% 121.24/120.27  all VarCurr (v5188(VarCurr)<->v5189(VarCurr)&v5190(VarCurr)).
% 121.24/120.27  all VarCurr (v5190(VarCurr)<->v5191(VarCurr)&v5192(VarCurr)).
% 121.24/120.27  all VarCurr (v5192(VarCurr)<->v5193(VarCurr)&v5194(VarCurr)).
% 121.24/120.27  all VarCurr (-v5194(VarCurr)<->v6(VarCurr,bitIndex1)).
% 121.24/120.27  all VarCurr (-v5193(VarCurr)<->v6(VarCurr,bitIndex0)).
% 121.24/120.27  all VarCurr (-v5191(VarCurr)<->v6(VarCurr,bitIndex2)).
% 121.24/120.27  all VarCurr (-v5189(VarCurr)<->v6(VarCurr,bitIndex3)).
% 121.24/120.27  all VarCurr (-v5187(VarCurr)<->v6(VarCurr,bitIndex4)).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5170(VarNext)-> (all B (range_4_1(B)-> (v6(VarNext,B)<->v6(VarCurr,B)))))).
% 121.24/120.27  all B (range_4_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B).
% 121.24/120.27  all VarNext (v5170(VarNext)-> (v6(VarNext,bitIndex4)<->v5178(VarNext,bitIndex3))& (v6(VarNext,bitIndex3)<->v5178(VarNext,bitIndex2))& (v6(VarNext,bitIndex2)<->v5178(VarNext,bitIndex1))& (v6(VarNext,bitIndex1)<->v5178(VarNext,bitIndex0))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v5178(VarNext,B)<->v5176(VarCurr,B))))).
% 121.24/120.27  all VarCurr (-v5165(VarCurr)-> (v5176(VarCurr,bitIndex3)<->v19(VarCurr,bitIndex4))& (v5176(VarCurr,bitIndex2)<->v19(VarCurr,bitIndex3))& (v5176(VarCurr,bitIndex1)<->v19(VarCurr,bitIndex2))& (v5176(VarCurr,bitIndex0)<->v19(VarCurr,bitIndex1))).
% 121.24/120.27  all VarCurr (v5165(VarCurr)-> (all B (range_3_0(B)-> (v5176(VarCurr,B)<->$F)))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5170(VarNext)<->v5171(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5171(VarNext)<->v5173(VarNext)&v1252(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5173(VarNext)<->v1259(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5156(VarNext)-> (v6(VarNext,bitIndex0)<->v6(VarCurr,bitIndex0)))).
% 121.24/120.27  all VarNext (v5156(VarNext)-> (v6(VarNext,bitIndex0)<->v5164(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5164(VarNext)<->v5162(VarCurr))).
% 121.24/120.27  all VarCurr (-v5165(VarCurr)-> (v5162(VarCurr)<->v19(VarCurr,bitIndex0))).
% 121.24/120.27  all VarCurr (v5165(VarCurr)-> (v5162(VarCurr)<->$T)).
% 121.24/120.27  all VarCurr (-v5165(VarCurr)<->v8(VarCurr)).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5156(VarNext)<->v5157(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5157(VarNext)<->v5158(VarNext)&v1252(VarNext))).
% 121.24/120.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5158(VarNext)<->v1259(VarNext))).
% 121.24/120.27  all VarCurr (-v5144(VarCurr)-> (v19(VarCurr,bitIndex4)<->$F)).
% 121.24/120.27  all VarCurr (v5144(VarCurr)-> (v19(VarCurr,bitIndex4)<->$T)).
% 121.24/120.27  all VarCurr (v5144(VarCurr)<->v5145(VarCurr)|v5148(VarCurr)).
% 121.24/120.27  all VarCurr (v5148(VarCurr)<->v5149(VarCurr)&v1348(VarCurr)).
% 121.24/120.27  all VarCurr (v5149(VarCurr)<->v5150(VarCurr)|v5152(VarCurr)).
% 121.24/120.27  all VarCurr (v5152(VarCurr)<-> (v5153(VarCurr,bitIndex1)<->$F)& (v5153(VarCurr,bitIndex0)<->$T)).
% 121.24/120.27  all VarCurr (v5153(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.24/120.27  all VarCurr (v5153(VarCurr,bitIndex1)<->v1193(VarCurr)).
% 121.24/120.27  all VarCurr (v5150(VarCurr)<-> (v5151(VarCurr,bitIndex1)<->$F)& (v5151(VarCurr,bitIndex0)<->$F)).
% 121.24/120.27  all VarCurr (v5151(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.24/120.27  all VarCurr (v5151(VarCurr,bitIndex1)<->v1193(VarCurr)).
% 121.24/120.27  all VarCurr (v5145(VarCurr)<->v5146(VarCurr)&v1344(VarCurr)).
% 121.24/120.27  all VarCurr (v5146(VarCurr)<-> (v5147(VarCurr,bitIndex1)<->$T)& (v5147(VarCurr,bitIndex0)<->$F)).
% 121.24/120.27  all VarCurr (v5147(VarCurr,bitIndex0)<->v1193(VarCurr)).
% 121.24/120.27  all VarCurr (v5147(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.24/120.27  all VarCurr (-v5139(VarCurr)-> (v19(VarCurr,bitIndex3)<->$F)).
% 121.24/120.27  all VarCurr (v5139(VarCurr)-> (v19(VarCurr,bitIndex3)<->$T)).
% 121.24/120.27  all VarCurr (v5139(VarCurr)<->v5140(VarCurr)|v5142(VarCurr)).
% 121.24/120.27  all VarCurr (v5142(VarCurr)<->v1336(VarCurr)&v1344(VarCurr)).
% 121.24/120.27  all VarCurr (v5140(VarCurr)<->v5141(VarCurr)&v1323(VarCurr)).
% 121.24/120.27  all VarCurr (-v5141(VarCurr)<->v1193(VarCurr)).
% 121.24/120.27  all VarCurr (-v5135(VarCurr)-> (v19(VarCurr,bitIndex2)<->$F)).
% 121.24/120.27  all VarCurr (v5135(VarCurr)-> (v19(VarCurr,bitIndex2)<->$T)).
% 121.24/120.27  all VarCurr (v5135(VarCurr)<->v5136(VarCurr)|v5137(VarCurr)).
% 121.24/120.27  all VarCurr (v5137(VarCurr)<->v1326(VarCurr)&v1333(VarCurr)).
% 121.24/120.27  all VarCurr (v5136(VarCurr)<->v1193(VarCurr)&v1323(VarCurr)).
% 121.24/120.27  all VarCurr (-v5125(VarCurr)-> (v19(VarCurr,bitIndex1)<->$F)).
% 121.24/120.27  all VarCurr (v5125(VarCurr)-> (v19(VarCurr,bitIndex1)<->$T)).
% 121.24/120.27  all VarCurr (v5125(VarCurr)<->v5126(VarCurr)|v5133(VarCurr)).
% 121.24/120.27  all VarCurr (v5133(VarCurr)<->v1346(VarCurr)&v1348(VarCurr)).
% 121.24/120.27  all VarCurr (v5126(VarCurr)<->v5127(VarCurr)|v5131(VarCurr)).
% 121.24/120.27  all VarCurr (v5131(VarCurr)<->v5132(VarCurr)&v1344(VarCurr)).
% 121.24/120.27  all VarCurr (v5132(VarCurr)<->v1308(VarCurr)&v1342(VarCurr)).
% 121.24/120.27  all VarCurr (v5127(VarCurr)<->v5128(VarCurr)|v5130(VarCurr)).
% 121.24/120.27  all VarCurr (v5130(VarCurr)<->v1331(VarCurr)&v1333(VarCurr)).
% 121.24/120.27  all VarCurr (v5128(VarCurr)<->v5129(VarCurr)&v1322(VarCurr)).
% 121.24/120.27  all VarCurr (v5129(VarCurr)<->v1317(VarCurr)&v1320(VarCurr)).
% 121.24/120.27  all VarCurr (-v5112(VarCurr)-> (v19(VarCurr,bitIndex0)<->$F)).
% 121.24/120.27  all VarCurr (v5112(VarCurr)-> (v19(VarCurr,bitIndex0)<->$T)).
% 121.24/120.27  all VarCurr (v5112(VarCurr)<->v5113(VarCurr)|v5123(VarCurr)).
% 121.24/120.28  all VarCurr (v5123(VarCurr)<->v2061(VarCurr)&v1348(VarCurr)).
% 121.24/120.28  all VarCurr (v5113(VarCurr)<->v5114(VarCurr)|v5120(VarCurr)).
% 121.24/120.28  all VarCurr (v5120(VarCurr)<->v5121(VarCurr)&v1344(VarCurr)).
% 121.24/120.28  all VarCurr (v5121(VarCurr)<->v5122(VarCurr)&v1342(VarCurr)).
% 121.24/120.28  all VarCurr (-v5122(VarCurr)<->v1308(VarCurr)).
% 121.24/120.28  all VarCurr (v5114(VarCurr)<->v5115(VarCurr)|v5119(VarCurr)).
% 121.24/120.28  all VarCurr (v5119(VarCurr)<->v2056(VarCurr)&v1333(VarCurr)).
% 121.24/120.28  all VarCurr (v5115(VarCurr)<->v5116(VarCurr)&v1322(VarCurr)).
% 121.24/120.28  all VarCurr (v5116(VarCurr)<->v5117(VarCurr)|v5118(VarCurr)).
% 121.24/120.28  all VarCurr (-v5118(VarCurr)<->v1320(VarCurr)).
% 121.24/120.28  all VarCurr (v5117(VarCurr)<->v2053(VarCurr)&v1320(VarCurr)).
% 121.24/120.28  all VarCurr (-v23(VarCurr)-> (all B (range_1_0(B)-> (v21(VarCurr,B)<->v5093(VarCurr,B))))).
% 121.24/120.28  all VarCurr (v23(VarCurr)-> (all B (range_1_0(B)-> (v21(VarCurr,B)<->$F)))).
% 121.24/120.28  all VarCurr (-v5094(VarCurr)& -v5102(VarCurr)& -v5103(VarCurr)-> (all B (range_1_0(B)-> (v5093(VarCurr,B)<->$T)))).
% 121.24/120.28  all VarCurr (v5103(VarCurr)-> (all B (range_1_0(B)-> (v5093(VarCurr,B)<->$F)))).
% 121.24/120.28  all VarCurr (v5102(VarCurr)-> (all B (range_1_0(B)-> (v5093(VarCurr,B)<->b10(B))))).
% 121.24/120.28  all VarCurr (v5094(VarCurr)-> (all B (range_1_0(B)-> (v5093(VarCurr,B)<->b01(B))))).
% 121.24/120.28  all VarCurr (v5103(VarCurr)<->v5105(VarCurr)|v5110(VarCurr)).
% 121.24/120.28  all VarCurr (v5110(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$T)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$T)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr (v5105(VarCurr)<->v5106(VarCurr)|v5109(VarCurr)).
% 121.24/120.28  all VarCurr (v5109(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$T)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$T)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$T)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr (v5106(VarCurr)<->v5107(VarCurr)|v5108(VarCurr)).
% 121.24/120.28  all VarCurr (v5108(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$T)& (v2339(VarCurr,bitIndex5)<->$T)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr (v5107(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$T)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr (v5102(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$T)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$T)).
% 121.24/120.28  -b0001001(bitIndex6).
% 121.24/120.28  -b0001001(bitIndex5).
% 121.24/120.28  -b0001001(bitIndex4).
% 121.24/120.28  b0001001(bitIndex3).
% 121.24/120.28  -b0001001(bitIndex2).
% 121.24/120.28  -b0001001(bitIndex1).
% 121.24/120.28  b0001001(bitIndex0).
% 121.24/120.28  all VarCurr (v5094(VarCurr)<->v5096(VarCurr)|v5101(VarCurr)).
% 121.24/120.28  all VarCurr (v5101(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$T)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$T)).
% 121.24/120.28  -b0100001(bitIndex6).
% 121.24/120.28  b0100001(bitIndex5).
% 121.24/120.28  -b0100001(bitIndex4).
% 121.24/120.28  -b0100001(bitIndex3).
% 121.24/120.28  -b0100001(bitIndex2).
% 121.24/120.28  -b0100001(bitIndex1).
% 121.24/120.28  b0100001(bitIndex0).
% 121.24/120.28  all VarCurr (v5096(VarCurr)<->v5097(VarCurr)|v5100(VarCurr)).
% 121.24/120.28  all VarCurr (v5100(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$T)).
% 121.24/120.28  all VarCurr (v5097(VarCurr)<->v5098(VarCurr)|v5099(VarCurr)).
% 121.24/120.28  all VarCurr (v5099(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$T)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr (v5098(VarCurr)<-> (v2339(VarCurr,bitIndex6)<->$F)& (v2339(VarCurr,bitIndex5)<->$F)& (v2339(VarCurr,bitIndex4)<->$F)& (v2339(VarCurr,bitIndex3)<->$F)& (v2339(VarCurr,bitIndex2)<->$F)& (v2339(VarCurr,bitIndex1)<->$F)& (v2339(VarCurr,bitIndex0)<->$F)).
% 121.24/120.28  all VarCurr ((v2339(VarCurr,bitIndex6)<->v2149(VarCurr,bitIndex130))& (v2339(VarCurr,bitIndex5)<->v2149(VarCurr,bitIndex129))& (v2339(VarCurr,bitIndex4)<->v2149(VarCurr,bitIndex128))& (v2339(VarCurr,bitIndex3)<->v2149(VarCurr,bitIndex127))& (v2339(VarCurr,bitIndex2)<->v2149(VarCurr,bitIndex126))& (v2339(VarCurr,bitIndex1)<->v2149(VarCurr,bitIndex125))& (v2339(VarCurr,bitIndex0)<->v2149(VarCurr,bitIndex124))).
% 121.24/120.28  all VarCurr B (range_130_124(B)-> (v2149(VarCurr,B)<->v2151(VarCurr,B))).
% 121.24/120.28  all VarCurr ((v2151(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex523))& (v2151(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex522))& (v2151(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex521))& (v2151(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex520))& (v2151(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex519))& (v2151(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex518))& (v2151(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex517))).
% 121.24/120.28  all VarNext ((v2153(VarNext,bitIndex523)<->v5085(VarNext,bitIndex130))& (v2153(VarNext,bitIndex522)<->v5085(VarNext,bitIndex129))& (v2153(VarNext,bitIndex521)<->v5085(VarNext,bitIndex128))& (v2153(VarNext,bitIndex520)<->v5085(VarNext,bitIndex127))& (v2153(VarNext,bitIndex519)<->v5085(VarNext,bitIndex126))& (v2153(VarNext,bitIndex518)<->v5085(VarNext,bitIndex125))& (v2153(VarNext,bitIndex517)<->v5085(VarNext,bitIndex124))).
% 121.24/120.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5087(VarNext)-> (v5085(VarNext,bitIndex130)<->v2153(VarCurr,bitIndex523))& (v5085(VarNext,bitIndex129)<->v2153(VarCurr,bitIndex522))& (v5085(VarNext,bitIndex128)<->v2153(VarCurr,bitIndex521))& (v5085(VarNext,bitIndex127)<->v2153(VarCurr,bitIndex520))& (v5085(VarNext,bitIndex126)<->v2153(VarCurr,bitIndex519))& (v5085(VarNext,bitIndex125)<->v2153(VarCurr,bitIndex518))& (v5085(VarNext,bitIndex124)<->v2153(VarCurr,bitIndex517))& (v5085(VarNext,bitIndex123)<->v2153(VarCurr,bitIndex516))& (v5085(VarNext,bitIndex122)<->v2153(VarCurr,bitIndex515))& (v5085(VarNext,bitIndex121)<->v2153(VarCurr,bitIndex514))& (v5085(VarNext,bitIndex120)<->v2153(VarCurr,bitIndex513))& (v5085(VarNext,bitIndex119)<->v2153(VarCurr,bitIndex512))& (v5085(VarNext,bitIndex118)<->v2153(VarCurr,bitIndex511))& (v5085(VarNext,bitIndex117)<->v2153(VarCurr,bitIndex510))& (v5085(VarNext,bitIndex116)<->v2153(VarCurr,bitIndex509))& (v5085(VarNext,bitIndex115)<->v2153(VarCurr,bitIndex508))& (v5085(VarNext,bitIndex114)<->v2153(VarCurr,bitIndex507))& (v5085(VarNext,bitIndex113)<->v2153(VarCurr,bitIndex506))& (v5085(VarNext,bitIndex112)<->v2153(VarCurr,bitIndex505))& (v5085(VarNext,bitIndex111)<->v2153(VarCurr,bitIndex504))& (v5085(VarNext,bitIndex110)<->v2153(VarCurr,bitIndex503))& (v5085(VarNext,bitIndex109)<->v2153(VarCurr,bitIndex502))& (v5085(VarNext,bitIndex108)<->v2153(VarCurr,bitIndex501))& (v5085(VarNext,bitIndex107)<->v2153(VarCurr,bitIndex500))& (v5085(VarNext,bitIndex106)<->v2153(VarCurr,bitIndex499))& (v5085(VarNext,bitIndex105)<->v2153(VarCurr,bitIndex498))& (v5085(VarNext,bitIndex104)<->v2153(VarCurr,bitIndex497))& (v5085(VarNext,bitIndex103)<->v2153(VarCurr,bitIndex496))& (v5085(VarNext,bitIndex102)<->v2153(VarCurr,bitIndex495))& (v5085(VarNext,bitIndex101)<->v2153(VarCurr,bitIndex494))& (v5085(VarNext,bitIndex100)<->v2153(VarCurr,bitIndex493))& (v5085(VarNext,bitIndex99)<->v2153(VarCurr,bitIndex492))& 
(v5085(VarNext,bitIndex98)<->v2153(VarCurr,bitIndex491))& (v5085(VarNext,bitIndex97)<->v2153(VarCurr,bitIndex490))& (v5085(VarNext,bitIndex96)<->v2153(VarCurr,bitIndex489))& (v5085(VarNext,bitIndex95)<->v2153(VarCurr,bitIndex488))& (v5085(VarNext,bitIndex94)<->v2153(VarCurr,bitIndex487))& (v5085(VarNext,bitIndex93)<->v2153(VarCurr,bitIndex486))& (v5085(VarNext,bitIndex92)<->v2153(VarCurr,bitIndex485))& (v5085(VarNext,bitIndex91)<->v2153(VarCurr,bitIndex484))& (v5085(VarNext,bitIndex90)<->v2153(VarCurr,bitIndex483))& (v5085(VarNext,bitIndex89)<->v2153(VarCurr,bitIndex482))& (v5085(VarNext,bitIndex88)<->v2153(VarCurr,bitIndex481))& (v5085(VarNext,bitIndex87)<->v2153(VarCurr,bitIndex480))& (v5085(VarNext,bitIndex86)<->v2153(VarCurr,bitIndex479))& (v5085(VarNext,bitIndex85)<->v2153(VarCurr,bitIndex478))& (v5085(VarNext,bitIndex84)<->v2153(VarCurr,bitIndex477))& (v5085(VarNext,bitIndex83)<->v2153(VarCurr,bitIndex476))& (v5085(VarNext,bitIndex82)<->v2153(VarCurr,bitIndex475))& (v5085(VarNext,bitIndex81)<->v2153(VarCurr,bitIndex474))& (v5085(VarNext,bitIndex80)<->v2153(VarCurr,bitIndex473))& (v5085(VarNext,bitIndex79)<->v2153(VarCurr,bitIndex472))& (v5085(VarNext,bitIndex78)<->v2153(VarCurr,bitIndex471))& (v5085(VarNext,bitIndex77)<->v2153(VarCurr,bitIndex470))& (v5085(VarNext,bitIndex76)<->v2153(VarCurr,bitIndex469))& (v5085(VarNext,bitIndex75)<->v2153(VarCurr,bitIndex468))& (v5085(VarNext,bitIndex74)<->v2153(VarCurr,bitIndex467))& (v5085(VarNext,bitIndex73)<->v2153(VarCurr,bitIndex466))& (v5085(VarNext,bitIndex72)<->v2153(VarCurr,bitIndex465))& (v5085(VarNext,bitIndex71)<->v2153(VarCurr,bitIndex464))& (v5085(VarNext,bitIndex70)<->v2153(VarCurr,bitIndex463))& (v5085(VarNext,bitIndex69)<->v2153(VarCurr,bitIndex462))& (v5085(VarNext,bitIndex68)<->v2153(VarCurr,bitIndex461))& (v5085(VarNext,bitIndex67)<->v2153(VarCurr,bitIndex460))& (v5085(VarNext,bitIndex66)<->v2153(VarCurr,bitIndex459))& (v5085(VarNext,bitIndex65)<->v2153(VarCurr,bitIndex458))& 
(v5085(VarNext,bitIndex64)<->v2153(VarCurr,bitIndex457))& (v5085(VarNext,bitIndex63)<->v2153(VarCurr,bitIndex456))& (v5085(VarNext,bitIndex62)<->v2153(VarCurr,bitIndex455))& (v5085(VarNext,bitIndex61)<->v2153(VarCurr,bitIndex454))& (v5085(VarNext,bitIndex60)<->v2153(VarCurr,bitIndex453))& (v5085(VarNext,bitIndex59)<->v2153(VarCurr,bitIndex452))& (v5085(VarNext,bitIndex58)<->v2153(VarCurr,bitIndex451))& (v5085(VarNext,bitIndex57)<->v2153(VarCurr,bitIndex450))& (v5085(VarNext,bitIndex56)<->v2153(VarCurr,bitIndex449))& (v5085(VarNext,bitIndex55)<->v2153(VarCurr,bitIndex448))& (v5085(VarNext,bitIndex54)<->v2153(VarCurr,bitIndex447))& (v5085(VarNext,bitIndex53)<->v2153(VarCurr,bitIndex446))& (v5085(VarNext,bitIndex52)<->v2153(VarCurr,bitIndex445))& (v5085(VarNext,bitIndex51)<->v2153(VarCurr,bitIndex444))& (v5085(VarNext,bitIndex50)<->v2153(VarCurr,bitIndex443))& (v5085(VarNext,bitIndex49)<->v2153(VarCurr,bitIndex442))& (v5085(VarNext,bitIndex48)<->v2153(VarCurr,bitIndex441))& (v5085(VarNext,bitIndex47)<->v2153(VarCurr,bitIndex440))& (v5085(VarNext,bitIndex46)<->v2153(VarCurr,bitIndex439))& (v5085(VarNext,bitIndex45)<->v2153(VarCurr,bitIndex438))& (v5085(VarNext,bitIndex44)<->v2153(VarCurr,bitIndex437))& (v5085(VarNext,bitIndex43)<->v2153(VarCurr,bitIndex436))& (v5085(VarNext,bitIndex42)<->v2153(VarCurr,bitIndex435))& (v5085(VarNext,bitIndex41)<->v2153(VarCurr,bitIndex434))& (v5085(VarNext,bitIndex40)<->v2153(VarCurr,bitIndex433))& (v5085(VarNext,bitIndex39)<->v2153(VarCurr,bitIndex432))& (v5085(VarNext,bitIndex38)<->v2153(VarCurr,bitIndex431))& (v5085(VarNext,bitIndex37)<->v2153(VarCurr,bitIndex430))& (v5085(VarNext,bitIndex36)<->v2153(VarCurr,bitIndex429))& (v5085(VarNext,bitIndex35)<->v2153(VarCurr,bitIndex428))& (v5085(VarNext,bitIndex34)<->v2153(VarCurr,bitIndex427))& (v5085(VarNext,bitIndex33)<->v2153(VarCurr,bitIndex426))& (v5085(VarNext,bitIndex32)<->v2153(VarCurr,bitIndex425))& (v5085(VarNext,bitIndex31)<->v2153(VarCurr,bitIndex424))& 
(v5085(VarNext,bitIndex30)<->v2153(VarCurr,bitIndex423))& (v5085(VarNext,bitIndex29)<->v2153(VarCurr,bitIndex422))& (v5085(VarNext,bitIndex28)<->v2153(VarCurr,bitIndex421))& (v5085(VarNext,bitIndex27)<->v2153(VarCurr,bitIndex420))& (v5085(VarNext,bitIndex26)<->v2153(VarCurr,bitIndex419))& (v5085(VarNext,bitIndex25)<->v2153(VarCurr,bitIndex418))& (v5085(VarNext,bitIndex24)<->v2153(VarCurr,bitIndex417))& (v5085(VarNext,bitIndex23)<->v2153(VarCurr,bitIndex416))& (v5085(VarNext,bitIndex22)<->v2153(VarCurr,bitIndex415))& (v5085(VarNext,bitIndex21)<->v2153(VarCurr,bitIndex414))& (v5085(VarNext,bitIndex20)<->v2153(VarCurr,bitIndex413))& (v5085(VarNext,bitIndex19)<->v2153(VarCurr,bitIndex412))& (v5085(VarNext,bitIndex18)<->v2153(VarCurr,bitIndex411))& (v5085(VarNext,bitIndex17)<->v2153(VarCurr,bitIndex410))& (v5085(VarNext,bitIndex16)<->v2153(VarCurr,bitIndex409))& (v5085(VarNext,bitIndex15)<->v2153(VarCurr,bitIndex408))& (v5085(VarNext,bitIndex14)<->v2153(VarCurr,bitIndex407))& (v5085(VarNext,bitIndex13)<->v2153(VarCurr,bitIndex406))& (v5085(VarNext,bitIndex12)<->v2153(VarCurr,bitIndex405))& (v5085(VarNext,bitIndex11)<->v2153(VarCurr,bitIndex404))& (v5085(VarNext,bitIndex10)<->v2153(VarCurr,bitIndex403))& (v5085(VarNext,bitIndex9)<->v2153(VarCurr,bitIndex402))& (v5085(VarNext,bitIndex8)<->v2153(VarCurr,bitIndex401))& (v5085(VarNext,bitIndex7)<->v2153(VarCurr,bitIndex400))& (v5085(VarNext,bitIndex6)<->v2153(VarCurr,bitIndex399))& (v5085(VarNext,bitIndex5)<->v2153(VarCurr,bitIndex398))& (v5085(VarNext,bitIndex4)<->v2153(VarCurr,bitIndex397))& (v5085(VarNext,bitIndex3)<->v2153(VarCurr,bitIndex396))& (v5085(VarNext,bitIndex2)<->v2153(VarCurr,bitIndex395))& (v5085(VarNext,bitIndex1)<->v2153(VarCurr,bitIndex394))& (v5085(VarNext,bitIndex0)<->v2153(VarCurr,bitIndex393)))).
% 121.24/120.28  all VarNext (v5087(VarNext)-> (all B (range_130_0(B)-> (v5085(VarNext,B)<->v2292(VarNext,B))))).
% 121.24/120.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5087(VarNext)<->v5088(VarNext)&v2273(VarNext))).
% 121.24/120.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5088(VarNext)<->v5090(VarNext)&v2173(VarNext))).
% 121.24/120.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5090(VarNext)<->v2182(VarNext))).
% 121.24/120.28  all VarCurr B (range_130_124(B)-> (v2257(VarCurr,B)<->v2262(VarCurr,B))).
% 121.24/120.28  all VarCurr ((v2261(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex392))& (v2261(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex391))& (v2261(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex390))& (v2261(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex389))& (v2261(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex388))& (v2261(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex387))& (v2261(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex386))).
% 121.24/120.28  all VarCurr B (range_130_124(B)-> (v2163(VarCurr,B)<->v2255(VarCurr,B))).
% 121.24/120.28  all VarCurr ((v2246(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex523))& (v2246(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex522))& (v2246(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex521))& (v2246(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex520))& (v2246(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex519))& (v2246(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex518))& (v2246(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex517))).
% 121.24/120.29  all VarNext ((v2153(VarNext,bitIndex392)<->v5053(VarNext,bitIndex130))& (v2153(VarNext,bitIndex391)<->v5053(VarNext,bitIndex129))& (v2153(VarNext,bitIndex390)<->v5053(VarNext,bitIndex128))& (v2153(VarNext,bitIndex389)<->v5053(VarNext,bitIndex127))& (v2153(VarNext,bitIndex388)<->v5053(VarNext,bitIndex126))& (v2153(VarNext,bitIndex387)<->v5053(VarNext,bitIndex125))& (v2153(VarNext,bitIndex386)<->v5053(VarNext,bitIndex124))).
% 121.24/120.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5054(VarNext)-> (v5053(VarNext,bitIndex130)<->v2153(VarCurr,bitIndex392))& (v5053(VarNext,bitIndex129)<->v2153(VarCurr,bitIndex391))& (v5053(VarNext,bitIndex128)<->v2153(VarCurr,bitIndex390))& (v5053(VarNext,bitIndex127)<->v2153(VarCurr,bitIndex389))& (v5053(VarNext,bitIndex126)<->v2153(VarCurr,bitIndex388))& (v5053(VarNext,bitIndex125)<->v2153(VarCurr,bitIndex387))& (v5053(VarNext,bitIndex124)<->v2153(VarCurr,bitIndex386))& (v5053(VarNext,bitIndex123)<->v2153(VarCurr,bitIndex385))& (v5053(VarNext,bitIndex122)<->v2153(VarCurr,bitIndex384))& (v5053(VarNext,bitIndex121)<->v2153(VarCurr,bitIndex383))& (v5053(VarNext,bitIndex120)<->v2153(VarCurr,bitIndex382))& (v5053(VarNext,bitIndex119)<->v2153(VarCurr,bitIndex381))& (v5053(VarNext,bitIndex118)<->v2153(VarCurr,bitIndex380))& (v5053(VarNext,bitIndex117)<->v2153(VarCurr,bitIndex379))& (v5053(VarNext,bitIndex116)<->v2153(VarCurr,bitIndex378))& (v5053(VarNext,bitIndex115)<->v2153(VarCurr,bitIndex377))& (v5053(VarNext,bitIndex114)<->v2153(VarCurr,bitIndex376))& (v5053(VarNext,bitIndex113)<->v2153(VarCurr,bitIndex375))& (v5053(VarNext,bitIndex112)<->v2153(VarCurr,bitIndex374))& (v5053(VarNext,bitIndex111)<->v2153(VarCurr,bitIndex373))& (v5053(VarNext,bitIndex110)<->v2153(VarCurr,bitIndex372))& (v5053(VarNext,bitIndex109)<->v2153(VarCurr,bitIndex371))& (v5053(VarNext,bitIndex108)<->v2153(VarCurr,bitIndex370))& (v5053(VarNext,bitIndex107)<->v2153(VarCurr,bitIndex369))& (v5053(VarNext,bitIndex106)<->v2153(VarCurr,bitIndex368))& (v5053(VarNext,bitIndex105)<->v2153(VarCurr,bitIndex367))& (v5053(VarNext,bitIndex104)<->v2153(VarCurr,bitIndex366))& (v5053(VarNext,bitIndex103)<->v2153(VarCurr,bitIndex365))& (v5053(VarNext,bitIndex102)<->v2153(VarCurr,bitIndex364))& (v5053(VarNext,bitIndex101)<->v2153(VarCurr,bitIndex363))& (v5053(VarNext,bitIndex100)<->v2153(VarCurr,bitIndex362))& (v5053(VarNext,bitIndex99)<->v2153(VarCurr,bitIndex361))& 
(v5053(VarNext,bitIndex98)<->v2153(VarCurr,bitIndex360))& (v5053(VarNext,bitIndex97)<->v2153(VarCurr,bitIndex359))& (v5053(VarNext,bitIndex96)<->v2153(VarCurr,bitIndex358))& (v5053(VarNext,bitIndex95)<->v2153(VarCurr,bitIndex357))& (v5053(VarNext,bitIndex94)<->v2153(VarCurr,bitIndex356))& (v5053(VarNext,bitIndex93)<->v2153(VarCurr,bitIndex355))& (v5053(VarNext,bitIndex92)<->v2153(VarCurr,bitIndex354))& (v5053(VarNext,bitIndex91)<->v2153(VarCurr,bitIndex353))& (v5053(VarNext,bitIndex90)<->v2153(VarCurr,bitIndex352))& (v5053(VarNext,bitIndex89)<->v2153(VarCurr,bitIndex351))& (v5053(VarNext,bitIndex88)<->v2153(VarCurr,bitIndex350))& (v5053(VarNext,bitIndex87)<->v2153(VarCurr,bitIndex349))& (v5053(VarNext,bitIndex86)<->v2153(VarCurr,bitIndex348))& (v5053(VarNext,bitIndex85)<->v2153(VarCurr,bitIndex347))& (v5053(VarNext,bitIndex84)<->v2153(VarCurr,bitIndex346))& (v5053(VarNext,bitIndex83)<->v2153(VarCurr,bitIndex345))& (v5053(VarNext,bitIndex82)<->v2153(VarCurr,bitIndex344))& (v5053(VarNext,bitIndex81)<->v2153(VarCurr,bitIndex343))& (v5053(VarNext,bitIndex80)<->v2153(VarCurr,bitIndex342))& (v5053(VarNext,bitIndex79)<->v2153(VarCurr,bitIndex341))& (v5053(VarNext,bitIndex78)<->v2153(VarCurr,bitIndex340))& (v5053(VarNext,bitIndex77)<->v2153(VarCurr,bitIndex339))& (v5053(VarNext,bitIndex76)<->v2153(VarCurr,bitIndex338))& (v5053(VarNext,bitIndex75)<->v2153(VarCurr,bitIndex337))& (v5053(VarNext,bitIndex74)<->v2153(VarCurr,bitIndex336))& (v5053(VarNext,bitIndex73)<->v2153(VarCurr,bitIndex335))& (v5053(VarNext,bitIndex72)<->v2153(VarCurr,bitIndex334))& (v5053(VarNext,bitIndex71)<->v2153(VarCurr,bitIndex333))& (v5053(VarNext,bitIndex70)<->v2153(VarCurr,bitIndex332))& (v5053(VarNext,bitIndex69)<->v2153(VarCurr,bitIndex331))& (v5053(VarNext,bitIndex68)<->v2153(VarCurr,bitIndex330))& (v5053(VarNext,bitIndex67)<->v2153(VarCurr,bitIndex329))& (v5053(VarNext,bitIndex66)<->v2153(VarCurr,bitIndex328))& (v5053(VarNext,bitIndex65)<->v2153(VarCurr,bitIndex327))& 
(v5053(VarNext,bitIndex64)<->v2153(VarCurr,bitIndex326))& (v5053(VarNext,bitIndex63)<->v2153(VarCurr,bitIndex325))& (v5053(VarNext,bitIndex62)<->v2153(VarCurr,bitIndex324))& (v5053(VarNext,bitIndex61)<->v2153(VarCurr,bitIndex323))& (v5053(VarNext,bitIndex60)<->v2153(VarCurr,bitIndex322))& (v5053(VarNext,bitIndex59)<->v2153(VarCurr,bitIndex321))& (v5053(VarNext,bitIndex58)<->v2153(VarCurr,bitIndex320))& (v5053(VarNext,bitIndex57)<->v2153(VarCurr,bitIndex319))& (v5053(VarNext,bitIndex56)<->v2153(VarCurr,bitIndex318))& (v5053(VarNext,bitIndex55)<->v2153(VarCurr,bitIndex317))& (v5053(VarNext,bitIndex54)<->v2153(VarCurr,bitIndex316))& (v5053(VarNext,bitIndex53)<->v2153(VarCurr,bitIndex315))& (v5053(VarNext,bitIndex52)<->v2153(VarCurr,bitIndex314))& (v5053(VarNext,bitIndex51)<->v2153(VarCurr,bitIndex313))& (v5053(VarNext,bitIndex50)<->v2153(VarCurr,bitIndex312))& (v5053(VarNext,bitIndex49)<->v2153(VarCurr,bitIndex311))& (v5053(VarNext,bitIndex48)<->v2153(VarCurr,bitIndex310))& (v5053(VarNext,bitIndex47)<->v2153(VarCurr,bitIndex309))& (v5053(VarNext,bitIndex46)<->v2153(VarCurr,bitIndex308))& (v5053(VarNext,bitIndex45)<->v2153(VarCurr,bitIndex307))& (v5053(VarNext,bitIndex44)<->v2153(VarCurr,bitIndex306))& (v5053(VarNext,bitIndex43)<->v2153(VarCurr,bitIndex305))& (v5053(VarNext,bitIndex42)<->v2153(VarCurr,bitIndex304))& (v5053(VarNext,bitIndex41)<->v2153(VarCurr,bitIndex303))& (v5053(VarNext,bitIndex40)<->v2153(VarCurr,bitIndex302))& (v5053(VarNext,bitIndex39)<->v2153(VarCurr,bitIndex301))& (v5053(VarNext,bitIndex38)<->v2153(VarCurr,bitIndex300))& (v5053(VarNext,bitIndex37)<->v2153(VarCurr,bitIndex299))& (v5053(VarNext,bitIndex36)<->v2153(VarCurr,bitIndex298))& (v5053(VarNext,bitIndex35)<->v2153(VarCurr,bitIndex297))& (v5053(VarNext,bitIndex34)<->v2153(VarCurr,bitIndex296))& (v5053(VarNext,bitIndex33)<->v2153(VarCurr,bitIndex295))& (v5053(VarNext,bitIndex32)<->v2153(VarCurr,bitIndex294))& (v5053(VarNext,bitIndex31)<->v2153(VarCurr,bitIndex293))& 
(v5053(VarNext,bitIndex30)<->v2153(VarCurr,bitIndex292))& (v5053(VarNext,bitIndex29)<->v2153(VarCurr,bitIndex291))& (v5053(VarNext,bitIndex28)<->v2153(VarCurr,bitIndex290))& (v5053(VarNext,bitIndex27)<->v2153(VarCurr,bitIndex289))& (v5053(VarNext,bitIndex26)<->v2153(VarCurr,bitIndex288))& (v5053(VarNext,bitIndex25)<->v2153(VarCurr,bitIndex287))& (v5053(VarNext,bitIndex24)<->v2153(VarCurr,bitIndex286))& (v5053(VarNext,bitIndex23)<->v2153(VarCurr,bitIndex285))& (v5053(VarNext,bitIndex22)<->v2153(VarCurr,bitIndex284))& (v5053(VarNext,bitIndex21)<->v2153(VarCurr,bitIndex283))& (v5053(VarNext,bitIndex20)<->v2153(VarCurr,bitIndex282))& (v5053(VarNext,bitIndex19)<->v2153(VarCurr,bitIndex281))& (v5053(VarNext,bitIndex18)<->v2153(VarCurr,bitIndex280))& (v5053(VarNext,bitIndex17)<->v2153(VarCurr,bitIndex279))& (v5053(VarNext,bitIndex16)<->v2153(VarCurr,bitIndex278))& (v5053(VarNext,bitIndex15)<->v2153(VarCurr,bitIndex277))& (v5053(VarNext,bitIndex14)<->v2153(VarCurr,bitIndex276))& (v5053(VarNext,bitIndex13)<->v2153(VarCurr,bitIndex275))& (v5053(VarNext,bitIndex12)<->v2153(VarCurr,bitIndex274))& (v5053(VarNext,bitIndex11)<->v2153(VarCurr,bitIndex273))& (v5053(VarNext,bitIndex10)<->v2153(VarCurr,bitIndex272))& (v5053(VarNext,bitIndex9)<->v2153(VarCurr,bitIndex271))& (v5053(VarNext,bitIndex8)<->v2153(VarCurr,bitIndex270))& (v5053(VarNext,bitIndex7)<->v2153(VarCurr,bitIndex269))& (v5053(VarNext,bitIndex6)<->v2153(VarCurr,bitIndex268))& (v5053(VarNext,bitIndex5)<->v2153(VarCurr,bitIndex267))& (v5053(VarNext,bitIndex4)<->v2153(VarCurr,bitIndex266))& (v5053(VarNext,bitIndex3)<->v2153(VarCurr,bitIndex265))& (v5053(VarNext,bitIndex2)<->v2153(VarCurr,bitIndex264))& (v5053(VarNext,bitIndex1)<->v2153(VarCurr,bitIndex263))& (v5053(VarNext,bitIndex0)<->v2153(VarCurr,bitIndex262)))).
% 121.33/120.30  all VarNext (v5054(VarNext)-> (all B (range_130_0(B)-> (v5053(VarNext,B)<->v5080(VarNext,B))))).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v5080(VarNext,B)<->v5078(VarCurr,B))))).
% 121.33/120.30  all VarCurr (-v2275(VarCurr)-> (all B (range_130_0(B)-> (v5078(VarCurr,B)<->v5081(VarCurr,B))))).
% 121.33/120.30  all VarCurr (v2275(VarCurr)-> (all B (range_130_0(B)-> (v5078(VarCurr,B)<->$F)))).
% 121.33/120.30  all VarCurr (-v5067(VarCurr)& -v5069(VarCurr)-> (all B (range_130_0(B)-> (v5081(VarCurr,B)<->v5046(VarCurr,B))))).
% 121.33/120.30  all VarCurr (v5069(VarCurr)-> (all B (range_130_0(B)-> (v5081(VarCurr,B)<->v5039(VarCurr,B))))).
% 121.33/120.30  all VarCurr (v5067(VarCurr)-> (v5081(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex261))& (v5081(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex260))& (v5081(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex259))& (v5081(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex258))& (v5081(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex257))& (v5081(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex256))& (v5081(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex255))& (v5081(VarCurr,bitIndex123)<->v2153(VarCurr,bitIndex254))& (v5081(VarCurr,bitIndex122)<->v2153(VarCurr,bitIndex253))& (v5081(VarCurr,bitIndex121)<->v2153(VarCurr,bitIndex252))& (v5081(VarCurr,bitIndex120)<->v2153(VarCurr,bitIndex251))& (v5081(VarCurr,bitIndex119)<->v2153(VarCurr,bitIndex250))& (v5081(VarCurr,bitIndex118)<->v2153(VarCurr,bitIndex249))& (v5081(VarCurr,bitIndex117)<->v2153(VarCurr,bitIndex248))& (v5081(VarCurr,bitIndex116)<->v2153(VarCurr,bitIndex247))& (v5081(VarCurr,bitIndex115)<->v2153(VarCurr,bitIndex246))& (v5081(VarCurr,bitIndex114)<->v2153(VarCurr,bitIndex245))& (v5081(VarCurr,bitIndex113)<->v2153(VarCurr,bitIndex244))& (v5081(VarCurr,bitIndex112)<->v2153(VarCurr,bitIndex243))& (v5081(VarCurr,bitIndex111)<->v2153(VarCurr,bitIndex242))& (v5081(VarCurr,bitIndex110)<->v2153(VarCurr,bitIndex241))& (v5081(VarCurr,bitIndex109)<->v2153(VarCurr,bitIndex240))& (v5081(VarCurr,bitIndex108)<->v2153(VarCurr,bitIndex239))& (v5081(VarCurr,bitIndex107)<->v2153(VarCurr,bitIndex238))& (v5081(VarCurr,bitIndex106)<->v2153(VarCurr,bitIndex237))& (v5081(VarCurr,bitIndex105)<->v2153(VarCurr,bitIndex236))& (v5081(VarCurr,bitIndex104)<->v2153(VarCurr,bitIndex235))& (v5081(VarCurr,bitIndex103)<->v2153(VarCurr,bitIndex234))& (v5081(VarCurr,bitIndex102)<->v2153(VarCurr,bitIndex233))& (v5081(VarCurr,bitIndex101)<->v2153(VarCurr,bitIndex232))& (v5081(VarCurr,bitIndex100)<->v2153(VarCurr,bitIndex231))& (v5081(VarCurr,bitIndex99)<->v2153(VarCurr,bitIndex230))& (v5081(VarCurr,bitIndex98)<->v2153(VarCurr,bitIndex229))& 
(v5081(VarCurr,bitIndex97)<->v2153(VarCurr,bitIndex228))& (v5081(VarCurr,bitIndex96)<->v2153(VarCurr,bitIndex227))& (v5081(VarCurr,bitIndex95)<->v2153(VarCurr,bitIndex226))& (v5081(VarCurr,bitIndex94)<->v2153(VarCurr,bitIndex225))& (v5081(VarCurr,bitIndex93)<->v2153(VarCurr,bitIndex224))& (v5081(VarCurr,bitIndex92)<->v2153(VarCurr,bitIndex223))& (v5081(VarCurr,bitIndex91)<->v2153(VarCurr,bitIndex222))& (v5081(VarCurr,bitIndex90)<->v2153(VarCurr,bitIndex221))& (v5081(VarCurr,bitIndex89)<->v2153(VarCurr,bitIndex220))& (v5081(VarCurr,bitIndex88)<->v2153(VarCurr,bitIndex219))& (v5081(VarCurr,bitIndex87)<->v2153(VarCurr,bitIndex218))& (v5081(VarCurr,bitIndex86)<->v2153(VarCurr,bitIndex217))& (v5081(VarCurr,bitIndex85)<->v2153(VarCurr,bitIndex216))& (v5081(VarCurr,bitIndex84)<->v2153(VarCurr,bitIndex215))& (v5081(VarCurr,bitIndex83)<->v2153(VarCurr,bitIndex214))& (v5081(VarCurr,bitIndex82)<->v2153(VarCurr,bitIndex213))& (v5081(VarCurr,bitIndex81)<->v2153(VarCurr,bitIndex212))& (v5081(VarCurr,bitIndex80)<->v2153(VarCurr,bitIndex211))& (v5081(VarCurr,bitIndex79)<->v2153(VarCurr,bitIndex210))& (v5081(VarCurr,bitIndex78)<->v2153(VarCurr,bitIndex209))& (v5081(VarCurr,bitIndex77)<->v2153(VarCurr,bitIndex208))& (v5081(VarCurr,bitIndex76)<->v2153(VarCurr,bitIndex207))& (v5081(VarCurr,bitIndex75)<->v2153(VarCurr,bitIndex206))& (v5081(VarCurr,bitIndex74)<->v2153(VarCurr,bitIndex205))& (v5081(VarCurr,bitIndex73)<->v2153(VarCurr,bitIndex204))& (v5081(VarCurr,bitIndex72)<->v2153(VarCurr,bitIndex203))& (v5081(VarCurr,bitIndex71)<->v2153(VarCurr,bitIndex202))& (v5081(VarCurr,bitIndex70)<->v2153(VarCurr,bitIndex201))& (v5081(VarCurr,bitIndex69)<->v2153(VarCurr,bitIndex200))& (v5081(VarCurr,bitIndex68)<->v2153(VarCurr,bitIndex199))& (v5081(VarCurr,bitIndex67)<->v2153(VarCurr,bitIndex198))& (v5081(VarCurr,bitIndex66)<->v2153(VarCurr,bitIndex197))& (v5081(VarCurr,bitIndex65)<->v2153(VarCurr,bitIndex196))& (v5081(VarCurr,bitIndex64)<->v2153(VarCurr,bitIndex195))& 
(v5081(VarCurr,bitIndex63)<->v2153(VarCurr,bitIndex194))& (v5081(VarCurr,bitIndex62)<->v2153(VarCurr,bitIndex193))& (v5081(VarCurr,bitIndex61)<->v2153(VarCurr,bitIndex192))& (v5081(VarCurr,bitIndex60)<->v2153(VarCurr,bitIndex191))& (v5081(VarCurr,bitIndex59)<->v2153(VarCurr,bitIndex190))& (v5081(VarCurr,bitIndex58)<->v2153(VarCurr,bitIndex189))& (v5081(VarCurr,bitIndex57)<->v2153(VarCurr,bitIndex188))& (v5081(VarCurr,bitIndex56)<->v2153(VarCurr,bitIndex187))& (v5081(VarCurr,bitIndex55)<->v2153(VarCurr,bitIndex186))& (v5081(VarCurr,bitIndex54)<->v2153(VarCurr,bitIndex185))& (v5081(VarCurr,bitIndex53)<->v2153(VarCurr,bitIndex184))& (v5081(VarCurr,bitIndex52)<->v2153(VarCurr,bitIndex183))& (v5081(VarCurr,bitIndex51)<->v2153(VarCurr,bitIndex182))& (v5081(VarCurr,bitIndex50)<->v2153(VarCurr,bitIndex181))& (v5081(VarCurr,bitIndex49)<->v2153(VarCurr,bitIndex180))& (v5081(VarCurr,bitIndex48)<->v2153(VarCurr,bitIndex179))& (v5081(VarCurr,bitIndex47)<->v2153(VarCurr,bitIndex178))& (v5081(VarCurr,bitIndex46)<->v2153(VarCurr,bitIndex177))& (v5081(VarCurr,bitIndex45)<->v2153(VarCurr,bitIndex176))& (v5081(VarCurr,bitIndex44)<->v2153(VarCurr,bitIndex175))& (v5081(VarCurr,bitIndex43)<->v2153(VarCurr,bitIndex174))& (v5081(VarCurr,bitIndex42)<->v2153(VarCurr,bitIndex173))& (v5081(VarCurr,bitIndex41)<->v2153(VarCurr,bitIndex172))& (v5081(VarCurr,bitIndex40)<->v2153(VarCurr,bitIndex171))& (v5081(VarCurr,bitIndex39)<->v2153(VarCurr,bitIndex170))& (v5081(VarCurr,bitIndex38)<->v2153(VarCurr,bitIndex169))& (v5081(VarCurr,bitIndex37)<->v2153(VarCurr,bitIndex168))& (v5081(VarCurr,bitIndex36)<->v2153(VarCurr,bitIndex167))& (v5081(VarCurr,bitIndex35)<->v2153(VarCurr,bitIndex166))& (v5081(VarCurr,bitIndex34)<->v2153(VarCurr,bitIndex165))& (v5081(VarCurr,bitIndex33)<->v2153(VarCurr,bitIndex164))& (v5081(VarCurr,bitIndex32)<->v2153(VarCurr,bitIndex163))& (v5081(VarCurr,bitIndex31)<->v2153(VarCurr,bitIndex162))& (v5081(VarCurr,bitIndex30)<->v2153(VarCurr,bitIndex161))& 
(v5081(VarCurr,bitIndex29)<->v2153(VarCurr,bitIndex160))& (v5081(VarCurr,bitIndex28)<->v2153(VarCurr,bitIndex159))& (v5081(VarCurr,bitIndex27)<->v2153(VarCurr,bitIndex158))& (v5081(VarCurr,bitIndex26)<->v2153(VarCurr,bitIndex157))& (v5081(VarCurr,bitIndex25)<->v2153(VarCurr,bitIndex156))& (v5081(VarCurr,bitIndex24)<->v2153(VarCurr,bitIndex155))& (v5081(VarCurr,bitIndex23)<->v2153(VarCurr,bitIndex154))& (v5081(VarCurr,bitIndex22)<->v2153(VarCurr,bitIndex153))& (v5081(VarCurr,bitIndex21)<->v2153(VarCurr,bitIndex152))& (v5081(VarCurr,bitIndex20)<->v2153(VarCurr,bitIndex151))& (v5081(VarCurr,bitIndex19)<->v2153(VarCurr,bitIndex150))& (v5081(VarCurr,bitIndex18)<->v2153(VarCurr,bitIndex149))& (v5081(VarCurr,bitIndex17)<->v2153(VarCurr,bitIndex148))& (v5081(VarCurr,bitIndex16)<->v2153(VarCurr,bitIndex147))& (v5081(VarCurr,bitIndex15)<->v2153(VarCurr,bitIndex146))& (v5081(VarCurr,bitIndex14)<->v2153(VarCurr,bitIndex145))& (v5081(VarCurr,bitIndex13)<->v2153(VarCurr,bitIndex144))& (v5081(VarCurr,bitIndex12)<->v2153(VarCurr,bitIndex143))& (v5081(VarCurr,bitIndex11)<->v2153(VarCurr,bitIndex142))& (v5081(VarCurr,bitIndex10)<->v2153(VarCurr,bitIndex141))& (v5081(VarCurr,bitIndex9)<->v2153(VarCurr,bitIndex140))& (v5081(VarCurr,bitIndex8)<->v2153(VarCurr,bitIndex139))& (v5081(VarCurr,bitIndex7)<->v2153(VarCurr,bitIndex138))& (v5081(VarCurr,bitIndex6)<->v2153(VarCurr,bitIndex137))& (v5081(VarCurr,bitIndex5)<->v2153(VarCurr,bitIndex136))& (v5081(VarCurr,bitIndex4)<->v2153(VarCurr,bitIndex135))& (v5081(VarCurr,bitIndex3)<->v2153(VarCurr,bitIndex134))& (v5081(VarCurr,bitIndex2)<->v2153(VarCurr,bitIndex133))& (v5081(VarCurr,bitIndex1)<->v2153(VarCurr,bitIndex132))& (v5081(VarCurr,bitIndex0)<->v2153(VarCurr,bitIndex131))).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5054(VarNext)<->v5055(VarNext)&v5062(VarNext))).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5062(VarNext)<->v5060(VarCurr))).
% 121.33/120.30  all VarCurr (v5060(VarCurr)<->v5063(VarCurr)&v5074(VarCurr)).
% 121.33/120.30  all VarCurr (v5074(VarCurr)<->v5075(VarCurr)|v2275(VarCurr)).
% 121.33/120.30  all VarCurr (-v5075(VarCurr)<->v5076(VarCurr)).
% 121.33/120.30  all VarCurr (v5076(VarCurr)<-> (v5077(VarCurr,bitIndex1)<->$F)& (v5077(VarCurr,bitIndex0)<->$F)).
% 121.33/120.30  all VarCurr (v5077(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.30  all VarCurr (v5077(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.30  all VarCurr (v5063(VarCurr)<->v2275(VarCurr)|v5064(VarCurr)).
% 121.33/120.30  all VarCurr (v5064(VarCurr)<->v5065(VarCurr)&v5073(VarCurr)).
% 121.33/120.30  all VarCurr (-v5073(VarCurr)<->v2275(VarCurr)).
% 121.33/120.30  all VarCurr (v5065(VarCurr)<->v5066(VarCurr)|v5071(VarCurr)).
% 121.33/120.30  all VarCurr (v5071(VarCurr)<-> (v5072(VarCurr,bitIndex1)<->$T)& (v5072(VarCurr,bitIndex0)<->$T)).
% 121.33/120.30  all VarCurr (v5072(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.30  all VarCurr (v5072(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.30  all VarCurr (v5066(VarCurr)<->v5067(VarCurr)|v5069(VarCurr)).
% 121.33/120.30  all VarCurr (v5069(VarCurr)<-> (v5070(VarCurr,bitIndex1)<->$T)& (v5070(VarCurr,bitIndex0)<->$F)).
% 121.33/120.30  all VarCurr (v5070(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.30  all VarCurr (v5070(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.30  all VarCurr (v5067(VarCurr)<-> (v5068(VarCurr,bitIndex1)<->$F)& (v5068(VarCurr,bitIndex0)<->$T)).
% 121.33/120.30  all VarCurr (v5068(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.30  all VarCurr (v5068(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5055(VarNext)<->v5057(VarNext)&v2173(VarNext))).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5057(VarNext)<->v2182(VarNext))).
% 121.33/120.30  all VarCurr B (range_130_124(B)-> (v5046(VarCurr,B)<->v5051(VarCurr,B))).
% 121.33/120.30  all VarCurr (-v5048(VarCurr)-> (all B (range_130_0(B)-> (v5051(VarCurr,B)<->v5050(VarCurr,B))))).
% 121.33/120.30  all VarCurr (v5048(VarCurr)-> (all B (range_130_0(B)-> (v5051(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.33/120.30  all VarCurr ((v5050(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex261))& (v5050(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex260))& (v5050(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex259))& (v5050(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex258))& (v5050(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex257))& (v5050(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex256))& (v5050(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex255))).
% 121.33/120.30  all VarCurr (v5048(VarCurr)<->v2167(VarCurr,bitIndex2)).
% 121.33/120.30  all VarCurr B (range_130_124(B)-> (v5039(VarCurr,B)<->v5044(VarCurr,B))).
% 121.33/120.30  all VarCurr (-v5041(VarCurr)-> (all B (range_130_0(B)-> (v5044(VarCurr,B)<->v5043(VarCurr,B))))).
% 121.33/120.30  all VarCurr (v5041(VarCurr)-> (all B (range_130_0(B)-> (v5044(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.33/120.30  all VarCurr ((v5043(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex392))& (v5043(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex391))& (v5043(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex390))& (v5043(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex389))& (v5043(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex388))& (v5043(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex387))& (v5043(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex386))).
% 121.33/120.30  all VarCurr (v5041(VarCurr)<->v2167(VarCurr,bitIndex2)).
% 121.33/120.30  all VarNext ((v2153(VarNext,bitIndex261)<->v5007(VarNext,bitIndex130))& (v2153(VarNext,bitIndex260)<->v5007(VarNext,bitIndex129))& (v2153(VarNext,bitIndex259)<->v5007(VarNext,bitIndex128))& (v2153(VarNext,bitIndex258)<->v5007(VarNext,bitIndex127))& (v2153(VarNext,bitIndex257)<->v5007(VarNext,bitIndex126))& (v2153(VarNext,bitIndex256)<->v5007(VarNext,bitIndex125))& (v2153(VarNext,bitIndex255)<->v5007(VarNext,bitIndex124))).
% 121.33/120.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5008(VarNext)-> (v5007(VarNext,bitIndex130)<->v2153(VarCurr,bitIndex261))& (v5007(VarNext,bitIndex129)<->v2153(VarCurr,bitIndex260))& (v5007(VarNext,bitIndex128)<->v2153(VarCurr,bitIndex259))& (v5007(VarNext,bitIndex127)<->v2153(VarCurr,bitIndex258))& (v5007(VarNext,bitIndex126)<->v2153(VarCurr,bitIndex257))& (v5007(VarNext,bitIndex125)<->v2153(VarCurr,bitIndex256))& (v5007(VarNext,bitIndex124)<->v2153(VarCurr,bitIndex255))& (v5007(VarNext,bitIndex123)<->v2153(VarCurr,bitIndex254))& (v5007(VarNext,bitIndex122)<->v2153(VarCurr,bitIndex253))& (v5007(VarNext,bitIndex121)<->v2153(VarCurr,bitIndex252))& (v5007(VarNext,bitIndex120)<->v2153(VarCurr,bitIndex251))& (v5007(VarNext,bitIndex119)<->v2153(VarCurr,bitIndex250))& (v5007(VarNext,bitIndex118)<->v2153(VarCurr,bitIndex249))& (v5007(VarNext,bitIndex117)<->v2153(VarCurr,bitIndex248))& (v5007(VarNext,bitIndex116)<->v2153(VarCurr,bitIndex247))& (v5007(VarNext,bitIndex115)<->v2153(VarCurr,bitIndex246))& (v5007(VarNext,bitIndex114)<->v2153(VarCurr,bitIndex245))& (v5007(VarNext,bitIndex113)<->v2153(VarCurr,bitIndex244))& (v5007(VarNext,bitIndex112)<->v2153(VarCurr,bitIndex243))& (v5007(VarNext,bitIndex111)<->v2153(VarCurr,bitIndex242))& (v5007(VarNext,bitIndex110)<->v2153(VarCurr,bitIndex241))& (v5007(VarNext,bitIndex109)<->v2153(VarCurr,bitIndex240))& (v5007(VarNext,bitIndex108)<->v2153(VarCurr,bitIndex239))& (v5007(VarNext,bitIndex107)<->v2153(VarCurr,bitIndex238))& (v5007(VarNext,bitIndex106)<->v2153(VarCurr,bitIndex237))& (v5007(VarNext,bitIndex105)<->v2153(VarCurr,bitIndex236))& (v5007(VarNext,bitIndex104)<->v2153(VarCurr,bitIndex235))& (v5007(VarNext,bitIndex103)<->v2153(VarCurr,bitIndex234))& (v5007(VarNext,bitIndex102)<->v2153(VarCurr,bitIndex233))& (v5007(VarNext,bitIndex101)<->v2153(VarCurr,bitIndex232))& (v5007(VarNext,bitIndex100)<->v2153(VarCurr,bitIndex231))& (v5007(VarNext,bitIndex99)<->v2153(VarCurr,bitIndex230))& 
(v5007(VarNext,bitIndex98)<->v2153(VarCurr,bitIndex229))& (v5007(VarNext,bitIndex97)<->v2153(VarCurr,bitIndex228))& (v5007(VarNext,bitIndex96)<->v2153(VarCurr,bitIndex227))& (v5007(VarNext,bitIndex95)<->v2153(VarCurr,bitIndex226))& (v5007(VarNext,bitIndex94)<->v2153(VarCurr,bitIndex225))& (v5007(VarNext,bitIndex93)<->v2153(VarCurr,bitIndex224))& (v5007(VarNext,bitIndex92)<->v2153(VarCurr,bitIndex223))& (v5007(VarNext,bitIndex91)<->v2153(VarCurr,bitIndex222))& (v5007(VarNext,bitIndex90)<->v2153(VarCurr,bitIndex221))& (v5007(VarNext,bitIndex89)<->v2153(VarCurr,bitIndex220))& (v5007(VarNext,bitIndex88)<->v2153(VarCurr,bitIndex219))& (v5007(VarNext,bitIndex87)<->v2153(VarCurr,bitIndex218))& (v5007(VarNext,bitIndex86)<->v2153(VarCurr,bitIndex217))& (v5007(VarNext,bitIndex85)<->v2153(VarCurr,bitIndex216))& (v5007(VarNext,bitIndex84)<->v2153(VarCurr,bitIndex215))& (v5007(VarNext,bitIndex83)<->v2153(VarCurr,bitIndex214))& (v5007(VarNext,bitIndex82)<->v2153(VarCurr,bitIndex213))& (v5007(VarNext,bitIndex81)<->v2153(VarCurr,bitIndex212))& (v5007(VarNext,bitIndex80)<->v2153(VarCurr,bitIndex211))& (v5007(VarNext,bitIndex79)<->v2153(VarCurr,bitIndex210))& (v5007(VarNext,bitIndex78)<->v2153(VarCurr,bitIndex209))& (v5007(VarNext,bitIndex77)<->v2153(VarCurr,bitIndex208))& (v5007(VarNext,bitIndex76)<->v2153(VarCurr,bitIndex207))& (v5007(VarNext,bitIndex75)<->v2153(VarCurr,bitIndex206))& (v5007(VarNext,bitIndex74)<->v2153(VarCurr,bitIndex205))& (v5007(VarNext,bitIndex73)<->v2153(VarCurr,bitIndex204))& (v5007(VarNext,bitIndex72)<->v2153(VarCurr,bitIndex203))& (v5007(VarNext,bitIndex71)<->v2153(VarCurr,bitIndex202))& (v5007(VarNext,bitIndex70)<->v2153(VarCurr,bitIndex201))& (v5007(VarNext,bitIndex69)<->v2153(VarCurr,bitIndex200))& (v5007(VarNext,bitIndex68)<->v2153(VarCurr,bitIndex199))& (v5007(VarNext,bitIndex67)<->v2153(VarCurr,bitIndex198))& (v5007(VarNext,bitIndex66)<->v2153(VarCurr,bitIndex197))& (v5007(VarNext,bitIndex65)<->v2153(VarCurr,bitIndex196))& 
(v5007(VarNext,bitIndex64)<->v2153(VarCurr,bitIndex195))& (v5007(VarNext,bitIndex63)<->v2153(VarCurr,bitIndex194))& (v5007(VarNext,bitIndex62)<->v2153(VarCurr,bitIndex193))& (v5007(VarNext,bitIndex61)<->v2153(VarCurr,bitIndex192))& (v5007(VarNext,bitIndex60)<->v2153(VarCurr,bitIndex191))& (v5007(VarNext,bitIndex59)<->v2153(VarCurr,bitIndex190))& (v5007(VarNext,bitIndex58)<->v2153(VarCurr,bitIndex189))& (v5007(VarNext,bitIndex57)<->v2153(VarCurr,bitIndex188))& (v5007(VarNext,bitIndex56)<->v2153(VarCurr,bitIndex187))& (v5007(VarNext,bitIndex55)<->v2153(VarCurr,bitIndex186))& (v5007(VarNext,bitIndex54)<->v2153(VarCurr,bitIndex185))& (v5007(VarNext,bitIndex53)<->v2153(VarCurr,bitIndex184))& (v5007(VarNext,bitIndex52)<->v2153(VarCurr,bitIndex183))& (v5007(VarNext,bitIndex51)<->v2153(VarCurr,bitIndex182))& (v5007(VarNext,bitIndex50)<->v2153(VarCurr,bitIndex181))& (v5007(VarNext,bitIndex49)<->v2153(VarCurr,bitIndex180))& (v5007(VarNext,bitIndex48)<->v2153(VarCurr,bitIndex179))& (v5007(VarNext,bitIndex47)<->v2153(VarCurr,bitIndex178))& (v5007(VarNext,bitIndex46)<->v2153(VarCurr,bitIndex177))& (v5007(VarNext,bitIndex45)<->v2153(VarCurr,bitIndex176))& (v5007(VarNext,bitIndex44)<->v2153(VarCurr,bitIndex175))& (v5007(VarNext,bitIndex43)<->v2153(VarCurr,bitIndex174))& (v5007(VarNext,bitIndex42)<->v2153(VarCurr,bitIndex173))& (v5007(VarNext,bitIndex41)<->v2153(VarCurr,bitIndex172))& (v5007(VarNext,bitIndex40)<->v2153(VarCurr,bitIndex171))& (v5007(VarNext,bitIndex39)<->v2153(VarCurr,bitIndex170))& (v5007(VarNext,bitIndex38)<->v2153(VarCurr,bitIndex169))& (v5007(VarNext,bitIndex37)<->v2153(VarCurr,bitIndex168))& (v5007(VarNext,bitIndex36)<->v2153(VarCurr,bitIndex167))& (v5007(VarNext,bitIndex35)<->v2153(VarCurr,bitIndex166))& (v5007(VarNext,bitIndex34)<->v2153(VarCurr,bitIndex165))& (v5007(VarNext,bitIndex33)<->v2153(VarCurr,bitIndex164))& (v5007(VarNext,bitIndex32)<->v2153(VarCurr,bitIndex163))& (v5007(VarNext,bitIndex31)<->v2153(VarCurr,bitIndex162))& 
(v5007(VarNext,bitIndex30)<->v2153(VarCurr,bitIndex161))& (v5007(VarNext,bitIndex29)<->v2153(VarCurr,bitIndex160))& (v5007(VarNext,bitIndex28)<->v2153(VarCurr,bitIndex159))& (v5007(VarNext,bitIndex27)<->v2153(VarCurr,bitIndex158))& (v5007(VarNext,bitIndex26)<->v2153(VarCurr,bitIndex157))& (v5007(VarNext,bitIndex25)<->v2153(VarCurr,bitIndex156))& (v5007(VarNext,bitIndex24)<->v2153(VarCurr,bitIndex155))& (v5007(VarNext,bitIndex23)<->v2153(VarCurr,bitIndex154))& (v5007(VarNext,bitIndex22)<->v2153(VarCurr,bitIndex153))& (v5007(VarNext,bitIndex21)<->v2153(VarCurr,bitIndex152))& (v5007(VarNext,bitIndex20)<->v2153(VarCurr,bitIndex151))& (v5007(VarNext,bitIndex19)<->v2153(VarCurr,bitIndex150))& (v5007(VarNext,bitIndex18)<->v2153(VarCurr,bitIndex149))& (v5007(VarNext,bitIndex17)<->v2153(VarCurr,bitIndex148))& (v5007(VarNext,bitIndex16)<->v2153(VarCurr,bitIndex147))& (v5007(VarNext,bitIndex15)<->v2153(VarCurr,bitIndex146))& (v5007(VarNext,bitIndex14)<->v2153(VarCurr,bitIndex145))& (v5007(VarNext,bitIndex13)<->v2153(VarCurr,bitIndex144))& (v5007(VarNext,bitIndex12)<->v2153(VarCurr,bitIndex143))& (v5007(VarNext,bitIndex11)<->v2153(VarCurr,bitIndex142))& (v5007(VarNext,bitIndex10)<->v2153(VarCurr,bitIndex141))& (v5007(VarNext,bitIndex9)<->v2153(VarCurr,bitIndex140))& (v5007(VarNext,bitIndex8)<->v2153(VarCurr,bitIndex139))& (v5007(VarNext,bitIndex7)<->v2153(VarCurr,bitIndex138))& (v5007(VarNext,bitIndex6)<->v2153(VarCurr,bitIndex137))& (v5007(VarNext,bitIndex5)<->v2153(VarCurr,bitIndex136))& (v5007(VarNext,bitIndex4)<->v2153(VarCurr,bitIndex135))& (v5007(VarNext,bitIndex3)<->v2153(VarCurr,bitIndex134))& (v5007(VarNext,bitIndex2)<->v2153(VarCurr,bitIndex133))& (v5007(VarNext,bitIndex1)<->v2153(VarCurr,bitIndex132))& (v5007(VarNext,bitIndex0)<->v2153(VarCurr,bitIndex131)))).
% 121.33/120.31  all VarNext (v5008(VarNext)-> (all B (range_130_0(B)-> (v5007(VarNext,B)<->v5034(VarNext,B))))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v5034(VarNext,B)<->v5032(VarCurr,B))))).
% 121.33/120.31  all VarCurr (-v2275(VarCurr)-> (all B (range_130_0(B)-> (v5032(VarCurr,B)<->v5035(VarCurr,B))))).
% 121.33/120.31  all VarCurr (v2275(VarCurr)-> (all B (range_130_0(B)-> (v5032(VarCurr,B)<->$F)))).
% 121.33/120.31  all VarCurr (-v5021(VarCurr)& -v5023(VarCurr)-> (all B (range_130_0(B)-> (v5035(VarCurr,B)<->v5000(VarCurr,B))))).
% 121.33/120.31  all VarCurr (v5023(VarCurr)-> (all B (range_130_0(B)-> (v5035(VarCurr,B)<->v4993(VarCurr,B))))).
% 121.33/120.31  all VarCurr (v5021(VarCurr)-> (all B (range_130_0(B)-> (v5035(VarCurr,B)<->v2153(VarCurr,B))))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5008(VarNext)<->v5009(VarNext)&v5016(VarNext))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5016(VarNext)<->v5014(VarCurr))).
% 121.33/120.31  all VarCurr (v5014(VarCurr)<->v5017(VarCurr)&v5028(VarCurr)).
% 121.33/120.31  all VarCurr (v5028(VarCurr)<->v5029(VarCurr)|v2275(VarCurr)).
% 121.33/120.31  all VarCurr (-v5029(VarCurr)<->v5030(VarCurr)).
% 121.33/120.31  all VarCurr (v5030(VarCurr)<-> (v5031(VarCurr,bitIndex1)<->$F)& (v5031(VarCurr,bitIndex0)<->$F)).
% 121.33/120.31  all VarCurr (v5031(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.31  all VarCurr (v5031(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.31  all VarCurr (v5017(VarCurr)<->v2275(VarCurr)|v5018(VarCurr)).
% 121.33/120.31  all VarCurr (v5018(VarCurr)<->v5019(VarCurr)&v5027(VarCurr)).
% 121.33/120.31  all VarCurr (-v5027(VarCurr)<->v2275(VarCurr)).
% 121.33/120.31  all VarCurr (v5019(VarCurr)<->v5020(VarCurr)|v5025(VarCurr)).
% 121.33/120.31  all VarCurr (v5025(VarCurr)<-> (v5026(VarCurr,bitIndex1)<->$T)& (v5026(VarCurr,bitIndex0)<->$T)).
% 121.33/120.31  all VarCurr (v5026(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.31  all VarCurr (v5026(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.31  all VarCurr (v5020(VarCurr)<->v5021(VarCurr)|v5023(VarCurr)).
% 121.33/120.31  all VarCurr (v5023(VarCurr)<-> (v5024(VarCurr,bitIndex1)<->$T)& (v5024(VarCurr,bitIndex0)<->$F)).
% 121.33/120.31  all VarCurr (v5024(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.31  all VarCurr (v5024(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.31  all VarCurr (v5021(VarCurr)<-> (v5022(VarCurr,bitIndex1)<->$F)& (v5022(VarCurr,bitIndex0)<->$T)).
% 121.33/120.31  all VarCurr (v5022(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.31  all VarCurr (v5022(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v5009(VarNext)<->v5011(VarNext)&v2173(VarNext))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v5011(VarNext)<->v2182(VarNext))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v5000(VarCurr,B)<->v5005(VarCurr,B))).
% 121.33/120.31  all VarCurr (-v5002(VarCurr)-> (all B (range_130_0(B)-> (v5005(VarCurr,B)<->v5004(VarCurr,B))))).
% 121.33/120.31  all VarCurr (v5002(VarCurr)-> (all B (range_130_0(B)-> (v5005(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v5004(VarCurr,B)<->v2153(VarCurr,B))).
% 121.33/120.31  all VarCurr (v5002(VarCurr)<->v2167(VarCurr,bitIndex3)).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v4993(VarCurr,B)<->v4998(VarCurr,B))).
% 121.33/120.31  all VarCurr (-v4995(VarCurr)-> (all B (range_130_0(B)-> (v4998(VarCurr,B)<->v4997(VarCurr,B))))).
% 121.33/120.31  all VarCurr (v4995(VarCurr)-> (all B (range_130_0(B)-> (v4998(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.33/120.31  all VarCurr ((v4997(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex261))& (v4997(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex260))& (v4997(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex259))& (v4997(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex258))& (v4997(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex257))& (v4997(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex256))& (v4997(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex255))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v2234(VarCurr,B)<->v2236(VarCurr,B))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v2236(VarCurr,B)<->v2238(VarCurr,B))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v2238(VarCurr,B)<->v2240(VarCurr,B))).
% 121.33/120.31  all VarCurr B (range_130_124(B)-> (v2240(VarCurr,B)<->v2243(VarCurr,B))).
% 121.33/120.31  all B (range_130_124(B)<->bitIndex124=B|bitIndex125=B|bitIndex126=B|bitIndex127=B|bitIndex128=B|bitIndex129=B|bitIndex130=B).
% 121.33/120.31  all VarCurr (v4995(VarCurr)<->v2167(VarCurr,bitIndex3)).
% 121.33/120.31  all VarCurr (v23(VarCurr)<->v25(VarCurr)).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4969(VarNext)-> (v25(VarNext)<->v25(VarCurr)))).
% 121.33/120.31  all VarNext (v4969(VarNext)-> (v25(VarNext)<->v4987(VarNext))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4987(VarNext)<->v4985(VarCurr))).
% 121.33/120.31  all VarCurr (-v4984(VarCurr)-> (v4985(VarCurr)<->v4988(VarCurr))).
% 121.33/120.31  all VarCurr (v4984(VarCurr)-> (v4985(VarCurr)<->$T)).
% 121.33/120.31  all VarCurr (-v29(VarCurr)-> (v4988(VarCurr)<->$T)).
% 121.33/120.31  all VarCurr (v29(VarCurr)-> (v4988(VarCurr)<->$F)).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4969(VarNext)<->v4970(VarNext)&v4977(VarNext))).
% 121.33/120.31  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4977(VarNext)<->v4975(VarCurr))).
% 121.33/120.31  all VarCurr (v4975(VarCurr)<->v4978(VarCurr)|v4984(VarCurr)).
% 121.33/120.32  all VarCurr (-v4984(VarCurr)<->v27(VarCurr)).
% 121.33/120.32  all VarCurr (v4978(VarCurr)<->v4979(VarCurr)|v29(VarCurr)).
% 121.33/120.32  all VarCurr (v4979(VarCurr)<->v4980(VarCurr)&v4983(VarCurr)).
% 121.33/120.32  all VarCurr (v4983(VarCurr)<-> (v2169(VarCurr,bitIndex0)<->$T)).
% 121.33/120.32  all VarCurr (v4980(VarCurr)<->v4981(VarCurr)&v4982(VarCurr)).
% 121.33/120.32  all VarCurr (v4982(VarCurr)<-> (v4917(VarCurr,bitIndex1)<->$F)).
% 121.33/120.32  all VarCurr (v4981(VarCurr)<-> (v2155(VarCurr)<->$T)).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4970(VarNext)<->v4971(VarNext)&v2173(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4971(VarNext)<->v2182(VarNext))).
% 121.33/120.32  all VarCurr (v2169(VarCurr,bitIndex0)<->v2192(VarCurr,bitIndex0)).
% 121.33/120.32  all VarNext (v4917(VarNext,bitIndex1)<->v4960(VarNext,bitIndex1)).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4961(VarNext)-> (all B (range_3_0(B)-> (v4960(VarNext,B)<->v4917(VarCurr,B)))))).
% 121.33/120.32  all VarNext (v4961(VarNext)-> (all B (range_3_0(B)-> (v4960(VarNext,B)<->v4948(VarNext,B))))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4961(VarNext)<->v4962(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4962(VarNext)<->v4964(VarNext)&v2173(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4964(VarNext)<->v2182(VarNext))).
% 121.33/120.32  all VarCurr (v4919(VarCurr,bitIndex1)<->v4924(VarCurr,bitIndex1)).
% 121.33/120.32  all VarCurr (v4921(VarCurr,bitIndex1)<->v4922(VarCurr,bitIndex1)).
% 121.33/120.32  all VarNext (v4917(VarNext,bitIndex0)<->v4952(VarNext,bitIndex0)).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4953(VarNext)-> (all B (range_3_0(B)-> (v4952(VarNext,B)<->v4917(VarCurr,B)))))).
% 121.33/120.32  all VarNext (v4953(VarNext)-> (all B (range_3_0(B)-> (v4952(VarNext,B)<->v4948(VarNext,B))))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4953(VarNext)<->v4954(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4954(VarNext)<->v4956(VarNext)&v2173(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4956(VarNext)<->v2182(VarNext))).
% 121.33/120.32  all VarCurr (v4919(VarCurr,bitIndex0)<->v4924(VarCurr,bitIndex0)).
% 121.33/120.32  all VarNext (v4917(VarNext,bitIndex2)<->v4939(VarNext,bitIndex2)).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4940(VarNext)-> (all B (range_3_0(B)-> (v4939(VarNext,B)<->v4917(VarCurr,B)))))).
% 121.33/120.32  all VarNext (v4940(VarNext)-> (all B (range_3_0(B)-> (v4939(VarNext,B)<->v4948(VarNext,B))))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v4948(VarNext,B)<->v4946(VarCurr,B))))).
% 121.33/120.32  all VarCurr (-v2189(VarCurr)-> (all B (range_3_0(B)-> (v4946(VarCurr,B)<->v4919(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v2189(VarCurr)-> (all B (range_3_0(B)-> (v4946(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4940(VarNext)<->v4941(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4941(VarNext)<->v4943(VarNext)&v2173(VarNext))).
% 121.33/120.32  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4943(VarNext)<->v2182(VarNext))).
% 121.33/120.32  all VarCurr (v4919(VarCurr,bitIndex2)<->v4924(VarCurr,bitIndex2)).
% 121.33/120.32  all VarCurr (-v4925(VarCurr)-> (all B (range_3_0(B)-> (v4924(VarCurr,B)<->v4926(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4925(VarCurr)-> (all B (range_3_0(B)-> (v4924(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarCurr (-v4927(VarCurr)& -v4929(VarCurr)& -v4933(VarCurr)-> (all B (range_3_0(B)-> (v4926(VarCurr,B)<->v4917(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4933(VarCurr)-> (all B (range_3_0(B)-> (v4926(VarCurr,B)<->v4935(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4929(VarCurr)-> (all B (range_3_0(B)-> (v4926(VarCurr,B)<->v4931(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4927(VarCurr)-> (all B (range_3_0(B)-> (v4926(VarCurr,B)<->v4917(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4936(VarCurr)<-> (v4937(VarCurr,bitIndex1)<->$T)& (v4937(VarCurr,bitIndex0)<->$T)).
% 121.33/120.32  all VarCurr (v4937(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.32  all VarCurr (v4937(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.32  all VarCurr (v4935(VarCurr,bitIndex0)<->$T).
% 121.33/120.32  all VarCurr B (range_3_1(B)-> (v4935(VarCurr,B)<->v4921(VarCurr,B))).
% 121.33/120.32  all VarCurr (v4933(VarCurr)<-> (v4934(VarCurr,bitIndex1)<->$T)& (v4934(VarCurr,bitIndex0)<->$F)).
% 121.33/120.32  all VarCurr (v4934(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.32  all VarCurr (v4934(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.32  all VarCurr ((v4931(VarCurr,bitIndex2)<->v4917(VarCurr,bitIndex3))& (v4931(VarCurr,bitIndex1)<->v4917(VarCurr,bitIndex2))& (v4931(VarCurr,bitIndex0)<->v4917(VarCurr,bitIndex1))).
% 121.33/120.32  all VarCurr (v4931(VarCurr,bitIndex3)<->$F).
% 121.33/120.32  all VarCurr (v4929(VarCurr)<-> (v4930(VarCurr,bitIndex1)<->$F)& (v4930(VarCurr,bitIndex0)<->$T)).
% 121.33/120.32  all VarCurr (v4930(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.32  all VarCurr (v4930(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.32  all VarCurr (v4927(VarCurr)<-> (v4928(VarCurr,bitIndex1)<->$F)& (v4928(VarCurr,bitIndex0)<->$F)).
% 121.33/120.32  all VarCurr (v4928(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.33/120.32  all VarCurr (v4928(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.33/120.32  all VarCurr (-v4925(VarCurr)<->v27(VarCurr)).
% 121.33/120.32  all VarCurr (v4921(VarCurr,bitIndex2)<->v4922(VarCurr,bitIndex2)).
% 121.33/120.32  all VarCurr (v4922(VarCurr,bitIndex0)<->$F).
% 121.33/120.32  all VarCurr ((v4922(VarCurr,bitIndex3)<->v4917(VarCurr,bitIndex2))& (v4922(VarCurr,bitIndex2)<->v4917(VarCurr,bitIndex1))& (v4922(VarCurr,bitIndex1)<->v4917(VarCurr,bitIndex0))).
% 121.33/120.32  all B (range_3_0(B)-> (v4917(constB0,B)<->$F)).
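The axioms above (v4924 through v4935, v4922, and the constB0 initialisation) encode a 4-bit register with a synchronous clear and a two-bit operation select. A minimal Python sketch of that case split, reading off the biconditionals directly — the function name, the boolean-list encoding of bit vectors, and the treatment of v4921's bit 3 (whose defining axiom is outside this excerpt) are illustrative assumptions, not part of the problem file:

```python
def next_state(v27, v29, v2155, v4917):
    """Sketch of the v4924/v4926 update for the 4-bit register v4917.

    v4917 is a list [bit0, bit1, bit2, bit3]. Selector bits follow the
    axioms: v4925 <-> not v27 (clear), and (v29, v2155) pick the branch
    via v4927 (0,0), v4929 (0,1), v4933 (1,0), default (1,1).
    """
    if not v27:                       # v4925: clear, v4924 := 0000
        return [False] * 4
    if not v29 and not v2155:         # v4927: hold v4917
        return list(v4917)
    if not v29 and v2155:             # v4929: v4931 = right shift, MSB <- 0
        return [v4917[1], v4917[2], v4917[3], False]
    if v29 and not v2155:             # v4933: v4935 = left shift, LSB <- 1
        return [True, v4917[0], v4917[1], v4917[2]]
    return list(v4917)                # remaining (1,1) case: hold v4917
```

Under this reading, the register starts at 0000 (the constB0 axiom) and shifts a 1 in from the bottom or a 0 in from the top depending on the selector.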
% 121.33/120.32  all VarCurr (v29(VarCurr)<->v31(VarCurr)).
% 121.33/120.32  all VarCurr (v31(VarCurr)<->v33(VarCurr)).
% 121.33/120.32  all VarCurr (v33(VarCurr)<->v35(VarCurr)).
% 121.33/120.32  all VarCurr (-v4909(VarCurr)-> (v35(VarCurr)<->$F)).
% 121.33/120.32  all VarCurr (v4909(VarCurr)-> (v35(VarCurr)<->v4914(VarCurr))).
% 121.33/120.32  all VarCurr (-v4911(VarCurr)-> (v4914(VarCurr)<->$F)).
% 121.33/120.32  all VarCurr (v4911(VarCurr)-> (v4914(VarCurr)<->v4915(VarCurr))).
% 121.33/120.32  all VarCurr (-v4421(VarCurr)-> (v4915(VarCurr)<->$F)).
% 121.33/120.32  all VarCurr (v4421(VarCurr)-> (v4915(VarCurr)<->$T)).
% 121.33/120.32  all VarCurr (v4909(VarCurr)<->v4910(VarCurr)&v4913(VarCurr)).
% 121.33/120.32  all VarCurr (-v4913(VarCurr)<->v4274(VarCurr)).
% 121.33/120.32  all VarCurr (v4910(VarCurr)<->v4911(VarCurr)|v4912(VarCurr)).
% 121.33/120.32  all VarCurr (-v4912(VarCurr)<->v4271(VarCurr)).
% 121.33/120.32  all VarCurr (v4911(VarCurr)<->v4486(VarCurr)&v4271(VarCurr)).
% 121.33/120.32  all VarCurr (-v39(VarCurr)-> (all B (range_1_0(B)-> (v37(VarCurr,B)<->v4839(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v39(VarCurr)-> (all B (range_1_0(B)-> (v37(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarCurr (-v4840(VarCurr)& -v4877(VarCurr)& -v4885(VarCurr)& -v4893(VarCurr)-> (all B (range_1_0(B)-> (v4839(VarCurr,B)<->$T)))).
% 121.33/120.32  all VarCurr (v4893(VarCurr)-> (all B (range_1_0(B)-> (v4839(VarCurr,B)<->b01(B))))).
% 121.33/120.32  all VarCurr (v4885(VarCurr)-> (all B (range_1_0(B)-> (v4839(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarCurr (v4877(VarCurr)-> (all B (range_1_0(B)-> (v4839(VarCurr,B)<->v4884(VarCurr,B))))).
% 121.33/120.32  all VarCurr (v4840(VarCurr)-> (all B (range_1_0(B)-> (v4839(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarCurr (v4893(VarCurr)<->v4895(VarCurr)|v4904(VarCurr)).
% 121.33/120.32  all VarCurr (v4904(VarCurr)<->v4906(VarCurr)&v4859(VarCurr)).
% 121.33/120.32  all VarCurr (v4906(VarCurr)<->v4907(VarCurr)&v4564(VarCurr,bitIndex5)).
% 121.33/120.32  all VarCurr (v4907(VarCurr)<->v4856(VarCurr)&v4564(VarCurr,bitIndex4)).
% 121.33/120.32  all VarCurr (v4895(VarCurr)<->v4896(VarCurr)|v4901(VarCurr)).
% 121.33/120.32  all VarCurr (v4901(VarCurr)<->v4903(VarCurr)&v4564(VarCurr,bitIndex6)).
% 121.33/120.32  all VarCurr (v4903(VarCurr)<->v4899(VarCurr)&v4564(VarCurr,bitIndex5)).
% 121.33/120.32  all VarCurr (v4896(VarCurr)<->v4898(VarCurr)&v4564(VarCurr,bitIndex6)).
% 121.33/120.32  all VarCurr (v4898(VarCurr)<->v4899(VarCurr)&v4858(VarCurr)).
% 121.33/120.32  all VarCurr (v4899(VarCurr)<->v4900(VarCurr)&v4564(VarCurr,bitIndex4)).
% 121.33/120.32  all VarCurr (v4900(VarCurr)<->v4851(VarCurr)&v4564(VarCurr,bitIndex3)).
% 121.33/120.32  all VarCurr (v4885(VarCurr)<->v4886(VarCurr)|v4888(VarCurr)).
% 121.33/120.32  all VarCurr (v4888(VarCurr)<->v4890(VarCurr)&v4859(VarCurr)).
% 121.33/120.32  all VarCurr (v4890(VarCurr)<->v4891(VarCurr)&v4858(VarCurr)).
% 121.33/120.32  all VarCurr (v4891(VarCurr)<->v4892(VarCurr)&v4857(VarCurr)).
% 121.33/120.32  all VarCurr (v4892(VarCurr)<->v4868(VarCurr)&v4564(VarCurr,bitIndex3)).
% 121.33/120.32  all VarCurr (v4886(VarCurr)<->v4879(VarCurr)&v4564(VarCurr,bitIndex6)).
% 121.33/120.32  all VarCurr (-v4777(VarCurr)-> (all B (range_1_0(B)-> (v4884(VarCurr,B)<->$F)))).
% 121.33/120.32  all VarCurr (v4777(VarCurr)-> (all B (range_1_0(B)-> (v4884(VarCurr,B)<->b10(B))))).
% 121.33/120.32  all VarCurr (v4877(VarCurr)<->v4879(VarCurr)&v4859(VarCurr)).
% 121.33/120.32  all VarCurr (v4879(VarCurr)<->v4880(VarCurr)&v4858(VarCurr)).
% 121.33/120.32  all VarCurr (v4880(VarCurr)<->v4881(VarCurr)&v4857(VarCurr)).
% 121.33/120.32  all VarCurr (v4881(VarCurr)<->v4882(VarCurr)&v4564(VarCurr,bitIndex3)).
% 121.33/120.32  all VarCurr (v4882(VarCurr)<->v4883(VarCurr)&v4855(VarCurr)).
% 121.33/120.32  all VarCurr (v4883(VarCurr)<->v4853(VarCurr)&v4564(VarCurr,bitIndex1)).
% 121.33/120.33  all VarCurr (v4840(VarCurr)<->v4842(VarCurr)|v4875(VarCurr)).
% 121.33/120.33  all VarCurr (v4875(VarCurr)<->v4862(VarCurr)&v4564(VarCurr,bitIndex6)).
% 121.33/120.33  all VarCurr (v4842(VarCurr)<->v4843(VarCurr)|v4873(VarCurr)).
% 121.33/120.33  all VarCurr (v4873(VarCurr)<->v4848(VarCurr)&v4564(VarCurr,bitIndex6)).
% 121.33/120.33  all VarCurr (v4843(VarCurr)<->v4844(VarCurr)|v4870(VarCurr)).
% 121.33/120.33  all VarCurr (v4870(VarCurr)<->v4872(VarCurr)&v4859(VarCurr)).
% 121.33/120.33  all VarCurr (v4872(VarCurr)<->v4866(VarCurr)&v4564(VarCurr,bitIndex5)).
% 121.33/120.33  all VarCurr (v4844(VarCurr)<->v4845(VarCurr)|v4863(VarCurr)).
% 121.33/120.33  all VarCurr (v4863(VarCurr)<->v4865(VarCurr)&v4859(VarCurr)).
% 121.33/120.33  all VarCurr (v4865(VarCurr)<->v4866(VarCurr)&v4858(VarCurr)).
% 121.33/120.33  all VarCurr (v4866(VarCurr)<->v4867(VarCurr)&v4857(VarCurr)).
% 121.33/120.33  all VarCurr (v4867(VarCurr)<->v4868(VarCurr)&v4856(VarCurr)).
% 121.33/120.33  all VarCurr (v4868(VarCurr)<->v4869(VarCurr)&v4855(VarCurr)).
% 121.33/120.33  all VarCurr (v4869(VarCurr)<->v4564(VarCurr,bitIndex0)&v4854(VarCurr)).
% 121.33/120.33  all VarCurr (v4845(VarCurr)<->v4846(VarCurr)|v4860(VarCurr)).
% 121.33/120.33  all VarCurr (v4860(VarCurr)<->v4862(VarCurr)&v4859(VarCurr)).
% 121.33/120.33  all VarCurr (v4862(VarCurr)<->v4849(VarCurr)&v4564(VarCurr,bitIndex5)).
% 121.33/120.33  all VarCurr (v4846(VarCurr)<->v4848(VarCurr)&v4859(VarCurr)).
% 121.33/120.33  all VarCurr (-v4859(VarCurr)<->v4564(VarCurr,bitIndex6)).
% 121.33/120.33  all VarCurr (v4848(VarCurr)<->v4849(VarCurr)&v4858(VarCurr)).
% 121.33/120.33  all VarCurr (-v4858(VarCurr)<->v4564(VarCurr,bitIndex5)).
% 121.33/120.33  all VarCurr (v4849(VarCurr)<->v4850(VarCurr)&v4857(VarCurr)).
% 121.33/120.33  all VarCurr (-v4857(VarCurr)<->v4564(VarCurr,bitIndex4)).
% 121.33/120.33  all VarCurr (v4850(VarCurr)<->v4851(VarCurr)&v4856(VarCurr)).
% 121.33/120.33  all VarCurr (-v4856(VarCurr)<->v4564(VarCurr,bitIndex3)).
% 121.33/120.33  all VarCurr (v4851(VarCurr)<->v4852(VarCurr)&v4855(VarCurr)).
% 121.33/120.33  all VarCurr (-v4855(VarCurr)<->v4564(VarCurr,bitIndex2)).
% 121.33/120.33  all VarCurr (v4852(VarCurr)<->v4853(VarCurr)&v4854(VarCurr)).
% 121.33/120.33  all VarCurr (-v4854(VarCurr)<->v4564(VarCurr,bitIndex1)).
% 121.33/120.33  all VarCurr (-v4853(VarCurr)<->v4564(VarCurr,bitIndex0)).
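The chain just above (v4846 through v4859) is a cascaded zero-detector over the 7-bit vector v4564: v4853..v4859 negate bits 0..6, and each vNNNN conjunction extends an "all bits so far are zero" test by one bit. A short Python sketch of that cascade — the function name and list encoding are illustrative, not from the source:

```python
def decode_chain(v4564):
    """Sketch of the v4846..v4859 cascade over the 7-bit vector v4564.

    v4564 is a list of 7 booleans (bit 0 first). Returns the value of
    v4846, which the axioms make true exactly when all seven bits are 0.
    """
    neg = [not b for b in v4564]   # v4853 (bit0) .. v4859 (bit6)
    v4852 = neg[0] and neg[1]      # bits 0..1 zero
    v4851 = v4852 and neg[2]       # bits 0..2 zero
    v4850 = v4851 and neg[3]       # bits 0..3 zero
    v4849 = v4850 and neg[4]       # bits 0..4 zero
    v4848 = v4849 and neg[5]       # bits 0..5 zero
    return v4848 and neg[6]        # v4846: bits 0..6 zero
```

The other selectors in this region (v4877, v4885, v4893, ...) reuse the same intermediate terms with some negations dropped, so together they decode specific values of v4564.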
% 121.33/120.33  all VarCurr (v4777(VarCurr)<->v2244(VarCurr,bitIndex81)).
% 121.33/120.33  all VarCurr (v2244(VarCurr,bitIndex81)<->v4567(VarCurr,bitIndex81)).
% 121.33/120.33  all VarCurr (v4567(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex696)).
% 121.33/120.33  all VarNext (v4569(VarNext,bitIndex696)<->v4831(VarNext,bitIndex81)).
% 121.33/120.33  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4833(VarNext)-> (v4831(VarNext,bitIndex122)<->v4569(VarCurr,bitIndex737))& (v4831(VarNext,bitIndex121)<->v4569(VarCurr,bitIndex736))& (v4831(VarNext,bitIndex120)<->v4569(VarCurr,bitIndex735))& (v4831(VarNext,bitIndex119)<->v4569(VarCurr,bitIndex734))& (v4831(VarNext,bitIndex118)<->v4569(VarCurr,bitIndex733))& (v4831(VarNext,bitIndex117)<->v4569(VarCurr,bitIndex732))& (v4831(VarNext,bitIndex116)<->v4569(VarCurr,bitIndex731))& (v4831(VarNext,bitIndex115)<->v4569(VarCurr,bitIndex730))& (v4831(VarNext,bitIndex114)<->v4569(VarCurr,bitIndex729))& (v4831(VarNext,bitIndex113)<->v4569(VarCurr,bitIndex728))& (v4831(VarNext,bitIndex112)<->v4569(VarCurr,bitIndex727))& (v4831(VarNext,bitIndex111)<->v4569(VarCurr,bitIndex726))& (v4831(VarNext,bitIndex110)<->v4569(VarCurr,bitIndex725))& (v4831(VarNext,bitIndex109)<->v4569(VarCurr,bitIndex724))& (v4831(VarNext,bitIndex108)<->v4569(VarCurr,bitIndex723))& (v4831(VarNext,bitIndex107)<->v4569(VarCurr,bitIndex722))& (v4831(VarNext,bitIndex106)<->v4569(VarCurr,bitIndex721))& (v4831(VarNext,bitIndex105)<->v4569(VarCurr,bitIndex720))& (v4831(VarNext,bitIndex104)<->v4569(VarCurr,bitIndex719))& (v4831(VarNext,bitIndex103)<->v4569(VarCurr,bitIndex718))& (v4831(VarNext,bitIndex102)<->v4569(VarCurr,bitIndex717))& (v4831(VarNext,bitIndex101)<->v4569(VarCurr,bitIndex716))& (v4831(VarNext,bitIndex100)<->v4569(VarCurr,bitIndex715))& (v4831(VarNext,bitIndex99)<->v4569(VarCurr,bitIndex714))& (v4831(VarNext,bitIndex98)<->v4569(VarCurr,bitIndex713))& (v4831(VarNext,bitIndex97)<->v4569(VarCurr,bitIndex712))& (v4831(VarNext,bitIndex96)<->v4569(VarCurr,bitIndex711))& (v4831(VarNext,bitIndex95)<->v4569(VarCurr,bitIndex710))& (v4831(VarNext,bitIndex94)<->v4569(VarCurr,bitIndex709))& (v4831(VarNext,bitIndex93)<->v4569(VarCurr,bitIndex708))& (v4831(VarNext,bitIndex92)<->v4569(VarCurr,bitIndex707))& (v4831(VarNext,bitIndex91)<->v4569(VarCurr,bitIndex706))& 
(v4831(VarNext,bitIndex90)<->v4569(VarCurr,bitIndex705))& (v4831(VarNext,bitIndex89)<->v4569(VarCurr,bitIndex704))& (v4831(VarNext,bitIndex88)<->v4569(VarCurr,bitIndex703))& (v4831(VarNext,bitIndex87)<->v4569(VarCurr,bitIndex702))& (v4831(VarNext,bitIndex86)<->v4569(VarCurr,bitIndex701))& (v4831(VarNext,bitIndex85)<->v4569(VarCurr,bitIndex700))& (v4831(VarNext,bitIndex84)<->v4569(VarCurr,bitIndex699))& (v4831(VarNext,bitIndex83)<->v4569(VarCurr,bitIndex698))& (v4831(VarNext,bitIndex82)<->v4569(VarCurr,bitIndex697))& (v4831(VarNext,bitIndex81)<->v4569(VarCurr,bitIndex696))& (v4831(VarNext,bitIndex80)<->v4569(VarCurr,bitIndex695))& (v4831(VarNext,bitIndex79)<->v4569(VarCurr,bitIndex694))& (v4831(VarNext,bitIndex78)<->v4569(VarCurr,bitIndex693))& (v4831(VarNext,bitIndex77)<->v4569(VarCurr,bitIndex692))& (v4831(VarNext,bitIndex76)<->v4569(VarCurr,bitIndex691))& (v4831(VarNext,bitIndex75)<->v4569(VarCurr,bitIndex690))& (v4831(VarNext,bitIndex74)<->v4569(VarCurr,bitIndex689))& (v4831(VarNext,bitIndex73)<->v4569(VarCurr,bitIndex688))& (v4831(VarNext,bitIndex72)<->v4569(VarCurr,bitIndex687))& (v4831(VarNext,bitIndex71)<->v4569(VarCurr,bitIndex686))& (v4831(VarNext,bitIndex70)<->v4569(VarCurr,bitIndex685))& (v4831(VarNext,bitIndex69)<->v4569(VarCurr,bitIndex684))& (v4831(VarNext,bitIndex68)<->v4569(VarCurr,bitIndex683))& (v4831(VarNext,bitIndex67)<->v4569(VarCurr,bitIndex682))& (v4831(VarNext,bitIndex66)<->v4569(VarCurr,bitIndex681))& (v4831(VarNext,bitIndex65)<->v4569(VarCurr,bitIndex680))& (v4831(VarNext,bitIndex64)<->v4569(VarCurr,bitIndex679))& (v4831(VarNext,bitIndex63)<->v4569(VarCurr,bitIndex678))& (v4831(VarNext,bitIndex62)<->v4569(VarCurr,bitIndex677))& (v4831(VarNext,bitIndex61)<->v4569(VarCurr,bitIndex676))& (v4831(VarNext,bitIndex60)<->v4569(VarCurr,bitIndex675))& (v4831(VarNext,bitIndex59)<->v4569(VarCurr,bitIndex674))& (v4831(VarNext,bitIndex58)<->v4569(VarCurr,bitIndex673))& (v4831(VarNext,bitIndex57)<->v4569(VarCurr,bitIndex672))& 
(v4831(VarNext,bitIndex56)<->v4569(VarCurr,bitIndex671))& (v4831(VarNext,bitIndex55)<->v4569(VarCurr,bitIndex670))& (v4831(VarNext,bitIndex54)<->v4569(VarCurr,bitIndex669))& (v4831(VarNext,bitIndex53)<->v4569(VarCurr,bitIndex668))& (v4831(VarNext,bitIndex52)<->v4569(VarCurr,bitIndex667))& (v4831(VarNext,bitIndex51)<->v4569(VarCurr,bitIndex666))& (v4831(VarNext,bitIndex50)<->v4569(VarCurr,bitIndex665))& (v4831(VarNext,bitIndex49)<->v4569(VarCurr,bitIndex664))& (v4831(VarNext,bitIndex48)<->v4569(VarCurr,bitIndex663))& (v4831(VarNext,bitIndex47)<->v4569(VarCurr,bitIndex662))& (v4831(VarNext,bitIndex46)<->v4569(VarCurr,bitIndex661))& (v4831(VarNext,bitIndex45)<->v4569(VarCurr,bitIndex660))& (v4831(VarNext,bitIndex44)<->v4569(VarCurr,bitIndex659))& (v4831(VarNext,bitIndex43)<->v4569(VarCurr,bitIndex658))& (v4831(VarNext,bitIndex42)<->v4569(VarCurr,bitIndex657))& (v4831(VarNext,bitIndex41)<->v4569(VarCurr,bitIndex656))& (v4831(VarNext,bitIndex40)<->v4569(VarCurr,bitIndex655))& (v4831(VarNext,bitIndex39)<->v4569(VarCurr,bitIndex654))& (v4831(VarNext,bitIndex38)<->v4569(VarCurr,bitIndex653))& (v4831(VarNext,bitIndex37)<->v4569(VarCurr,bitIndex652))& (v4831(VarNext,bitIndex36)<->v4569(VarCurr,bitIndex651))& (v4831(VarNext,bitIndex35)<->v4569(VarCurr,bitIndex650))& (v4831(VarNext,bitIndex34)<->v4569(VarCurr,bitIndex649))& (v4831(VarNext,bitIndex33)<->v4569(VarCurr,bitIndex648))& (v4831(VarNext,bitIndex32)<->v4569(VarCurr,bitIndex647))& (v4831(VarNext,bitIndex31)<->v4569(VarCurr,bitIndex646))& (v4831(VarNext,bitIndex30)<->v4569(VarCurr,bitIndex645))& (v4831(VarNext,bitIndex29)<->v4569(VarCurr,bitIndex644))& (v4831(VarNext,bitIndex28)<->v4569(VarCurr,bitIndex643))& (v4831(VarNext,bitIndex27)<->v4569(VarCurr,bitIndex642))& (v4831(VarNext,bitIndex26)<->v4569(VarCurr,bitIndex641))& (v4831(VarNext,bitIndex25)<->v4569(VarCurr,bitIndex640))& (v4831(VarNext,bitIndex24)<->v4569(VarCurr,bitIndex639))& (v4831(VarNext,bitIndex23)<->v4569(VarCurr,bitIndex638))& 
(v4831(VarNext,bitIndex22)<->v4569(VarCurr,bitIndex637))& (v4831(VarNext,bitIndex21)<->v4569(VarCurr,bitIndex636))& (v4831(VarNext,bitIndex20)<->v4569(VarCurr,bitIndex635))& (v4831(VarNext,bitIndex19)<->v4569(VarCurr,bitIndex634))& (v4831(VarNext,bitIndex18)<->v4569(VarCurr,bitIndex633))& (v4831(VarNext,bitIndex17)<->v4569(VarCurr,bitIndex632))& (v4831(VarNext,bitIndex16)<->v4569(VarCurr,bitIndex631))& (v4831(VarNext,bitIndex15)<->v4569(VarCurr,bitIndex630))& (v4831(VarNext,bitIndex14)<->v4569(VarCurr,bitIndex629))& (v4831(VarNext,bitIndex13)<->v4569(VarCurr,bitIndex628))& (v4831(VarNext,bitIndex12)<->v4569(VarCurr,bitIndex627))& (v4831(VarNext,bitIndex11)<->v4569(VarCurr,bitIndex626))& (v4831(VarNext,bitIndex10)<->v4569(VarCurr,bitIndex625))& (v4831(VarNext,bitIndex9)<->v4569(VarCurr,bitIndex624))& (v4831(VarNext,bitIndex8)<->v4569(VarCurr,bitIndex623))& (v4831(VarNext,bitIndex7)<->v4569(VarCurr,bitIndex622))& (v4831(VarNext,bitIndex6)<->v4569(VarCurr,bitIndex621))& (v4831(VarNext,bitIndex5)<->v4569(VarCurr,bitIndex620))& (v4831(VarNext,bitIndex4)<->v4569(VarCurr,bitIndex619))& (v4831(VarNext,bitIndex3)<->v4569(VarCurr,bitIndex618))& (v4831(VarNext,bitIndex2)<->v4569(VarCurr,bitIndex617))& (v4831(VarNext,bitIndex1)<->v4569(VarCurr,bitIndex616))& (v4831(VarNext,bitIndex0)<->v4569(VarCurr,bitIndex615)))).
% 121.33/120.34  all VarNext (v4833(VarNext)-> (all B (range_122_0(B)-> (v4831(VarNext,B)<->v4771(VarNext,B))))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4833(VarNext)<->v4834(VarNext)&v4753(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4834(VarNext)<->v4836(VarNext)&v4515(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4836(VarNext)<->v4522(VarNext))).
% 121.33/120.34  all VarCurr (v4737(VarCurr,bitIndex81)<->v4742(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4741(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex573)).
% 121.33/120.34  all VarCurr (v4730(VarCurr,bitIndex81)<->v4735(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4734(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex696)).
% 121.33/120.34  all VarNext (v4569(VarNext,bitIndex573)<->v4823(VarNext,bitIndex81)).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4825(VarNext)-> (v4823(VarNext,bitIndex122)<->v4569(VarCurr,bitIndex614))& (v4823(VarNext,bitIndex121)<->v4569(VarCurr,bitIndex613))& (v4823(VarNext,bitIndex120)<->v4569(VarCurr,bitIndex612))& (v4823(VarNext,bitIndex119)<->v4569(VarCurr,bitIndex611))& (v4823(VarNext,bitIndex118)<->v4569(VarCurr,bitIndex610))& (v4823(VarNext,bitIndex117)<->v4569(VarCurr,bitIndex609))& (v4823(VarNext,bitIndex116)<->v4569(VarCurr,bitIndex608))& (v4823(VarNext,bitIndex115)<->v4569(VarCurr,bitIndex607))& (v4823(VarNext,bitIndex114)<->v4569(VarCurr,bitIndex606))& (v4823(VarNext,bitIndex113)<->v4569(VarCurr,bitIndex605))& (v4823(VarNext,bitIndex112)<->v4569(VarCurr,bitIndex604))& (v4823(VarNext,bitIndex111)<->v4569(VarCurr,bitIndex603))& (v4823(VarNext,bitIndex110)<->v4569(VarCurr,bitIndex602))& (v4823(VarNext,bitIndex109)<->v4569(VarCurr,bitIndex601))& (v4823(VarNext,bitIndex108)<->v4569(VarCurr,bitIndex600))& (v4823(VarNext,bitIndex107)<->v4569(VarCurr,bitIndex599))& (v4823(VarNext,bitIndex106)<->v4569(VarCurr,bitIndex598))& (v4823(VarNext,bitIndex105)<->v4569(VarCurr,bitIndex597))& (v4823(VarNext,bitIndex104)<->v4569(VarCurr,bitIndex596))& (v4823(VarNext,bitIndex103)<->v4569(VarCurr,bitIndex595))& (v4823(VarNext,bitIndex102)<->v4569(VarCurr,bitIndex594))& (v4823(VarNext,bitIndex101)<->v4569(VarCurr,bitIndex593))& (v4823(VarNext,bitIndex100)<->v4569(VarCurr,bitIndex592))& (v4823(VarNext,bitIndex99)<->v4569(VarCurr,bitIndex591))& (v4823(VarNext,bitIndex98)<->v4569(VarCurr,bitIndex590))& (v4823(VarNext,bitIndex97)<->v4569(VarCurr,bitIndex589))& (v4823(VarNext,bitIndex96)<->v4569(VarCurr,bitIndex588))& (v4823(VarNext,bitIndex95)<->v4569(VarCurr,bitIndex587))& (v4823(VarNext,bitIndex94)<->v4569(VarCurr,bitIndex586))& (v4823(VarNext,bitIndex93)<->v4569(VarCurr,bitIndex585))& (v4823(VarNext,bitIndex92)<->v4569(VarCurr,bitIndex584))& (v4823(VarNext,bitIndex91)<->v4569(VarCurr,bitIndex583))& 
(v4823(VarNext,bitIndex90)<->v4569(VarCurr,bitIndex582))& (v4823(VarNext,bitIndex89)<->v4569(VarCurr,bitIndex581))& (v4823(VarNext,bitIndex88)<->v4569(VarCurr,bitIndex580))& (v4823(VarNext,bitIndex87)<->v4569(VarCurr,bitIndex579))& (v4823(VarNext,bitIndex86)<->v4569(VarCurr,bitIndex578))& (v4823(VarNext,bitIndex85)<->v4569(VarCurr,bitIndex577))& (v4823(VarNext,bitIndex84)<->v4569(VarCurr,bitIndex576))& (v4823(VarNext,bitIndex83)<->v4569(VarCurr,bitIndex575))& (v4823(VarNext,bitIndex82)<->v4569(VarCurr,bitIndex574))& (v4823(VarNext,bitIndex81)<->v4569(VarCurr,bitIndex573))& (v4823(VarNext,bitIndex80)<->v4569(VarCurr,bitIndex572))& (v4823(VarNext,bitIndex79)<->v4569(VarCurr,bitIndex571))& (v4823(VarNext,bitIndex78)<->v4569(VarCurr,bitIndex570))& (v4823(VarNext,bitIndex77)<->v4569(VarCurr,bitIndex569))& (v4823(VarNext,bitIndex76)<->v4569(VarCurr,bitIndex568))& (v4823(VarNext,bitIndex75)<->v4569(VarCurr,bitIndex567))& (v4823(VarNext,bitIndex74)<->v4569(VarCurr,bitIndex566))& (v4823(VarNext,bitIndex73)<->v4569(VarCurr,bitIndex565))& (v4823(VarNext,bitIndex72)<->v4569(VarCurr,bitIndex564))& (v4823(VarNext,bitIndex71)<->v4569(VarCurr,bitIndex563))& (v4823(VarNext,bitIndex70)<->v4569(VarCurr,bitIndex562))& (v4823(VarNext,bitIndex69)<->v4569(VarCurr,bitIndex561))& (v4823(VarNext,bitIndex68)<->v4569(VarCurr,bitIndex560))& (v4823(VarNext,bitIndex67)<->v4569(VarCurr,bitIndex559))& (v4823(VarNext,bitIndex66)<->v4569(VarCurr,bitIndex558))& (v4823(VarNext,bitIndex65)<->v4569(VarCurr,bitIndex557))& (v4823(VarNext,bitIndex64)<->v4569(VarCurr,bitIndex556))& (v4823(VarNext,bitIndex63)<->v4569(VarCurr,bitIndex555))& (v4823(VarNext,bitIndex62)<->v4569(VarCurr,bitIndex554))& (v4823(VarNext,bitIndex61)<->v4569(VarCurr,bitIndex553))& (v4823(VarNext,bitIndex60)<->v4569(VarCurr,bitIndex552))& (v4823(VarNext,bitIndex59)<->v4569(VarCurr,bitIndex551))& (v4823(VarNext,bitIndex58)<->v4569(VarCurr,bitIndex550))& (v4823(VarNext,bitIndex57)<->v4569(VarCurr,bitIndex549))& 
(v4823(VarNext,bitIndex56)<->v4569(VarCurr,bitIndex548))& (v4823(VarNext,bitIndex55)<->v4569(VarCurr,bitIndex547))& (v4823(VarNext,bitIndex54)<->v4569(VarCurr,bitIndex546))& (v4823(VarNext,bitIndex53)<->v4569(VarCurr,bitIndex545))& (v4823(VarNext,bitIndex52)<->v4569(VarCurr,bitIndex544))& (v4823(VarNext,bitIndex51)<->v4569(VarCurr,bitIndex543))& (v4823(VarNext,bitIndex50)<->v4569(VarCurr,bitIndex542))& (v4823(VarNext,bitIndex49)<->v4569(VarCurr,bitIndex541))& (v4823(VarNext,bitIndex48)<->v4569(VarCurr,bitIndex540))& (v4823(VarNext,bitIndex47)<->v4569(VarCurr,bitIndex539))& (v4823(VarNext,bitIndex46)<->v4569(VarCurr,bitIndex538))& (v4823(VarNext,bitIndex45)<->v4569(VarCurr,bitIndex537))& (v4823(VarNext,bitIndex44)<->v4569(VarCurr,bitIndex536))& (v4823(VarNext,bitIndex43)<->v4569(VarCurr,bitIndex535))& (v4823(VarNext,bitIndex42)<->v4569(VarCurr,bitIndex534))& (v4823(VarNext,bitIndex41)<->v4569(VarCurr,bitIndex533))& (v4823(VarNext,bitIndex40)<->v4569(VarCurr,bitIndex532))& (v4823(VarNext,bitIndex39)<->v4569(VarCurr,bitIndex531))& (v4823(VarNext,bitIndex38)<->v4569(VarCurr,bitIndex530))& (v4823(VarNext,bitIndex37)<->v4569(VarCurr,bitIndex529))& (v4823(VarNext,bitIndex36)<->v4569(VarCurr,bitIndex528))& (v4823(VarNext,bitIndex35)<->v4569(VarCurr,bitIndex527))& (v4823(VarNext,bitIndex34)<->v4569(VarCurr,bitIndex526))& (v4823(VarNext,bitIndex33)<->v4569(VarCurr,bitIndex525))& (v4823(VarNext,bitIndex32)<->v4569(VarCurr,bitIndex524))& (v4823(VarNext,bitIndex31)<->v4569(VarCurr,bitIndex523))& (v4823(VarNext,bitIndex30)<->v4569(VarCurr,bitIndex522))& (v4823(VarNext,bitIndex29)<->v4569(VarCurr,bitIndex521))& (v4823(VarNext,bitIndex28)<->v4569(VarCurr,bitIndex520))& (v4823(VarNext,bitIndex27)<->v4569(VarCurr,bitIndex519))& (v4823(VarNext,bitIndex26)<->v4569(VarCurr,bitIndex518))& (v4823(VarNext,bitIndex25)<->v4569(VarCurr,bitIndex517))& (v4823(VarNext,bitIndex24)<->v4569(VarCurr,bitIndex516))& (v4823(VarNext,bitIndex23)<->v4569(VarCurr,bitIndex515))& 
(v4823(VarNext,bitIndex22)<->v4569(VarCurr,bitIndex514))& (v4823(VarNext,bitIndex21)<->v4569(VarCurr,bitIndex513))& (v4823(VarNext,bitIndex20)<->v4569(VarCurr,bitIndex512))& (v4823(VarNext,bitIndex19)<->v4569(VarCurr,bitIndex511))& (v4823(VarNext,bitIndex18)<->v4569(VarCurr,bitIndex510))& (v4823(VarNext,bitIndex17)<->v4569(VarCurr,bitIndex509))& (v4823(VarNext,bitIndex16)<->v4569(VarCurr,bitIndex508))& (v4823(VarNext,bitIndex15)<->v4569(VarCurr,bitIndex507))& (v4823(VarNext,bitIndex14)<->v4569(VarCurr,bitIndex506))& (v4823(VarNext,bitIndex13)<->v4569(VarCurr,bitIndex505))& (v4823(VarNext,bitIndex12)<->v4569(VarCurr,bitIndex504))& (v4823(VarNext,bitIndex11)<->v4569(VarCurr,bitIndex503))& (v4823(VarNext,bitIndex10)<->v4569(VarCurr,bitIndex502))& (v4823(VarNext,bitIndex9)<->v4569(VarCurr,bitIndex501))& (v4823(VarNext,bitIndex8)<->v4569(VarCurr,bitIndex500))& (v4823(VarNext,bitIndex7)<->v4569(VarCurr,bitIndex499))& (v4823(VarNext,bitIndex6)<->v4569(VarCurr,bitIndex498))& (v4823(VarNext,bitIndex5)<->v4569(VarCurr,bitIndex497))& (v4823(VarNext,bitIndex4)<->v4569(VarCurr,bitIndex496))& (v4823(VarNext,bitIndex3)<->v4569(VarCurr,bitIndex495))& (v4823(VarNext,bitIndex2)<->v4569(VarCurr,bitIndex494))& (v4823(VarNext,bitIndex1)<->v4569(VarCurr,bitIndex493))& (v4823(VarNext,bitIndex0)<->v4569(VarCurr,bitIndex492)))).
% 121.33/120.34  all VarNext (v4825(VarNext)-> (all B (range_122_0(B)-> (v4823(VarNext,B)<->v4725(VarNext,B))))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4825(VarNext)<->v4826(VarNext)&v4706(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4826(VarNext)<->v4828(VarNext)&v4515(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4828(VarNext)<->v4522(VarNext))).
% 121.33/120.34  all VarCurr (v4690(VarCurr,bitIndex81)<->v4695(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4694(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex450)).
% 121.33/120.34  all VarCurr (v4572(VarCurr,bitIndex81)<->v4688(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4681(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex573)).
% 121.33/120.34  all VarCurr (v4624(VarCurr,bitIndex81)<->v4626(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4626(VarCurr,bitIndex81)<->v4628(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4628(VarCurr,bitIndex81)<->v4630(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4630(VarCurr,bitIndex81)<->v4632(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4632(VarCurr,bitIndex81)<->v4634(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4634(VarCurr,bitIndex81)<->v4636(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4636(VarCurr,bitIndex81)<->v4638(VarCurr,bitIndex81)).
% 121.33/120.34  all VarNext (v4638(VarNext,bitIndex81)<->v4815(VarNext,bitIndex81)).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4816(VarNext)-> (all B (range_122_0(B)-> (v4815(VarNext,B)<->v4638(VarCurr,B)))))).
% 121.33/120.34  all VarNext (v4816(VarNext)-> (all B (range_122_0(B)-> (v4815(VarNext,B)<->v4676(VarNext,B))))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4816(VarNext)<->v4817(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4817(VarNext)<->v4819(VarNext)&v4663(VarNext))).
% 121.33/120.34  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4819(VarNext)<->v4670(VarNext))).
% 121.33/120.34  all VarCurr (v4642(VarCurr,bitIndex81)<->v4652(VarCurr,bitIndex81)).
% 121.33/120.34  all VarCurr (v4657(VarCurr,bitIndex4)<->v4812(VarCurr,bitIndex4)).
% 121.33/120.34  all VarCurr (-v4813(VarCurr)-> (all B (range_7_0(B)-> (v4812(VarCurr,B)<->v4811(VarCurr,B))))).
% 121.33/120.34  all VarCurr (v4813(VarCurr)-> (v4812(VarCurr,bitIndex7)<->v103(VarCurr,bitIndex13))& (v4812(VarCurr,bitIndex6)<->v103(VarCurr,bitIndex12))& (v4812(VarCurr,bitIndex5)<->v103(VarCurr,bitIndex11))& (v4812(VarCurr,bitIndex4)<->v103(VarCurr,bitIndex10))& (v4812(VarCurr,bitIndex3)<->v103(VarCurr,bitIndex9))& (v4812(VarCurr,bitIndex2)<->v103(VarCurr,bitIndex8))& (v4812(VarCurr,bitIndex1)<->v103(VarCurr,bitIndex7))& (v4812(VarCurr,bitIndex0)<->v103(VarCurr,bitIndex6))).
% 121.33/120.34  all VarCurr (v4813(VarCurr)<->v4780(VarCurr)|v490(VarCurr)).
% 121.33/120.34  all VarCurr (v4811(VarCurr,bitIndex4)<->v105(VarCurr,bitIndex74)).
% 121.33/120.34  all VarCurr (v105(VarCurr,bitIndex74)<->v107(VarCurr,bitIndex74)).
% 121.33/120.34  all VarCurr (v107(VarCurr,bitIndex74)<->v109(VarCurr,bitIndex74)).
% 121.33/120.34  all VarCurr (v109(VarCurr,bitIndex74)<->v111(VarCurr,bitIndex654)).
% 121.33/120.34  all VarCurr (v103(VarCurr,bitIndex10)<->v105(VarCurr,bitIndex10)).
% 121.33/120.34  all VarCurr (v105(VarCurr,bitIndex10)<->v107(VarCurr,bitIndex10)).
% 121.33/120.34  all VarCurr (v107(VarCurr,bitIndex10)<->v109(VarCurr,bitIndex10)).
% 121.33/120.34  all VarCurr (v109(VarCurr,bitIndex10)<->v111(VarCurr,bitIndex590)).
% 121.33/120.34  all VarCurr (-v4783(VarCurr)-> (v4780(VarCurr)<->$F)).
% 121.33/120.34  all VarCurr (v4783(VarCurr)-> (v4780(VarCurr)<->v4809(VarCurr))).
% 121.33/120.34  all VarCurr (-v4784(VarCurr)-> (v4809(VarCurr)<->$F)).
% 121.33/120.35  all VarCurr (v4784(VarCurr)-> (v4809(VarCurr)<->$T)).
% 121.33/120.35  all VarCurr (v4783(VarCurr)<->v4784(VarCurr)|v4786(VarCurr)).
% 121.33/120.35  all VarCurr (v4786(VarCurr)<->v4787(VarCurr)|v4804(VarCurr)).
% 121.33/120.35  all VarCurr (v4804(VarCurr)<->v4806(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr (v4806(VarCurr)<->v4807(VarCurr)&v367(VarCurr)).
% 121.33/120.35  all VarCurr (v4807(VarCurr)<->v4808(VarCurr)&v366(VarCurr)).
% 121.33/120.35  all VarCurr (v4808(VarCurr)<->v4799(VarCurr)&v170(VarCurr,bitIndex3)).
% 121.33/120.35  all VarCurr (v4787(VarCurr)<->v4788(VarCurr)|v4801(VarCurr)).
% 121.33/120.35  all VarCurr (v4801(VarCurr)<->v4803(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr (v4803(VarCurr)<->v4797(VarCurr)&v170(VarCurr,bitIndex5)).
% 121.33/120.35  all VarCurr (v4788(VarCurr)<->v4789(VarCurr)|v4794(VarCurr)).
% 121.33/120.35  all VarCurr (v4794(VarCurr)<->v4796(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr (v4796(VarCurr)<->v4797(VarCurr)&v367(VarCurr)).
% 121.33/120.35  all VarCurr (v4797(VarCurr)<->v4798(VarCurr)&v366(VarCurr)).
% 121.33/120.35  all VarCurr (v4798(VarCurr)<->v4799(VarCurr)&v365(VarCurr)).
% 121.33/120.35  all VarCurr (v4799(VarCurr)<->v4800(VarCurr)&v364(VarCurr)).
% 121.33/120.35  all VarCurr (v4800(VarCurr)<->v170(VarCurr,bitIndex0)&v363(VarCurr)).
% 121.33/120.35  all VarCurr (v4789(VarCurr)<->v4790(VarCurr)|v4792(VarCurr)).
% 121.33/120.35  all VarCurr (v4792(VarCurr)<->v372(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr (v4790(VarCurr)<->v357(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr (v4784(VarCurr)<->v495(VarCurr)&v417(VarCurr)).
% 121.33/120.35  all VarCurr ((v4564(VarCurr,bitIndex6)<->v2244(VarCurr,bitIndex122))& (v4564(VarCurr,bitIndex5)<->v2244(VarCurr,bitIndex121))& (v4564(VarCurr,bitIndex4)<->v2244(VarCurr,bitIndex120))& (v4564(VarCurr,bitIndex3)<->v2244(VarCurr,bitIndex119))& (v4564(VarCurr,bitIndex2)<->v2244(VarCurr,bitIndex118))& (v4564(VarCurr,bitIndex1)<->v2244(VarCurr,bitIndex117))& (v4564(VarCurr,bitIndex0)<->v2244(VarCurr,bitIndex116))).
% 121.33/120.35  all VarCurr B (range_122_116(B)-> (v2244(VarCurr,B)<->v4567(VarCurr,B))).
% 121.33/120.35  all VarCurr ((v4567(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex737))& (v4567(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex736))& (v4567(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex735))& (v4567(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex734))& (v4567(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex733))& (v4567(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex732))& (v4567(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex731))).
% 121.33/120.35  all VarNext ((v4569(VarNext,bitIndex737)<->v4744(VarNext,bitIndex122))& (v4569(VarNext,bitIndex736)<->v4744(VarNext,bitIndex121))& (v4569(VarNext,bitIndex735)<->v4744(VarNext,bitIndex120))& (v4569(VarNext,bitIndex734)<->v4744(VarNext,bitIndex119))& (v4569(VarNext,bitIndex733)<->v4744(VarNext,bitIndex118))& (v4569(VarNext,bitIndex732)<->v4744(VarNext,bitIndex117))& (v4569(VarNext,bitIndex731)<->v4744(VarNext,bitIndex116))).
% 121.33/120.35  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4745(VarNext)-> (v4744(VarNext,bitIndex122)<->v4569(VarCurr,bitIndex737))& (v4744(VarNext,bitIndex121)<->v4569(VarCurr,bitIndex736))& (v4744(VarNext,bitIndex120)<->v4569(VarCurr,bitIndex735))& (v4744(VarNext,bitIndex119)<->v4569(VarCurr,bitIndex734))& (v4744(VarNext,bitIndex118)<->v4569(VarCurr,bitIndex733))& (v4744(VarNext,bitIndex117)<->v4569(VarCurr,bitIndex732))& (v4744(VarNext,bitIndex116)<->v4569(VarCurr,bitIndex731))& (v4744(VarNext,bitIndex115)<->v4569(VarCurr,bitIndex730))& (v4744(VarNext,bitIndex114)<->v4569(VarCurr,bitIndex729))& (v4744(VarNext,bitIndex113)<->v4569(VarCurr,bitIndex728))& (v4744(VarNext,bitIndex112)<->v4569(VarCurr,bitIndex727))& (v4744(VarNext,bitIndex111)<->v4569(VarCurr,bitIndex726))& (v4744(VarNext,bitIndex110)<->v4569(VarCurr,bitIndex725))& (v4744(VarNext,bitIndex109)<->v4569(VarCurr,bitIndex724))& (v4744(VarNext,bitIndex108)<->v4569(VarCurr,bitIndex723))& (v4744(VarNext,bitIndex107)<->v4569(VarCurr,bitIndex722))& (v4744(VarNext,bitIndex106)<->v4569(VarCurr,bitIndex721))& (v4744(VarNext,bitIndex105)<->v4569(VarCurr,bitIndex720))& (v4744(VarNext,bitIndex104)<->v4569(VarCurr,bitIndex719))& (v4744(VarNext,bitIndex103)<->v4569(VarCurr,bitIndex718))& (v4744(VarNext,bitIndex102)<->v4569(VarCurr,bitIndex717))& (v4744(VarNext,bitIndex101)<->v4569(VarCurr,bitIndex716))& (v4744(VarNext,bitIndex100)<->v4569(VarCurr,bitIndex715))& (v4744(VarNext,bitIndex99)<->v4569(VarCurr,bitIndex714))& (v4744(VarNext,bitIndex98)<->v4569(VarCurr,bitIndex713))& (v4744(VarNext,bitIndex97)<->v4569(VarCurr,bitIndex712))& (v4744(VarNext,bitIndex96)<->v4569(VarCurr,bitIndex711))& (v4744(VarNext,bitIndex95)<->v4569(VarCurr,bitIndex710))& (v4744(VarNext,bitIndex94)<->v4569(VarCurr,bitIndex709))& (v4744(VarNext,bitIndex93)<->v4569(VarCurr,bitIndex708))& (v4744(VarNext,bitIndex92)<->v4569(VarCurr,bitIndex707))& (v4744(VarNext,bitIndex91)<->v4569(VarCurr,bitIndex706))& 
(v4744(VarNext,bitIndex90)<->v4569(VarCurr,bitIndex705))& (v4744(VarNext,bitIndex89)<->v4569(VarCurr,bitIndex704))& (v4744(VarNext,bitIndex88)<->v4569(VarCurr,bitIndex703))& (v4744(VarNext,bitIndex87)<->v4569(VarCurr,bitIndex702))& (v4744(VarNext,bitIndex86)<->v4569(VarCurr,bitIndex701))& (v4744(VarNext,bitIndex85)<->v4569(VarCurr,bitIndex700))& (v4744(VarNext,bitIndex84)<->v4569(VarCurr,bitIndex699))& (v4744(VarNext,bitIndex83)<->v4569(VarCurr,bitIndex698))& (v4744(VarNext,bitIndex82)<->v4569(VarCurr,bitIndex697))& (v4744(VarNext,bitIndex81)<->v4569(VarCurr,bitIndex696))& (v4744(VarNext,bitIndex80)<->v4569(VarCurr,bitIndex695))& (v4744(VarNext,bitIndex79)<->v4569(VarCurr,bitIndex694))& (v4744(VarNext,bitIndex78)<->v4569(VarCurr,bitIndex693))& (v4744(VarNext,bitIndex77)<->v4569(VarCurr,bitIndex692))& (v4744(VarNext,bitIndex76)<->v4569(VarCurr,bitIndex691))& (v4744(VarNext,bitIndex75)<->v4569(VarCurr,bitIndex690))& (v4744(VarNext,bitIndex74)<->v4569(VarCurr,bitIndex689))& (v4744(VarNext,bitIndex73)<->v4569(VarCurr,bitIndex688))& (v4744(VarNext,bitIndex72)<->v4569(VarCurr,bitIndex687))& (v4744(VarNext,bitIndex71)<->v4569(VarCurr,bitIndex686))& (v4744(VarNext,bitIndex70)<->v4569(VarCurr,bitIndex685))& (v4744(VarNext,bitIndex69)<->v4569(VarCurr,bitIndex684))& (v4744(VarNext,bitIndex68)<->v4569(VarCurr,bitIndex683))& (v4744(VarNext,bitIndex67)<->v4569(VarCurr,bitIndex682))& (v4744(VarNext,bitIndex66)<->v4569(VarCurr,bitIndex681))& (v4744(VarNext,bitIndex65)<->v4569(VarCurr,bitIndex680))& (v4744(VarNext,bitIndex64)<->v4569(VarCurr,bitIndex679))& (v4744(VarNext,bitIndex63)<->v4569(VarCurr,bitIndex678))& (v4744(VarNext,bitIndex62)<->v4569(VarCurr,bitIndex677))& (v4744(VarNext,bitIndex61)<->v4569(VarCurr,bitIndex676))& (v4744(VarNext,bitIndex60)<->v4569(VarCurr,bitIndex675))& (v4744(VarNext,bitIndex59)<->v4569(VarCurr,bitIndex674))& (v4744(VarNext,bitIndex58)<->v4569(VarCurr,bitIndex673))& (v4744(VarNext,bitIndex57)<->v4569(VarCurr,bitIndex672))& 
(v4744(VarNext,bitIndex56)<->v4569(VarCurr,bitIndex671))& (v4744(VarNext,bitIndex55)<->v4569(VarCurr,bitIndex670))& (v4744(VarNext,bitIndex54)<->v4569(VarCurr,bitIndex669))& (v4744(VarNext,bitIndex53)<->v4569(VarCurr,bitIndex668))& (v4744(VarNext,bitIndex52)<->v4569(VarCurr,bitIndex667))& (v4744(VarNext,bitIndex51)<->v4569(VarCurr,bitIndex666))& (v4744(VarNext,bitIndex50)<->v4569(VarCurr,bitIndex665))& (v4744(VarNext,bitIndex49)<->v4569(VarCurr,bitIndex664))& (v4744(VarNext,bitIndex48)<->v4569(VarCurr,bitIndex663))& (v4744(VarNext,bitIndex47)<->v4569(VarCurr,bitIndex662))& (v4744(VarNext,bitIndex46)<->v4569(VarCurr,bitIndex661))& (v4744(VarNext,bitIndex45)<->v4569(VarCurr,bitIndex660))& (v4744(VarNext,bitIndex44)<->v4569(VarCurr,bitIndex659))& (v4744(VarNext,bitIndex43)<->v4569(VarCurr,bitIndex658))& (v4744(VarNext,bitIndex42)<->v4569(VarCurr,bitIndex657))& (v4744(VarNext,bitIndex41)<->v4569(VarCurr,bitIndex656))& (v4744(VarNext,bitIndex40)<->v4569(VarCurr,bitIndex655))& (v4744(VarNext,bitIndex39)<->v4569(VarCurr,bitIndex654))& (v4744(VarNext,bitIndex38)<->v4569(VarCurr,bitIndex653))& (v4744(VarNext,bitIndex37)<->v4569(VarCurr,bitIndex652))& (v4744(VarNext,bitIndex36)<->v4569(VarCurr,bitIndex651))& (v4744(VarNext,bitIndex35)<->v4569(VarCurr,bitIndex650))& (v4744(VarNext,bitIndex34)<->v4569(VarCurr,bitIndex649))& (v4744(VarNext,bitIndex33)<->v4569(VarCurr,bitIndex648))& (v4744(VarNext,bitIndex32)<->v4569(VarCurr,bitIndex647))& (v4744(VarNext,bitIndex31)<->v4569(VarCurr,bitIndex646))& (v4744(VarNext,bitIndex30)<->v4569(VarCurr,bitIndex645))& (v4744(VarNext,bitIndex29)<->v4569(VarCurr,bitIndex644))& (v4744(VarNext,bitIndex28)<->v4569(VarCurr,bitIndex643))& (v4744(VarNext,bitIndex27)<->v4569(VarCurr,bitIndex642))& (v4744(VarNext,bitIndex26)<->v4569(VarCurr,bitIndex641))& (v4744(VarNext,bitIndex25)<->v4569(VarCurr,bitIndex640))& (v4744(VarNext,bitIndex24)<->v4569(VarCurr,bitIndex639))& (v4744(VarNext,bitIndex23)<->v4569(VarCurr,bitIndex638))& 
(v4744(VarNext,bitIndex22)<->v4569(VarCurr,bitIndex637))& (v4744(VarNext,bitIndex21)<->v4569(VarCurr,bitIndex636))& (v4744(VarNext,bitIndex20)<->v4569(VarCurr,bitIndex635))& (v4744(VarNext,bitIndex19)<->v4569(VarCurr,bitIndex634))& (v4744(VarNext,bitIndex18)<->v4569(VarCurr,bitIndex633))& (v4744(VarNext,bitIndex17)<->v4569(VarCurr,bitIndex632))& (v4744(VarNext,bitIndex16)<->v4569(VarCurr,bitIndex631))& (v4744(VarNext,bitIndex15)<->v4569(VarCurr,bitIndex630))& (v4744(VarNext,bitIndex14)<->v4569(VarCurr,bitIndex629))& (v4744(VarNext,bitIndex13)<->v4569(VarCurr,bitIndex628))& (v4744(VarNext,bitIndex12)<->v4569(VarCurr,bitIndex627))& (v4744(VarNext,bitIndex11)<->v4569(VarCurr,bitIndex626))& (v4744(VarNext,bitIndex10)<->v4569(VarCurr,bitIndex625))& (v4744(VarNext,bitIndex9)<->v4569(VarCurr,bitIndex624))& (v4744(VarNext,bitIndex8)<->v4569(VarCurr,bitIndex623))& (v4744(VarNext,bitIndex7)<->v4569(VarCurr,bitIndex622))& (v4744(VarNext,bitIndex6)<->v4569(VarCurr,bitIndex621))& (v4744(VarNext,bitIndex5)<->v4569(VarCurr,bitIndex620))& (v4744(VarNext,bitIndex4)<->v4569(VarCurr,bitIndex619))& (v4744(VarNext,bitIndex3)<->v4569(VarCurr,bitIndex618))& (v4744(VarNext,bitIndex2)<->v4569(VarCurr,bitIndex617))& (v4744(VarNext,bitIndex1)<->v4569(VarCurr,bitIndex616))& (v4744(VarNext,bitIndex0)<->v4569(VarCurr,bitIndex615)))).
% 121.33/120.35  all VarNext (v4745(VarNext)-> (all B (range_122_0(B)-> (v4744(VarNext,B)<->v4771(VarNext,B))))).
% 121.33/120.35  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_122_0(B)-> (v4771(VarNext,B)<->v4769(VarCurr,B))))).
% 121.33/120.35  all VarCurr (-v4708(VarCurr)-> (all B (range_122_0(B)-> (v4769(VarCurr,B)<->v4772(VarCurr,B))))).
% 121.33/120.35  all VarCurr (v4708(VarCurr)-> (all B (range_122_0(B)-> (v4769(VarCurr,B)<->$F)))).
% 121.33/120.35  all VarCurr (-v4758(VarCurr)& -v4760(VarCurr)-> (all B (range_122_0(B)-> (v4772(VarCurr,B)<->v4737(VarCurr,B))))).
% 121.33/120.35  all VarCurr (v4760(VarCurr)-> (all B (range_122_0(B)-> (v4772(VarCurr,B)<->v4730(VarCurr,B))))).
% 121.33/120.35  all VarCurr (v4758(VarCurr)-> (v4772(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex614))& (v4772(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex613))& (v4772(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex612))& (v4772(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex611))& (v4772(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex610))& (v4772(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex609))& (v4772(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex608))& (v4772(VarCurr,bitIndex115)<->v4569(VarCurr,bitIndex607))& (v4772(VarCurr,bitIndex114)<->v4569(VarCurr,bitIndex606))& (v4772(VarCurr,bitIndex113)<->v4569(VarCurr,bitIndex605))& (v4772(VarCurr,bitIndex112)<->v4569(VarCurr,bitIndex604))& (v4772(VarCurr,bitIndex111)<->v4569(VarCurr,bitIndex603))& (v4772(VarCurr,bitIndex110)<->v4569(VarCurr,bitIndex602))& (v4772(VarCurr,bitIndex109)<->v4569(VarCurr,bitIndex601))& (v4772(VarCurr,bitIndex108)<->v4569(VarCurr,bitIndex600))& (v4772(VarCurr,bitIndex107)<->v4569(VarCurr,bitIndex599))& (v4772(VarCurr,bitIndex106)<->v4569(VarCurr,bitIndex598))& (v4772(VarCurr,bitIndex105)<->v4569(VarCurr,bitIndex597))& (v4772(VarCurr,bitIndex104)<->v4569(VarCurr,bitIndex596))& (v4772(VarCurr,bitIndex103)<->v4569(VarCurr,bitIndex595))& (v4772(VarCurr,bitIndex102)<->v4569(VarCurr,bitIndex594))& (v4772(VarCurr,bitIndex101)<->v4569(VarCurr,bitIndex593))& (v4772(VarCurr,bitIndex100)<->v4569(VarCurr,bitIndex592))& (v4772(VarCurr,bitIndex99)<->v4569(VarCurr,bitIndex591))& (v4772(VarCurr,bitIndex98)<->v4569(VarCurr,bitIndex590))& (v4772(VarCurr,bitIndex97)<->v4569(VarCurr,bitIndex589))& (v4772(VarCurr,bitIndex96)<->v4569(VarCurr,bitIndex588))& (v4772(VarCurr,bitIndex95)<->v4569(VarCurr,bitIndex587))& (v4772(VarCurr,bitIndex94)<->v4569(VarCurr,bitIndex586))& (v4772(VarCurr,bitIndex93)<->v4569(VarCurr,bitIndex585))& (v4772(VarCurr,bitIndex92)<->v4569(VarCurr,bitIndex584))& (v4772(VarCurr,bitIndex91)<->v4569(VarCurr,bitIndex583))& (v4772(VarCurr,bitIndex90)<->v4569(VarCurr,bitIndex582))& 
(v4772(VarCurr,bitIndex89)<->v4569(VarCurr,bitIndex581))& (v4772(VarCurr,bitIndex88)<->v4569(VarCurr,bitIndex580))& (v4772(VarCurr,bitIndex87)<->v4569(VarCurr,bitIndex579))& (v4772(VarCurr,bitIndex86)<->v4569(VarCurr,bitIndex578))& (v4772(VarCurr,bitIndex85)<->v4569(VarCurr,bitIndex577))& (v4772(VarCurr,bitIndex84)<->v4569(VarCurr,bitIndex576))& (v4772(VarCurr,bitIndex83)<->v4569(VarCurr,bitIndex575))& (v4772(VarCurr,bitIndex82)<->v4569(VarCurr,bitIndex574))& (v4772(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex573))& (v4772(VarCurr,bitIndex80)<->v4569(VarCurr,bitIndex572))& (v4772(VarCurr,bitIndex79)<->v4569(VarCurr,bitIndex571))& (v4772(VarCurr,bitIndex78)<->v4569(VarCurr,bitIndex570))& (v4772(VarCurr,bitIndex77)<->v4569(VarCurr,bitIndex569))& (v4772(VarCurr,bitIndex76)<->v4569(VarCurr,bitIndex568))& (v4772(VarCurr,bitIndex75)<->v4569(VarCurr,bitIndex567))& (v4772(VarCurr,bitIndex74)<->v4569(VarCurr,bitIndex566))& (v4772(VarCurr,bitIndex73)<->v4569(VarCurr,bitIndex565))& (v4772(VarCurr,bitIndex72)<->v4569(VarCurr,bitIndex564))& (v4772(VarCurr,bitIndex71)<->v4569(VarCurr,bitIndex563))& (v4772(VarCurr,bitIndex70)<->v4569(VarCurr,bitIndex562))& (v4772(VarCurr,bitIndex69)<->v4569(VarCurr,bitIndex561))& (v4772(VarCurr,bitIndex68)<->v4569(VarCurr,bitIndex560))& (v4772(VarCurr,bitIndex67)<->v4569(VarCurr,bitIndex559))& (v4772(VarCurr,bitIndex66)<->v4569(VarCurr,bitIndex558))& (v4772(VarCurr,bitIndex65)<->v4569(VarCurr,bitIndex557))& (v4772(VarCurr,bitIndex64)<->v4569(VarCurr,bitIndex556))& (v4772(VarCurr,bitIndex63)<->v4569(VarCurr,bitIndex555))& (v4772(VarCurr,bitIndex62)<->v4569(VarCurr,bitIndex554))& (v4772(VarCurr,bitIndex61)<->v4569(VarCurr,bitIndex553))& (v4772(VarCurr,bitIndex60)<->v4569(VarCurr,bitIndex552))& (v4772(VarCurr,bitIndex59)<->v4569(VarCurr,bitIndex551))& (v4772(VarCurr,bitIndex58)<->v4569(VarCurr,bitIndex550))& (v4772(VarCurr,bitIndex57)<->v4569(VarCurr,bitIndex549))& (v4772(VarCurr,bitIndex56)<->v4569(VarCurr,bitIndex548))& 
(v4772(VarCurr,bitIndex55)<->v4569(VarCurr,bitIndex547))& (v4772(VarCurr,bitIndex54)<->v4569(VarCurr,bitIndex546))& (v4772(VarCurr,bitIndex53)<->v4569(VarCurr,bitIndex545))& (v4772(VarCurr,bitIndex52)<->v4569(VarCurr,bitIndex544))& (v4772(VarCurr,bitIndex51)<->v4569(VarCurr,bitIndex543))& (v4772(VarCurr,bitIndex50)<->v4569(VarCurr,bitIndex542))& (v4772(VarCurr,bitIndex49)<->v4569(VarCurr,bitIndex541))& (v4772(VarCurr,bitIndex48)<->v4569(VarCurr,bitIndex540))& (v4772(VarCurr,bitIndex47)<->v4569(VarCurr,bitIndex539))& (v4772(VarCurr,bitIndex46)<->v4569(VarCurr,bitIndex538))& (v4772(VarCurr,bitIndex45)<->v4569(VarCurr,bitIndex537))& (v4772(VarCurr,bitIndex44)<->v4569(VarCurr,bitIndex536))& (v4772(VarCurr,bitIndex43)<->v4569(VarCurr,bitIndex535))& (v4772(VarCurr,bitIndex42)<->v4569(VarCurr,bitIndex534))& (v4772(VarCurr,bitIndex41)<->v4569(VarCurr,bitIndex533))& (v4772(VarCurr,bitIndex40)<->v4569(VarCurr,bitIndex532))& (v4772(VarCurr,bitIndex39)<->v4569(VarCurr,bitIndex531))& (v4772(VarCurr,bitIndex38)<->v4569(VarCurr,bitIndex530))& (v4772(VarCurr,bitIndex37)<->v4569(VarCurr,bitIndex529))& (v4772(VarCurr,bitIndex36)<->v4569(VarCurr,bitIndex528))& (v4772(VarCurr,bitIndex35)<->v4569(VarCurr,bitIndex527))& (v4772(VarCurr,bitIndex34)<->v4569(VarCurr,bitIndex526))& (v4772(VarCurr,bitIndex33)<->v4569(VarCurr,bitIndex525))& (v4772(VarCurr,bitIndex32)<->v4569(VarCurr,bitIndex524))& (v4772(VarCurr,bitIndex31)<->v4569(VarCurr,bitIndex523))& (v4772(VarCurr,bitIndex30)<->v4569(VarCurr,bitIndex522))& (v4772(VarCurr,bitIndex29)<->v4569(VarCurr,bitIndex521))& (v4772(VarCurr,bitIndex28)<->v4569(VarCurr,bitIndex520))& (v4772(VarCurr,bitIndex27)<->v4569(VarCurr,bitIndex519))& (v4772(VarCurr,bitIndex26)<->v4569(VarCurr,bitIndex518))& (v4772(VarCurr,bitIndex25)<->v4569(VarCurr,bitIndex517))& (v4772(VarCurr,bitIndex24)<->v4569(VarCurr,bitIndex516))& (v4772(VarCurr,bitIndex23)<->v4569(VarCurr,bitIndex515))& (v4772(VarCurr,bitIndex22)<->v4569(VarCurr,bitIndex514))& 
(v4772(VarCurr,bitIndex21)<->v4569(VarCurr,bitIndex513))& (v4772(VarCurr,bitIndex20)<->v4569(VarCurr,bitIndex512))& (v4772(VarCurr,bitIndex19)<->v4569(VarCurr,bitIndex511))& (v4772(VarCurr,bitIndex18)<->v4569(VarCurr,bitIndex510))& (v4772(VarCurr,bitIndex17)<->v4569(VarCurr,bitIndex509))& (v4772(VarCurr,bitIndex16)<->v4569(VarCurr,bitIndex508))& (v4772(VarCurr,bitIndex15)<->v4569(VarCurr,bitIndex507))& (v4772(VarCurr,bitIndex14)<->v4569(VarCurr,bitIndex506))& (v4772(VarCurr,bitIndex13)<->v4569(VarCurr,bitIndex505))& (v4772(VarCurr,bitIndex12)<->v4569(VarCurr,bitIndex504))& (v4772(VarCurr,bitIndex11)<->v4569(VarCurr,bitIndex503))& (v4772(VarCurr,bitIndex10)<->v4569(VarCurr,bitIndex502))& (v4772(VarCurr,bitIndex9)<->v4569(VarCurr,bitIndex501))& (v4772(VarCurr,bitIndex8)<->v4569(VarCurr,bitIndex500))& (v4772(VarCurr,bitIndex7)<->v4569(VarCurr,bitIndex499))& (v4772(VarCurr,bitIndex6)<->v4569(VarCurr,bitIndex498))& (v4772(VarCurr,bitIndex5)<->v4569(VarCurr,bitIndex497))& (v4772(VarCurr,bitIndex4)<->v4569(VarCurr,bitIndex496))& (v4772(VarCurr,bitIndex3)<->v4569(VarCurr,bitIndex495))& (v4772(VarCurr,bitIndex2)<->v4569(VarCurr,bitIndex494))& (v4772(VarCurr,bitIndex1)<->v4569(VarCurr,bitIndex493))& (v4772(VarCurr,bitIndex0)<->v4569(VarCurr,bitIndex492))).
% 121.33/120.36  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4745(VarNext)<->v4746(VarNext)&v4753(VarNext))).
% 121.33/120.36  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4753(VarNext)<->v4751(VarCurr))).
% 121.33/120.36  all VarCurr (v4751(VarCurr)<->v4754(VarCurr)&v4765(VarCurr)).
% 121.33/120.36  all VarCurr (v4765(VarCurr)<->v4766(VarCurr)|v4708(VarCurr)).
% 121.33/120.36  all VarCurr (-v4766(VarCurr)<->v4767(VarCurr)).
% 121.33/120.36  all VarCurr (v4767(VarCurr)<-> (v4768(VarCurr,bitIndex1)<->$F)& (v4768(VarCurr,bitIndex0)<->$F)).
% 121.33/120.36  all VarCurr (v4768(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.36  all VarCurr (v4768(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.36  all VarCurr (v4754(VarCurr)<->v4708(VarCurr)|v4755(VarCurr)).
% 121.33/120.36  all VarCurr (v4755(VarCurr)<->v4756(VarCurr)&v4764(VarCurr)).
% 121.33/120.36  all VarCurr (-v4764(VarCurr)<->v4708(VarCurr)).
% 121.33/120.36  all VarCurr (v4756(VarCurr)<->v4757(VarCurr)|v4762(VarCurr)).
% 121.33/120.36  all VarCurr (v4762(VarCurr)<-> (v4763(VarCurr,bitIndex1)<->$T)& (v4763(VarCurr,bitIndex0)<->$T)).
% 121.33/120.36  all VarCurr (v4763(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.36  all VarCurr (v4763(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.36  all VarCurr (v4757(VarCurr)<->v4758(VarCurr)|v4760(VarCurr)).
% 121.33/120.36  all VarCurr (v4760(VarCurr)<-> (v4761(VarCurr,bitIndex1)<->$T)& (v4761(VarCurr,bitIndex0)<->$F)).
% 121.33/120.36  all VarCurr (v4761(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.36  all VarCurr (v4761(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.36  all VarCurr (v4758(VarCurr)<-> (v4759(VarCurr,bitIndex1)<->$F)& (v4759(VarCurr,bitIndex0)<->$T)).
% 121.33/120.36  all VarCurr (v4759(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.36  all VarCurr (v4759(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.36  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4746(VarNext)<->v4748(VarNext)&v4515(VarNext))).
% 121.33/120.36  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4748(VarNext)<->v4522(VarNext))).
% 121.33/120.36  all VarCurr B (range_122_116(B)-> (v4737(VarCurr,B)<->v4742(VarCurr,B))).
% 121.33/120.36  all VarCurr (-v4739(VarCurr)-> (all B (range_122_0(B)-> (v4742(VarCurr,B)<->v4741(VarCurr,B))))).
% 121.33/120.36  all VarCurr (v4739(VarCurr)-> (all B (range_122_0(B)-> (v4742(VarCurr,B)<->v4624(VarCurr,B))))).
% 121.33/120.36  all VarCurr ((v4741(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex614))& (v4741(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex613))& (v4741(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex612))& (v4741(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex611))& (v4741(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex610))& (v4741(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex609))& (v4741(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex608))).
% 121.33/120.36  all VarCurr (v4739(VarCurr)<->v4576(VarCurr,bitIndex1)).
% 121.33/120.36  all VarCurr B (range_122_116(B)-> (v4730(VarCurr,B)<->v4735(VarCurr,B))).
% 121.33/120.36  all VarCurr (-v4732(VarCurr)-> (all B (range_122_0(B)-> (v4735(VarCurr,B)<->v4734(VarCurr,B))))).
% 121.33/120.36  all VarCurr (v4732(VarCurr)-> (all B (range_122_0(B)-> (v4735(VarCurr,B)<->v4624(VarCurr,B))))).
% 121.33/120.36  all VarCurr ((v4734(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex737))& (v4734(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex736))& (v4734(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex735))& (v4734(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex734))& (v4734(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex733))& (v4734(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex732))& (v4734(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex731))).
% 121.33/120.36  all VarCurr (v4732(VarCurr)<->v4576(VarCurr,bitIndex1)).
% 121.33/120.36  all VarNext ((v4569(VarNext,bitIndex614)<->v4697(VarNext,bitIndex122))& (v4569(VarNext,bitIndex613)<->v4697(VarNext,bitIndex121))& (v4569(VarNext,bitIndex612)<->v4697(VarNext,bitIndex120))& (v4569(VarNext,bitIndex611)<->v4697(VarNext,bitIndex119))& (v4569(VarNext,bitIndex610)<->v4697(VarNext,bitIndex118))& (v4569(VarNext,bitIndex609)<->v4697(VarNext,bitIndex117))& (v4569(VarNext,bitIndex608)<->v4697(VarNext,bitIndex116))).
% 121.33/120.36  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4698(VarNext)-> (v4697(VarNext,bitIndex122)<->v4569(VarCurr,bitIndex614))& (v4697(VarNext,bitIndex121)<->v4569(VarCurr,bitIndex613))& (v4697(VarNext,bitIndex120)<->v4569(VarCurr,bitIndex612))& (v4697(VarNext,bitIndex119)<->v4569(VarCurr,bitIndex611))& (v4697(VarNext,bitIndex118)<->v4569(VarCurr,bitIndex610))& (v4697(VarNext,bitIndex117)<->v4569(VarCurr,bitIndex609))& (v4697(VarNext,bitIndex116)<->v4569(VarCurr,bitIndex608))& (v4697(VarNext,bitIndex115)<->v4569(VarCurr,bitIndex607))& (v4697(VarNext,bitIndex114)<->v4569(VarCurr,bitIndex606))& (v4697(VarNext,bitIndex113)<->v4569(VarCurr,bitIndex605))& (v4697(VarNext,bitIndex112)<->v4569(VarCurr,bitIndex604))& (v4697(VarNext,bitIndex111)<->v4569(VarCurr,bitIndex603))& (v4697(VarNext,bitIndex110)<->v4569(VarCurr,bitIndex602))& (v4697(VarNext,bitIndex109)<->v4569(VarCurr,bitIndex601))& (v4697(VarNext,bitIndex108)<->v4569(VarCurr,bitIndex600))& (v4697(VarNext,bitIndex107)<->v4569(VarCurr,bitIndex599))& (v4697(VarNext,bitIndex106)<->v4569(VarCurr,bitIndex598))& (v4697(VarNext,bitIndex105)<->v4569(VarCurr,bitIndex597))& (v4697(VarNext,bitIndex104)<->v4569(VarCurr,bitIndex596))& (v4697(VarNext,bitIndex103)<->v4569(VarCurr,bitIndex595))& (v4697(VarNext,bitIndex102)<->v4569(VarCurr,bitIndex594))& (v4697(VarNext,bitIndex101)<->v4569(VarCurr,bitIndex593))& (v4697(VarNext,bitIndex100)<->v4569(VarCurr,bitIndex592))& (v4697(VarNext,bitIndex99)<->v4569(VarCurr,bitIndex591))& (v4697(VarNext,bitIndex98)<->v4569(VarCurr,bitIndex590))& (v4697(VarNext,bitIndex97)<->v4569(VarCurr,bitIndex589))& (v4697(VarNext,bitIndex96)<->v4569(VarCurr,bitIndex588))& (v4697(VarNext,bitIndex95)<->v4569(VarCurr,bitIndex587))& (v4697(VarNext,bitIndex94)<->v4569(VarCurr,bitIndex586))& (v4697(VarNext,bitIndex93)<->v4569(VarCurr,bitIndex585))& (v4697(VarNext,bitIndex92)<->v4569(VarCurr,bitIndex584))& (v4697(VarNext,bitIndex91)<->v4569(VarCurr,bitIndex583))& 
(v4697(VarNext,bitIndex90)<->v4569(VarCurr,bitIndex582))& (v4697(VarNext,bitIndex89)<->v4569(VarCurr,bitIndex581))& (v4697(VarNext,bitIndex88)<->v4569(VarCurr,bitIndex580))& (v4697(VarNext,bitIndex87)<->v4569(VarCurr,bitIndex579))& (v4697(VarNext,bitIndex86)<->v4569(VarCurr,bitIndex578))& (v4697(VarNext,bitIndex85)<->v4569(VarCurr,bitIndex577))& (v4697(VarNext,bitIndex84)<->v4569(VarCurr,bitIndex576))& (v4697(VarNext,bitIndex83)<->v4569(VarCurr,bitIndex575))& (v4697(VarNext,bitIndex82)<->v4569(VarCurr,bitIndex574))& (v4697(VarNext,bitIndex81)<->v4569(VarCurr,bitIndex573))& (v4697(VarNext,bitIndex80)<->v4569(VarCurr,bitIndex572))& (v4697(VarNext,bitIndex79)<->v4569(VarCurr,bitIndex571))& (v4697(VarNext,bitIndex78)<->v4569(VarCurr,bitIndex570))& (v4697(VarNext,bitIndex77)<->v4569(VarCurr,bitIndex569))& (v4697(VarNext,bitIndex76)<->v4569(VarCurr,bitIndex568))& (v4697(VarNext,bitIndex75)<->v4569(VarCurr,bitIndex567))& (v4697(VarNext,bitIndex74)<->v4569(VarCurr,bitIndex566))& (v4697(VarNext,bitIndex73)<->v4569(VarCurr,bitIndex565))& (v4697(VarNext,bitIndex72)<->v4569(VarCurr,bitIndex564))& (v4697(VarNext,bitIndex71)<->v4569(VarCurr,bitIndex563))& (v4697(VarNext,bitIndex70)<->v4569(VarCurr,bitIndex562))& (v4697(VarNext,bitIndex69)<->v4569(VarCurr,bitIndex561))& (v4697(VarNext,bitIndex68)<->v4569(VarCurr,bitIndex560))& (v4697(VarNext,bitIndex67)<->v4569(VarCurr,bitIndex559))& (v4697(VarNext,bitIndex66)<->v4569(VarCurr,bitIndex558))& (v4697(VarNext,bitIndex65)<->v4569(VarCurr,bitIndex557))& (v4697(VarNext,bitIndex64)<->v4569(VarCurr,bitIndex556))& (v4697(VarNext,bitIndex63)<->v4569(VarCurr,bitIndex555))& (v4697(VarNext,bitIndex62)<->v4569(VarCurr,bitIndex554))& (v4697(VarNext,bitIndex61)<->v4569(VarCurr,bitIndex553))& (v4697(VarNext,bitIndex60)<->v4569(VarCurr,bitIndex552))& (v4697(VarNext,bitIndex59)<->v4569(VarCurr,bitIndex551))& (v4697(VarNext,bitIndex58)<->v4569(VarCurr,bitIndex550))& (v4697(VarNext,bitIndex57)<->v4569(VarCurr,bitIndex549))& 
(v4697(VarNext,bitIndex56)<->v4569(VarCurr,bitIndex548))& (v4697(VarNext,bitIndex55)<->v4569(VarCurr,bitIndex547))& (v4697(VarNext,bitIndex54)<->v4569(VarCurr,bitIndex546))& (v4697(VarNext,bitIndex53)<->v4569(VarCurr,bitIndex545))& (v4697(VarNext,bitIndex52)<->v4569(VarCurr,bitIndex544))& (v4697(VarNext,bitIndex51)<->v4569(VarCurr,bitIndex543))& (v4697(VarNext,bitIndex50)<->v4569(VarCurr,bitIndex542))& (v4697(VarNext,bitIndex49)<->v4569(VarCurr,bitIndex541))& (v4697(VarNext,bitIndex48)<->v4569(VarCurr,bitIndex540))& (v4697(VarNext,bitIndex47)<->v4569(VarCurr,bitIndex539))& (v4697(VarNext,bitIndex46)<->v4569(VarCurr,bitIndex538))& (v4697(VarNext,bitIndex45)<->v4569(VarCurr,bitIndex537))& (v4697(VarNext,bitIndex44)<->v4569(VarCurr,bitIndex536))& (v4697(VarNext,bitIndex43)<->v4569(VarCurr,bitIndex535))& (v4697(VarNext,bitIndex42)<->v4569(VarCurr,bitIndex534))& (v4697(VarNext,bitIndex41)<->v4569(VarCurr,bitIndex533))& (v4697(VarNext,bitIndex40)<->v4569(VarCurr,bitIndex532))& (v4697(VarNext,bitIndex39)<->v4569(VarCurr,bitIndex531))& (v4697(VarNext,bitIndex38)<->v4569(VarCurr,bitIndex530))& (v4697(VarNext,bitIndex37)<->v4569(VarCurr,bitIndex529))& (v4697(VarNext,bitIndex36)<->v4569(VarCurr,bitIndex528))& (v4697(VarNext,bitIndex35)<->v4569(VarCurr,bitIndex527))& (v4697(VarNext,bitIndex34)<->v4569(VarCurr,bitIndex526))& (v4697(VarNext,bitIndex33)<->v4569(VarCurr,bitIndex525))& (v4697(VarNext,bitIndex32)<->v4569(VarCurr,bitIndex524))& (v4697(VarNext,bitIndex31)<->v4569(VarCurr,bitIndex523))& (v4697(VarNext,bitIndex30)<->v4569(VarCurr,bitIndex522))& (v4697(VarNext,bitIndex29)<->v4569(VarCurr,bitIndex521))& (v4697(VarNext,bitIndex28)<->v4569(VarCurr,bitIndex520))& (v4697(VarNext,bitIndex27)<->v4569(VarCurr,bitIndex519))& (v4697(VarNext,bitIndex26)<->v4569(VarCurr,bitIndex518))& (v4697(VarNext,bitIndex25)<->v4569(VarCurr,bitIndex517))& (v4697(VarNext,bitIndex24)<->v4569(VarCurr,bitIndex516))& (v4697(VarNext,bitIndex23)<->v4569(VarCurr,bitIndex515))& 
(v4697(VarNext,bitIndex22)<->v4569(VarCurr,bitIndex514))& (v4697(VarNext,bitIndex21)<->v4569(VarCurr,bitIndex513))& (v4697(VarNext,bitIndex20)<->v4569(VarCurr,bitIndex512))& (v4697(VarNext,bitIndex19)<->v4569(VarCurr,bitIndex511))& (v4697(VarNext,bitIndex18)<->v4569(VarCurr,bitIndex510))& (v4697(VarNext,bitIndex17)<->v4569(VarCurr,bitIndex509))& (v4697(VarNext,bitIndex16)<->v4569(VarCurr,bitIndex508))& (v4697(VarNext,bitIndex15)<->v4569(VarCurr,bitIndex507))& (v4697(VarNext,bitIndex14)<->v4569(VarCurr,bitIndex506))& (v4697(VarNext,bitIndex13)<->v4569(VarCurr,bitIndex505))& (v4697(VarNext,bitIndex12)<->v4569(VarCurr,bitIndex504))& (v4697(VarNext,bitIndex11)<->v4569(VarCurr,bitIndex503))& (v4697(VarNext,bitIndex10)<->v4569(VarCurr,bitIndex502))& (v4697(VarNext,bitIndex9)<->v4569(VarCurr,bitIndex501))& (v4697(VarNext,bitIndex8)<->v4569(VarCurr,bitIndex500))& (v4697(VarNext,bitIndex7)<->v4569(VarCurr,bitIndex499))& (v4697(VarNext,bitIndex6)<->v4569(VarCurr,bitIndex498))& (v4697(VarNext,bitIndex5)<->v4569(VarCurr,bitIndex497))& (v4697(VarNext,bitIndex4)<->v4569(VarCurr,bitIndex496))& (v4697(VarNext,bitIndex3)<->v4569(VarCurr,bitIndex495))& (v4697(VarNext,bitIndex2)<->v4569(VarCurr,bitIndex494))& (v4697(VarNext,bitIndex1)<->v4569(VarCurr,bitIndex493))& (v4697(VarNext,bitIndex0)<->v4569(VarCurr,bitIndex492)))).
% 121.33/120.37  all VarNext (v4698(VarNext)-> (all B (range_122_0(B)-> (v4697(VarNext,B)<->v4725(VarNext,B))))).
% 121.33/120.37  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_122_0(B)-> (v4725(VarNext,B)<->v4723(VarCurr,B))))).
% 121.33/120.37  all VarCurr (-v4708(VarCurr)-> (all B (range_122_0(B)-> (v4723(VarCurr,B)<->v4726(VarCurr,B))))).
% 121.33/120.37  all VarCurr (v4708(VarCurr)-> (all B (range_122_0(B)-> (v4723(VarCurr,B)<->$F)))).
% 121.33/120.37  all VarCurr (-v4712(VarCurr)& -v4714(VarCurr)-> (all B (range_122_0(B)-> (v4726(VarCurr,B)<->v4690(VarCurr,B))))).
% 121.33/120.37  all VarCurr (v4714(VarCurr)-> (all B (range_122_0(B)-> (v4726(VarCurr,B)<->v4572(VarCurr,B))))).
% 121.33/120.37  all VarCurr (v4712(VarCurr)-> (v4726(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex491))& (v4726(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex490))& (v4726(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex489))& (v4726(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex488))& (v4726(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex487))& (v4726(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex486))& (v4726(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex485))& (v4726(VarCurr,bitIndex115)<->v4569(VarCurr,bitIndex484))& (v4726(VarCurr,bitIndex114)<->v4569(VarCurr,bitIndex483))& (v4726(VarCurr,bitIndex113)<->v4569(VarCurr,bitIndex482))& (v4726(VarCurr,bitIndex112)<->v4569(VarCurr,bitIndex481))& (v4726(VarCurr,bitIndex111)<->v4569(VarCurr,bitIndex480))& (v4726(VarCurr,bitIndex110)<->v4569(VarCurr,bitIndex479))& (v4726(VarCurr,bitIndex109)<->v4569(VarCurr,bitIndex478))& (v4726(VarCurr,bitIndex108)<->v4569(VarCurr,bitIndex477))& (v4726(VarCurr,bitIndex107)<->v4569(VarCurr,bitIndex476))& (v4726(VarCurr,bitIndex106)<->v4569(VarCurr,bitIndex475))& (v4726(VarCurr,bitIndex105)<->v4569(VarCurr,bitIndex474))& (v4726(VarCurr,bitIndex104)<->v4569(VarCurr,bitIndex473))& (v4726(VarCurr,bitIndex103)<->v4569(VarCurr,bitIndex472))& (v4726(VarCurr,bitIndex102)<->v4569(VarCurr,bitIndex471))& (v4726(VarCurr,bitIndex101)<->v4569(VarCurr,bitIndex470))& (v4726(VarCurr,bitIndex100)<->v4569(VarCurr,bitIndex469))& (v4726(VarCurr,bitIndex99)<->v4569(VarCurr,bitIndex468))& (v4726(VarCurr,bitIndex98)<->v4569(VarCurr,bitIndex467))& (v4726(VarCurr,bitIndex97)<->v4569(VarCurr,bitIndex466))& (v4726(VarCurr,bitIndex96)<->v4569(VarCurr,bitIndex465))& (v4726(VarCurr,bitIndex95)<->v4569(VarCurr,bitIndex464))& (v4726(VarCurr,bitIndex94)<->v4569(VarCurr,bitIndex463))& (v4726(VarCurr,bitIndex93)<->v4569(VarCurr,bitIndex462))& (v4726(VarCurr,bitIndex92)<->v4569(VarCurr,bitIndex461))& (v4726(VarCurr,bitIndex91)<->v4569(VarCurr,bitIndex460))& (v4726(VarCurr,bitIndex90)<->v4569(VarCurr,bitIndex459))& 
(v4726(VarCurr,bitIndex89)<->v4569(VarCurr,bitIndex458))& (v4726(VarCurr,bitIndex88)<->v4569(VarCurr,bitIndex457))& (v4726(VarCurr,bitIndex87)<->v4569(VarCurr,bitIndex456))& (v4726(VarCurr,bitIndex86)<->v4569(VarCurr,bitIndex455))& (v4726(VarCurr,bitIndex85)<->v4569(VarCurr,bitIndex454))& (v4726(VarCurr,bitIndex84)<->v4569(VarCurr,bitIndex453))& (v4726(VarCurr,bitIndex83)<->v4569(VarCurr,bitIndex452))& (v4726(VarCurr,bitIndex82)<->v4569(VarCurr,bitIndex451))& (v4726(VarCurr,bitIndex81)<->v4569(VarCurr,bitIndex450))& (v4726(VarCurr,bitIndex80)<->v4569(VarCurr,bitIndex449))& (v4726(VarCurr,bitIndex79)<->v4569(VarCurr,bitIndex448))& (v4726(VarCurr,bitIndex78)<->v4569(VarCurr,bitIndex447))& (v4726(VarCurr,bitIndex77)<->v4569(VarCurr,bitIndex446))& (v4726(VarCurr,bitIndex76)<->v4569(VarCurr,bitIndex445))& (v4726(VarCurr,bitIndex75)<->v4569(VarCurr,bitIndex444))& (v4726(VarCurr,bitIndex74)<->v4569(VarCurr,bitIndex443))& (v4726(VarCurr,bitIndex73)<->v4569(VarCurr,bitIndex442))& (v4726(VarCurr,bitIndex72)<->v4569(VarCurr,bitIndex441))& (v4726(VarCurr,bitIndex71)<->v4569(VarCurr,bitIndex440))& (v4726(VarCurr,bitIndex70)<->v4569(VarCurr,bitIndex439))& (v4726(VarCurr,bitIndex69)<->v4569(VarCurr,bitIndex438))& (v4726(VarCurr,bitIndex68)<->v4569(VarCurr,bitIndex437))& (v4726(VarCurr,bitIndex67)<->v4569(VarCurr,bitIndex436))& (v4726(VarCurr,bitIndex66)<->v4569(VarCurr,bitIndex435))& (v4726(VarCurr,bitIndex65)<->v4569(VarCurr,bitIndex434))& (v4726(VarCurr,bitIndex64)<->v4569(VarCurr,bitIndex433))& (v4726(VarCurr,bitIndex63)<->v4569(VarCurr,bitIndex432))& (v4726(VarCurr,bitIndex62)<->v4569(VarCurr,bitIndex431))& (v4726(VarCurr,bitIndex61)<->v4569(VarCurr,bitIndex430))& (v4726(VarCurr,bitIndex60)<->v4569(VarCurr,bitIndex429))& (v4726(VarCurr,bitIndex59)<->v4569(VarCurr,bitIndex428))& (v4726(VarCurr,bitIndex58)<->v4569(VarCurr,bitIndex427))& (v4726(VarCurr,bitIndex57)<->v4569(VarCurr,bitIndex426))& (v4726(VarCurr,bitIndex56)<->v4569(VarCurr,bitIndex425))& 
(v4726(VarCurr,bitIndex55)<->v4569(VarCurr,bitIndex424))& (v4726(VarCurr,bitIndex54)<->v4569(VarCurr,bitIndex423))& (v4726(VarCurr,bitIndex53)<->v4569(VarCurr,bitIndex422))& (v4726(VarCurr,bitIndex52)<->v4569(VarCurr,bitIndex421))& (v4726(VarCurr,bitIndex51)<->v4569(VarCurr,bitIndex420))& (v4726(VarCurr,bitIndex50)<->v4569(VarCurr,bitIndex419))& (v4726(VarCurr,bitIndex49)<->v4569(VarCurr,bitIndex418))& (v4726(VarCurr,bitIndex48)<->v4569(VarCurr,bitIndex417))& (v4726(VarCurr,bitIndex47)<->v4569(VarCurr,bitIndex416))& (v4726(VarCurr,bitIndex46)<->v4569(VarCurr,bitIndex415))& (v4726(VarCurr,bitIndex45)<->v4569(VarCurr,bitIndex414))& (v4726(VarCurr,bitIndex44)<->v4569(VarCurr,bitIndex413))& (v4726(VarCurr,bitIndex43)<->v4569(VarCurr,bitIndex412))& (v4726(VarCurr,bitIndex42)<->v4569(VarCurr,bitIndex411))& (v4726(VarCurr,bitIndex41)<->v4569(VarCurr,bitIndex410))& (v4726(VarCurr,bitIndex40)<->v4569(VarCurr,bitIndex409))& (v4726(VarCurr,bitIndex39)<->v4569(VarCurr,bitIndex408))& (v4726(VarCurr,bitIndex38)<->v4569(VarCurr,bitIndex407))& (v4726(VarCurr,bitIndex37)<->v4569(VarCurr,bitIndex406))& (v4726(VarCurr,bitIndex36)<->v4569(VarCurr,bitIndex405))& (v4726(VarCurr,bitIndex35)<->v4569(VarCurr,bitIndex404))& (v4726(VarCurr,bitIndex34)<->v4569(VarCurr,bitIndex403))& (v4726(VarCurr,bitIndex33)<->v4569(VarCurr,bitIndex402))& (v4726(VarCurr,bitIndex32)<->v4569(VarCurr,bitIndex401))& (v4726(VarCurr,bitIndex31)<->v4569(VarCurr,bitIndex400))& (v4726(VarCurr,bitIndex30)<->v4569(VarCurr,bitIndex399))& (v4726(VarCurr,bitIndex29)<->v4569(VarCurr,bitIndex398))& (v4726(VarCurr,bitIndex28)<->v4569(VarCurr,bitIndex397))& (v4726(VarCurr,bitIndex27)<->v4569(VarCurr,bitIndex396))& (v4726(VarCurr,bitIndex26)<->v4569(VarCurr,bitIndex395))& (v4726(VarCurr,bitIndex25)<->v4569(VarCurr,bitIndex394))& (v4726(VarCurr,bitIndex24)<->v4569(VarCurr,bitIndex393))& (v4726(VarCurr,bitIndex23)<->v4569(VarCurr,bitIndex392))& (v4726(VarCurr,bitIndex22)<->v4569(VarCurr,bitIndex391))& 
(v4726(VarCurr,bitIndex21)<->v4569(VarCurr,bitIndex390))& (v4726(VarCurr,bitIndex20)<->v4569(VarCurr,bitIndex389))& (v4726(VarCurr,bitIndex19)<->v4569(VarCurr,bitIndex388))& (v4726(VarCurr,bitIndex18)<->v4569(VarCurr,bitIndex387))& (v4726(VarCurr,bitIndex17)<->v4569(VarCurr,bitIndex386))& (v4726(VarCurr,bitIndex16)<->v4569(VarCurr,bitIndex385))& (v4726(VarCurr,bitIndex15)<->v4569(VarCurr,bitIndex384))& (v4726(VarCurr,bitIndex14)<->v4569(VarCurr,bitIndex383))& (v4726(VarCurr,bitIndex13)<->v4569(VarCurr,bitIndex382))& (v4726(VarCurr,bitIndex12)<->v4569(VarCurr,bitIndex381))& (v4726(VarCurr,bitIndex11)<->v4569(VarCurr,bitIndex380))& (v4726(VarCurr,bitIndex10)<->v4569(VarCurr,bitIndex379))& (v4726(VarCurr,bitIndex9)<->v4569(VarCurr,bitIndex378))& (v4726(VarCurr,bitIndex8)<->v4569(VarCurr,bitIndex377))& (v4726(VarCurr,bitIndex7)<->v4569(VarCurr,bitIndex376))& (v4726(VarCurr,bitIndex6)<->v4569(VarCurr,bitIndex375))& (v4726(VarCurr,bitIndex5)<->v4569(VarCurr,bitIndex374))& (v4726(VarCurr,bitIndex4)<->v4569(VarCurr,bitIndex373))& (v4726(VarCurr,bitIndex3)<->v4569(VarCurr,bitIndex372))& (v4726(VarCurr,bitIndex2)<->v4569(VarCurr,bitIndex371))& (v4726(VarCurr,bitIndex1)<->v4569(VarCurr,bitIndex370))& (v4726(VarCurr,bitIndex0)<->v4569(VarCurr,bitIndex369))).
% 121.33/120.37  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4698(VarNext)<->v4699(VarNext)&v4706(VarNext))).
% 121.33/120.37  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4706(VarNext)<->v4704(VarCurr))).
% 121.33/120.37  all VarCurr (v4704(VarCurr)<->v4707(VarCurr)&v4719(VarCurr)).
% 121.33/120.37  all VarCurr (v4719(VarCurr)<->v4720(VarCurr)|v4708(VarCurr)).
% 121.33/120.37  all VarCurr (-v4720(VarCurr)<->v4721(VarCurr)).
% 121.33/120.37  all VarCurr (v4721(VarCurr)<-> (v4722(VarCurr,bitIndex1)<->$F)& (v4722(VarCurr,bitIndex0)<->$F)).
% 121.33/120.37  all VarCurr (v4722(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.37  all VarCurr (v4722(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.38  all VarCurr (v4707(VarCurr)<->v4708(VarCurr)|v4709(VarCurr)).
% 121.33/120.38  all VarCurr (v4709(VarCurr)<->v4710(VarCurr)&v4718(VarCurr)).
% 121.33/120.38  all VarCurr (-v4718(VarCurr)<->v4708(VarCurr)).
% 121.33/120.38  all VarCurr (v4710(VarCurr)<->v4711(VarCurr)|v4716(VarCurr)).
% 121.33/120.38  all VarCurr (v4716(VarCurr)<-> (v4717(VarCurr,bitIndex1)<->$T)& (v4717(VarCurr,bitIndex0)<->$T)).
% 121.33/120.38  all VarCurr (v4717(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.38  all VarCurr (v4717(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.38  all VarCurr (v4711(VarCurr)<->v4712(VarCurr)|v4714(VarCurr)).
% 121.33/120.38  all VarCurr (v4714(VarCurr)<-> (v4715(VarCurr,bitIndex1)<->$T)& (v4715(VarCurr,bitIndex0)<->$F)).
% 121.33/120.38  all VarCurr (v4715(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.38  all VarCurr (v4715(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.38  all VarCurr (v4712(VarCurr)<-> (v4713(VarCurr,bitIndex1)<->$F)& (v4713(VarCurr,bitIndex0)<->$T)).
% 121.33/120.38  all VarCurr (v4713(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.38  all VarCurr (v4713(VarCurr,bitIndex1)<->v47(VarCurr)).
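The four case predicates above (v4721, v4712, v4714, v4716) each compare the same 2-bit selector, whose bit1 is v47 and bit0 is v554, against one constant pattern ($F$F, $F$T, $T$F, $T$T). A hedged sketch of that one-hot decode, with illustrative names:

```python
# Hypothetical sketch: decode a 2-bit selector (bit1 = v47, bit0 = v554)
# into the four mutually exclusive case flags used by the mux axioms above.

def decode_selector(bit1, bit0):
    """Map the selector bits to one-hot case flags keyed by bit pattern."""
    return {
        "00": (not bit1) and (not bit0),  # v4721-style: both bits false
        "01": (not bit1) and bit0,        # v4712-style: bit1=$F, bit0=$T
        "10": bit1 and (not bit0),        # v4714-style: bit1=$T, bit0=$F
        "11": bit1 and bit0,              # v4716-style: both bits true
    }

cases = decode_selector(True, False)
assert cases["10"]
assert not (cases["00"] or cases["01"] or cases["11"])
```

Exactly one flag holds for any selector value, which is what lets the surrounding axioms select exactly one data source for v4726.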
% 121.33/120.38  all VarCurr (-v4708(VarCurr)<->v43(VarCurr)).
% 121.33/120.38  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4699(VarNext)<->v4700(VarNext)&v4515(VarNext))).
% 121.33/120.38  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4700(VarNext)<->v4522(VarNext))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4690(VarCurr,B)<->v4695(VarCurr,B))).
% 121.33/120.38  all VarCurr (-v4692(VarCurr)-> (all B (range_122_0(B)-> (v4695(VarCurr,B)<->v4694(VarCurr,B))))).
% 121.33/120.38  all VarCurr (v4692(VarCurr)-> (all B (range_122_0(B)-> (v4695(VarCurr,B)<->v4624(VarCurr,B))))).
% 121.33/120.38  all VarCurr ((v4694(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex491))& (v4694(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex490))& (v4694(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex489))& (v4694(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex488))& (v4694(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex487))& (v4694(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex486))& (v4694(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex485))).
% 121.33/120.38  all VarCurr (v4692(VarCurr)<->v4576(VarCurr,bitIndex2)).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4572(VarCurr,B)<->v4688(VarCurr,B))).
% 121.33/120.38  all VarCurr (-v4574(VarCurr)-> (all B (range_122_0(B)-> (v4688(VarCurr,B)<->v4681(VarCurr,B))))).
% 121.33/120.38  all VarCurr (v4574(VarCurr)-> (all B (range_122_0(B)-> (v4688(VarCurr,B)<->v4624(VarCurr,B))))).
% 121.33/120.38  all VarCurr ((v4681(VarCurr,bitIndex122)<->v4569(VarCurr,bitIndex614))& (v4681(VarCurr,bitIndex121)<->v4569(VarCurr,bitIndex613))& (v4681(VarCurr,bitIndex120)<->v4569(VarCurr,bitIndex612))& (v4681(VarCurr,bitIndex119)<->v4569(VarCurr,bitIndex611))& (v4681(VarCurr,bitIndex118)<->v4569(VarCurr,bitIndex610))& (v4681(VarCurr,bitIndex117)<->v4569(VarCurr,bitIndex609))& (v4681(VarCurr,bitIndex116)<->v4569(VarCurr,bitIndex608))).
% 121.33/120.38  -v4569(constB0,bitIndex737).
% 121.33/120.38  -v4569(constB0,bitIndex736).
% 121.33/120.38  -v4569(constB0,bitIndex735).
% 121.33/120.38  -v4569(constB0,bitIndex734).
% 121.33/120.38  -v4569(constB0,bitIndex733).
% 121.33/120.38  -v4569(constB0,bitIndex732).
% 121.33/120.38  -v4569(constB0,bitIndex731).
% 121.33/120.38  -v4569(constB0,bitIndex696).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex122).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex121).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex120).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex119).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex118).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex117).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex116).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex81).
% 121.33/120.38  -v4569(constB0,bitIndex614).
% 121.33/120.38  -v4569(constB0,bitIndex613).
% 121.33/120.38  -v4569(constB0,bitIndex612).
% 121.33/120.38  -v4569(constB0,bitIndex611).
% 121.33/120.38  -v4569(constB0,bitIndex610).
% 121.33/120.38  -v4569(constB0,bitIndex609).
% 121.33/120.38  -v4569(constB0,bitIndex608).
% 121.33/120.38  -v4569(constB0,bitIndex573).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex122).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex121).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex120).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex119).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex118).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex117).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex116).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex81).
% 121.33/120.38  -v4569(constB0,bitIndex491).
% 121.33/120.38  -v4569(constB0,bitIndex490).
% 121.33/120.38  -v4569(constB0,bitIndex489).
% 121.33/120.38  -v4569(constB0,bitIndex488).
% 121.33/120.38  -v4569(constB0,bitIndex487).
% 121.33/120.38  -v4569(constB0,bitIndex486).
% 121.33/120.38  -v4569(constB0,bitIndex485).
% 121.33/120.38  -v4569(constB0,bitIndex450).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex122).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex121).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex120).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex119).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex118).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex117).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex116).
% 121.33/120.38  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex81).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4624(VarCurr,B)<->v4626(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4626(VarCurr,B)<->v4628(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4628(VarCurr,B)<->v4630(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4630(VarCurr,B)<->v4632(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4632(VarCurr,B)<->v4634(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4634(VarCurr,B)<->v4636(VarCurr,B))).
% 121.33/120.38  all VarCurr B (range_122_116(B)-> (v4636(VarCurr,B)<->v4638(VarCurr,B))).
% 121.33/120.38  all VarNext B (range_122_116(B)-> (v4638(VarNext,B)<->v4665(VarNext,B))).
% 121.33/120.38  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4666(VarNext)-> (all B (range_122_0(B)-> (v4665(VarNext,B)<->v4638(VarCurr,B)))))).
% 121.33/120.38  all VarNext (v4666(VarNext)-> (all B (range_122_0(B)-> (v4665(VarNext,B)<->v4676(VarNext,B))))).
% 121.33/120.38  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_122_0(B)-> (v4676(VarNext,B)<->v4674(VarCurr,B))))).
% 121.33/120.38  all VarCurr (-v4677(VarCurr)-> (all B (range_122_0(B)-> (v4674(VarCurr,B)<->v4642(VarCurr,B))))).
% 121.33/120.38  all VarCurr (v4677(VarCurr)-> (all B (range_122_0(B)-> (v4674(VarCurr,B)<->$F)))).
% 121.33/120.38  all B (range_122_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B|bitIndex67=B|bitIndex68=B|bitIndex69=B|bitIndex70=B|bitIndex71=B|bitIndex72=B|bitIndex73=B|bitIndex74=B|bitIndex75=B|bitIndex76=B|bitIndex77=B|bitIndex78=B|bitIndex79=B|bitIndex80=B|bitIndex81=B|bitIndex82=B|bitIndex83=B|bitIndex84=B|bitIndex85=B|bitIndex86=B|bitIndex87=B|bitIndex88=B|bitIndex89=B|bitIndex90=B|bitIndex91=B|bitIndex92=B|bitIndex93=B|bitIndex94=B|bitIndex95=B|bitIndex96=B|bitIndex97=B|bitIndex98=B|bitIndex99=B|bitIndex100=B|bitIndex101=B|bitIndex102=B|bitIndex103=B|bitIndex104=B|bitIndex105=B|bitIndex106=B|bitIndex107=B|bitIndex108=B|bitIndex109=B|bitIndex110=B|bitIndex111=B|bitIndex112=B|bitIndex113=B|bitIndex114=B|bitIndex115=B|bitIndex116=B|bitIndex117=B|bitIndex118=B|bitIndex119=B|bitIndex120=B|bitIndex121=B|bitIndex122=B).
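The range_122_0 axiom above is a fully expanded enumeration: B is in the range iff B equals one of bitIndex0 through bitIndex122. Such axioms are typically emitted mechanically; a small generator sketch in that spirit (TPTP-style text, illustrative only, not an actual TPTP tool API):

```python
# Hypothetical generator for an enumerated range axiom like range_122_0.

def range_axiom(hi, lo=0, name=None):
    """Build 'all B (range_hi_lo(B)<->bitIndexlo=B|...|bitIndexhi=B).'"""
    name = name or f"range_{hi}_{lo}"
    disjuncts = "|".join(f"bitIndex{i}=B" for i in range(lo, hi + 1))
    return f"all B ({name}(B)<->{disjuncts})."

ax = range_axiom(122)
assert ax.startswith("all B (range_122_0(B)<->bitIndex0=B|bitIndex1=B")
assert ax.endswith("bitIndex122=B).")
```

Keeping the range as a defined predicate lets the quantified slice axioms above stay compact instead of repeating 123 disjuncts at every use site.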
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex122).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex121).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex120).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex119).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex118).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex117).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex116).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex115).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex114).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex113).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex112).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex111).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex110).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex109).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex108).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex107).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex106).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex105).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex104).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex103).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex102).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex101).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex100).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex99).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex98).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex97).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex96).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex95).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex94).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex93).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex92).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex91).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex90).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex89).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex88).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex87).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex86).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex85).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex84).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex83).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex82).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex81).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex80).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex79).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex78).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex77).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex76).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex75).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex74).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex73).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex72).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex71).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex70).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex69).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex68).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex67).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex66).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex65).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex64).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 121.33/120.38  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 121.33/120.39  -b000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 121.33/120.39  all VarCurr (-v4677(VarCurr)<->v4640(VarCurr)).
% 121.33/120.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4666(VarNext)<->v4667(VarNext))).
% 121.33/120.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4667(VarNext)<->v4668(VarNext)&v4663(VarNext))).
% 121.33/120.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4668(VarNext)<->v4670(VarNext))).
% 121.33/120.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4670(VarNext)<->v4663(VarCurr))).
% 121.33/120.39  all VarCurr (v4663(VarCurr)<->v534(VarCurr)).
% 121.33/120.39  all VarCurr B (range_122_116(B)-> (v4642(VarCurr,B)<->v4652(VarCurr,B))).
% 121.33/120.39  all B (range_122_116(B)<->bitIndex116=B|bitIndex117=B|bitIndex118=B|bitIndex119=B|bitIndex120=B|bitIndex121=B|bitIndex122=B).
% 121.33/120.39  all VarCurr B (range_6_0(B)-> (v4652(VarCurr,B)<->v4661(VarCurr,B))).
% 121.33/120.39  all VarCurr ((v4652(VarCurr,bitIndex68)<->v4660(VarCurr,bitIndex61))& (v4652(VarCurr,bitIndex67)<->v4660(VarCurr,bitIndex60))& (v4652(VarCurr,bitIndex66)<->v4660(VarCurr,bitIndex59))& (v4652(VarCurr,bitIndex65)<->v4660(VarCurr,bitIndex58))& (v4652(VarCurr,bitIndex64)<->v4660(VarCurr,bitIndex57))& (v4652(VarCurr,bitIndex63)<->v4660(VarCurr,bitIndex56))& (v4652(VarCurr,bitIndex62)<->v4660(VarCurr,bitIndex55))& (v4652(VarCurr,bitIndex61)<->v4660(VarCurr,bitIndex54))& (v4652(VarCurr,bitIndex60)<->v4660(VarCurr,bitIndex53))& (v4652(VarCurr,bitIndex59)<->v4660(VarCurr,bitIndex52))& (v4652(VarCurr,bitIndex58)<->v4660(VarCurr,bitIndex51))& (v4652(VarCurr,bitIndex57)<->v4660(VarCurr,bitIndex50))& (v4652(VarCurr,bitIndex56)<->v4660(VarCurr,bitIndex49))& (v4652(VarCurr,bitIndex55)<->v4660(VarCurr,bitIndex48))& (v4652(VarCurr,bitIndex54)<->v4660(VarCurr,bitIndex47))& (v4652(VarCurr,bitIndex53)<->v4660(VarCurr,bitIndex46))& (v4652(VarCurr,bitIndex52)<->v4660(VarCurr,bitIndex45))& (v4652(VarCurr,bitIndex51)<->v4660(VarCurr,bitIndex44))& (v4652(VarCurr,bitIndex50)<->v4660(VarCurr,bitIndex43))& (v4652(VarCurr,bitIndex49)<->v4660(VarCurr,bitIndex42))& (v4652(VarCurr,bitIndex48)<->v4660(VarCurr,bitIndex41))& (v4652(VarCurr,bitIndex47)<->v4660(VarCurr,bitIndex40))& (v4652(VarCurr,bitIndex46)<->v4660(VarCurr,bitIndex39))& (v4652(VarCurr,bitIndex45)<->v4660(VarCurr,bitIndex38))& (v4652(VarCurr,bitIndex44)<->v4660(VarCurr,bitIndex37))& (v4652(VarCurr,bitIndex43)<->v4660(VarCurr,bitIndex36))& (v4652(VarCurr,bitIndex42)<->v4660(VarCurr,bitIndex35))& (v4652(VarCurr,bitIndex41)<->v4660(VarCurr,bitIndex34))& (v4652(VarCurr,bitIndex40)<->v4660(VarCurr,bitIndex33))& (v4652(VarCurr,bitIndex39)<->v4660(VarCurr,bitIndex32))& (v4652(VarCurr,bitIndex38)<->v4660(VarCurr,bitIndex31))& (v4652(VarCurr,bitIndex37)<->v4660(VarCurr,bitIndex30))& (v4652(VarCurr,bitIndex36)<->v4660(VarCurr,bitIndex29))& (v4652(VarCurr,bitIndex35)<->v4660(VarCurr,bitIndex28))& (v4652(VarCurr,bitIndex34)<->v4660(VarCurr,bitIndex27))& (v4652(VarCurr,bitIndex33)<->v4660(VarCurr,bitIndex26))& (v4652(VarCurr,bitIndex32)<->v4660(VarCurr,bitIndex25))& (v4652(VarCurr,bitIndex31)<->v4660(VarCurr,bitIndex24))& (v4652(VarCurr,bitIndex30)<->v4660(VarCurr,bitIndex23))& (v4652(VarCurr,bitIndex29)<->v4660(VarCurr,bitIndex22))& (v4652(VarCurr,bitIndex28)<->v4660(VarCurr,bitIndex21))& (v4652(VarCurr,bitIndex27)<->v4660(VarCurr,bitIndex20))& (v4652(VarCurr,bitIndex26)<->v4660(VarCurr,bitIndex19))& (v4652(VarCurr,bitIndex25)<->v4660(VarCurr,bitIndex18))& (v4652(VarCurr,bitIndex24)<->v4660(VarCurr,bitIndex17))& (v4652(VarCurr,bitIndex23)<->v4660(VarCurr,bitIndex16))& (v4652(VarCurr,bitIndex22)<->v4660(VarCurr,bitIndex15))& (v4652(VarCurr,bitIndex21)<->v4660(VarCurr,bitIndex14))& (v4652(VarCurr,bitIndex20)<->v4660(VarCurr,bitIndex13))& (v4652(VarCurr,bitIndex19)<->v4660(VarCurr,bitIndex12))& (v4652(VarCurr,bitIndex18)<->v4660(VarCurr,bitIndex11))& (v4652(VarCurr,bitIndex17)<->v4660(VarCurr,bitIndex10))& (v4652(VarCurr,bitIndex16)<->v4660(VarCurr,bitIndex9))& (v4652(VarCurr,bitIndex15)<->v4660(VarCurr,bitIndex8))& (v4652(VarCurr,bitIndex14)<->v4660(VarCurr,bitIndex7))& (v4652(VarCurr,bitIndex13)<->v4660(VarCurr,bitIndex6))& (v4652(VarCurr,bitIndex12)<->v4660(VarCurr,bitIndex5))& (v4652(VarCurr,bitIndex11)<->v4660(VarCurr,bitIndex4))& (v4652(VarCurr,bitIndex10)<->v4660(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex9)<->v4660(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex8)<->v4660(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex7)<->v4660(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex72)<->v4659(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex71)<->v4659(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex70)<->v4659(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex69)<->v4659(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex76)<->v4658(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex75)<->v4658(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex74)<->v4658(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex73)<->v4658(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex84)<->v4657(VarCurr,bitIndex7))& (v4652(VarCurr,bitIndex83)<->v4657(VarCurr,bitIndex6))& (v4652(VarCurr,bitIndex82)<->v4657(VarCurr,bitIndex5))& (v4652(VarCurr,bitIndex81)<->v4657(VarCurr,bitIndex4))& (v4652(VarCurr,bitIndex80)<->v4657(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex79)<->v4657(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex78)<->v4657(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex77)<->v4657(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex100)<->v4656(VarCurr,bitIndex15))& (v4652(VarCurr,bitIndex99)<->v4656(VarCurr,bitIndex14))& (v4652(VarCurr,bitIndex98)<->v4656(VarCurr,bitIndex13))& (v4652(VarCurr,bitIndex97)<->v4656(VarCurr,bitIndex12))& (v4652(VarCurr,bitIndex96)<->v4656(VarCurr,bitIndex11))& (v4652(VarCurr,bitIndex95)<->v4656(VarCurr,bitIndex10))& (v4652(VarCurr,bitIndex94)<->v4656(VarCurr,bitIndex9))& (v4652(VarCurr,bitIndex93)<->v4656(VarCurr,bitIndex8))& (v4652(VarCurr,bitIndex92)<->v4656(VarCurr,bitIndex7))& (v4652(VarCurr,bitIndex91)<->v4656(VarCurr,bitIndex6))& (v4652(VarCurr,bitIndex90)<->v4656(VarCurr,bitIndex5))& (v4652(VarCurr,bitIndex89)<->v4656(VarCurr,bitIndex4))& (v4652(VarCurr,bitIndex88)<->v4656(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex87)<->v4656(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex86)<->v4656(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex85)<->v4656(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex110)<->v4655(VarCurr,bitIndex9))& (v4652(VarCurr,bitIndex109)<->v4655(VarCurr,bitIndex8))& (v4652(VarCurr,bitIndex108)<->v4655(VarCurr,bitIndex7))& (v4652(VarCurr,bitIndex107)<->v4655(VarCurr,bitIndex6))& (v4652(VarCurr,bitIndex106)<->v4655(VarCurr,bitIndex5))& (v4652(VarCurr,bitIndex105)<->v4655(VarCurr,bitIndex4))& (v4652(VarCurr,bitIndex104)<->v4655(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex103)<->v4655(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex102)<->v4655(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex101)<->v4655(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex112)<->v4654(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex111)<->v4654(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex115)<->v4653(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex114)<->v4653(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex113)<->v4653(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr ((v4652(VarCurr,bitIndex122)<->v4644(VarCurr,bitIndex6))& (v4652(VarCurr,bitIndex121)<->v4644(VarCurr,bitIndex5))& (v4652(VarCurr,bitIndex120)<->v4644(VarCurr,bitIndex4))& (v4652(VarCurr,bitIndex119)<->v4644(VarCurr,bitIndex3))& (v4652(VarCurr,bitIndex118)<->v4644(VarCurr,bitIndex2))& (v4652(VarCurr,bitIndex117)<->v4644(VarCurr,bitIndex1))& (v4652(VarCurr,bitIndex116)<->v4644(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr (-v4648(VarCurr)-> (all B (range_6_0(B)-> (v4644(VarCurr,B)<->v170(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4648(VarCurr)-> (all B (range_6_0(B)-> (v4644(VarCurr,B)<->v4651(VarCurr,B))))).
% 121.33/120.40  all VarCurr (-v4649(VarCurr)-> (all B (range_6_0(B)-> (v4651(VarCurr,B)<->b1111000(B))))).
% 121.33/120.40  b1111000(bitIndex6).
% 121.33/120.40  b1111000(bitIndex5).
% 121.33/120.40  b1111000(bitIndex4).
% 121.33/120.40  b1111000(bitIndex3).
% 121.33/120.40  -b1111000(bitIndex2).
% 121.33/120.40  -b1111000(bitIndex1).
% 121.33/120.40  -b1111000(bitIndex0).
% 121.33/120.40  all VarCurr (v4649(VarCurr)-> (all B (range_6_0(B)-> (v4651(VarCurr,B)<->b1011000(B))))).
% 121.33/120.40  b1011000(bitIndex6).
% 121.33/120.40  -b1011000(bitIndex5).
% 121.33/120.40  b1011000(bitIndex4).
% 121.33/120.40  b1011000(bitIndex3).
% 121.33/120.40  -b1011000(bitIndex2).
% 121.33/120.40  -b1011000(bitIndex1).
% 121.33/120.40  -b1011000(bitIndex0).
% 121.33/120.40  all VarCurr (v4648(VarCurr)<->v4649(VarCurr)|v4650(VarCurr)).
% 121.33/120.40  all VarCurr (v4650(VarCurr)<->v173(VarCurr)&v370(VarCurr)).
% 121.33/120.40  all VarCurr (v4649(VarCurr)<->v101(VarCurr)&v355(VarCurr)).
% 121.33/120.40  all VarCurr (v4640(VarCurr)<->v67(VarCurr)).
% 121.33/120.40  all VarCurr (v4574(VarCurr)<->v4576(VarCurr,bitIndex2)).
% 121.33/120.40  all VarCurr (v4576(VarCurr,bitIndex2)<->v4591(VarCurr,bitIndex2)).
% 121.33/120.40  all VarNext (v4535(VarNext,bitIndex1)<->v4616(VarNext,bitIndex1)).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4617(VarNext)-> (all B (range_8_0(B)-> (v4616(VarNext,B)<->v4535(VarCurr,B)))))).
% 121.33/120.40  all VarNext (v4617(VarNext)-> (all B (range_8_0(B)-> (v4616(VarNext,B)<->v4588(VarNext,B))))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4617(VarNext)<->v4618(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4618(VarNext)<->v4620(VarNext)&v4515(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4620(VarNext)<->v4522(VarNext))).
% 121.33/120.40  all VarCurr (v4576(VarCurr,bitIndex1)<->v4591(VarCurr,bitIndex1)).
% 121.33/120.40  all VarNext (v4535(VarNext,bitIndex0)<->v4608(VarNext,bitIndex0)).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4609(VarNext)-> (all B (range_8_0(B)-> (v4608(VarNext,B)<->v4535(VarCurr,B)))))).
% 121.33/120.40  all VarNext (v4609(VarNext)-> (all B (range_8_0(B)-> (v4608(VarNext,B)<->v4588(VarNext,B))))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4609(VarNext)<->v4610(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4610(VarNext)<->v4612(VarNext)&v4515(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4612(VarNext)<->v4522(VarNext))).
% 121.33/120.40  all VarCurr (v4576(VarCurr,bitIndex0)<->v4591(VarCurr,bitIndex0)).
% 121.33/120.40  all VarCurr (-v4592(VarCurr)-> (all B (range_8_0(B)-> (v4591(VarCurr,B)<->v4594(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4592(VarCurr)-> (all B (range_8_0(B)-> (v4591(VarCurr,B)<->v4593(VarCurr,B))))).
% 121.33/120.40  all VarCurr (-v4595(VarCurr)& -v4597(VarCurr)& -v4601(VarCurr)-> (all B (range_8_0(B)-> (v4594(VarCurr,B)<->v4535(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4601(VarCurr)-> (all B (range_8_0(B)-> (v4594(VarCurr,B)<->v4603(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4597(VarCurr)-> (all B (range_8_0(B)-> (v4594(VarCurr,B)<->v4599(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4595(VarCurr)-> (all B (range_8_0(B)-> (v4594(VarCurr,B)<->v4535(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4605(VarCurr)<-> (v4606(VarCurr,bitIndex1)<->$T)& (v4606(VarCurr,bitIndex0)<->$T)).
% 121.33/120.40  all VarCurr (v4606(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.40  all VarCurr (v4606(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.40  all VarCurr (v4603(VarCurr,bitIndex0)<->$F).
% 121.33/120.40  all VarCurr ((v4603(VarCurr,bitIndex8)<->v4535(VarCurr,bitIndex7))& (v4603(VarCurr,bitIndex7)<->v4535(VarCurr,bitIndex6))& (v4603(VarCurr,bitIndex6)<->v4535(VarCurr,bitIndex5))& (v4603(VarCurr,bitIndex5)<->v4535(VarCurr,bitIndex4))& (v4603(VarCurr,bitIndex4)<->v4535(VarCurr,bitIndex3))& (v4603(VarCurr,bitIndex3)<->v4535(VarCurr,bitIndex2))& (v4603(VarCurr,bitIndex2)<->v4535(VarCurr,bitIndex1))& (v4603(VarCurr,bitIndex1)<->v4535(VarCurr,bitIndex0))).
% 121.33/120.40  all VarCurr (v4601(VarCurr)<-> (v4602(VarCurr,bitIndex1)<->$T)& (v4602(VarCurr,bitIndex0)<->$F)).
% 121.33/120.40  all VarCurr (v4602(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.40  all VarCurr (v4602(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.40  all VarCurr ((v4599(VarCurr,bitIndex7)<->v4535(VarCurr,bitIndex8))& (v4599(VarCurr,bitIndex6)<->v4535(VarCurr,bitIndex7))& (v4599(VarCurr,bitIndex5)<->v4535(VarCurr,bitIndex6))& (v4599(VarCurr,bitIndex4)<->v4535(VarCurr,bitIndex5))& (v4599(VarCurr,bitIndex3)<->v4535(VarCurr,bitIndex4))& (v4599(VarCurr,bitIndex2)<->v4535(VarCurr,bitIndex3))& (v4599(VarCurr,bitIndex1)<->v4535(VarCurr,bitIndex2))& (v4599(VarCurr,bitIndex0)<->v4535(VarCurr,bitIndex1))).
% 121.33/120.40  all VarCurr (v4599(VarCurr,bitIndex8)<->$F).
% 121.33/120.40  all VarCurr (v4597(VarCurr)<-> (v4598(VarCurr,bitIndex1)<->$F)& (v4598(VarCurr,bitIndex0)<->$T)).
% 121.33/120.40  all VarCurr (v4598(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.40  all VarCurr (v4598(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.40  all VarCurr (v4595(VarCurr)<-> (v4596(VarCurr,bitIndex1)<->$F)& (v4596(VarCurr,bitIndex0)<->$F)).
% 121.33/120.40  all VarCurr (v4596(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.33/120.40  all VarCurr (v4596(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.33/120.40  all VarCurr (v4593(VarCurr,bitIndex0)<->$T).
% 121.33/120.40  all VarCurr B (range_8_1(B)-> (v4593(VarCurr,B)<->v4533(VarCurr,B))).
% 121.33/120.40  all VarCurr (-v4592(VarCurr)<->v43(VarCurr)).
% 121.33/120.40  all VarCurr (v4533(VarCurr,bitIndex1)<->v4534(VarCurr,bitIndex1)).
% 121.33/120.40  all VarCurr (v4533(VarCurr,bitIndex2)<->v4534(VarCurr,bitIndex2)).
% 121.33/120.40  all VarNext (v4535(VarNext,bitIndex2)<->v4579(VarNext,bitIndex2)).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4580(VarNext)-> (all B (range_8_0(B)-> (v4579(VarNext,B)<->v4535(VarCurr,B)))))).
% 121.33/120.40  all VarNext (v4580(VarNext)-> (all B (range_8_0(B)-> (v4579(VarNext,B)<->v4588(VarNext,B))))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_8_0(B)-> (v4588(VarNext,B)<->v4586(VarCurr,B))))).
% 121.33/120.40  all VarCurr (-v4529(VarCurr)-> (all B (range_8_0(B)-> (v4586(VarCurr,B)<->v4576(VarCurr,B))))).
% 121.33/120.40  all VarCurr (v4529(VarCurr)-> (all B (range_8_0(B)-> (v4586(VarCurr,B)<->b000000001(B))))).
% 121.33/120.40  -b000000001(bitIndex8).
% 121.33/120.40  -b000000001(bitIndex7).
% 121.33/120.40  -b000000001(bitIndex6).
% 121.33/120.40  -b000000001(bitIndex5).
% 121.33/120.40  -b000000001(bitIndex4).
% 121.33/120.40  -b000000001(bitIndex3).
% 121.33/120.40  -b000000001(bitIndex2).
% 121.33/120.40  -b000000001(bitIndex1).
% 121.33/120.40  b000000001(bitIndex0).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4580(VarNext)<->v4581(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4581(VarNext)<->v4583(VarNext)&v4515(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4583(VarNext)<->v4522(VarNext))).
% 121.33/120.40  all VarCurr (v39(VarCurr)<->v41(VarCurr)).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4541(VarNext)-> (v41(VarNext)<->v41(VarCurr)))).
% 121.33/120.40  all VarNext (v4541(VarNext)-> (v41(VarNext)<->v4559(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4559(VarNext)<->v4557(VarCurr))).
% 121.33/120.40  all VarCurr (-v4556(VarCurr)-> (v4557(VarCurr)<->v4560(VarCurr))).
% 121.33/120.40  all VarCurr (v4556(VarCurr)-> (v4557(VarCurr)<->$T)).
% 121.33/120.40  all VarCurr (-v47(VarCurr)-> (v4560(VarCurr)<->$T)).
% 121.33/120.40  all VarCurr (v47(VarCurr)-> (v4560(VarCurr)<->$F)).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4541(VarNext)<->v4542(VarNext)&v4549(VarNext))).
% 121.33/120.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4549(VarNext)<->v4547(VarCurr))).
% 121.33/120.40  all VarCurr (v4547(VarCurr)<->v4550(VarCurr)|v4556(VarCurr)).
% 121.33/120.40  all VarCurr (-v4556(VarCurr)<->v43(VarCurr)).
% 121.33/120.40  all VarCurr (v4550(VarCurr)<->v4551(VarCurr)|v47(VarCurr)).
% 121.44/120.41  all VarCurr (v4551(VarCurr)<->v4552(VarCurr)&v4555(VarCurr)).
% 121.44/120.41  all VarCurr (v4555(VarCurr)<-> (v4533(VarCurr,bitIndex0)<->$T)).
% 121.44/120.41  all VarCurr (v4552(VarCurr)<->v4553(VarCurr)&v4554(VarCurr)).
% 121.44/120.41  all VarCurr (v4554(VarCurr)<-> (v480(VarCurr,bitIndex1)<->$F)).
% 121.44/120.41  all VarCurr (v4553(VarCurr)<-> (v554(VarCurr)<->$T)).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4542(VarNext)<->v4543(VarNext)&v4515(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4543(VarNext)<->v4522(VarNext))).
% 121.44/120.41  all VarCurr (v4533(VarCurr,bitIndex0)<->v4534(VarCurr,bitIndex0)).
% 121.44/120.41  all VarCurr (v4534(VarCurr,bitIndex0)<->$T).
% 121.44/120.41  all VarCurr B (range_8_1(B)-> (v4534(VarCurr,B)<->v4535(VarCurr,B))).
% 121.44/120.41  all B (range_8_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B).
% 121.44/120.41  -v4535(constB0,bitIndex3).
% 121.44/120.41  -v4535(constB0,bitIndex2).
% 121.44/120.41  -v4535(constB0,bitIndex1).
% 121.44/120.41  v4535(constB0,bitIndex0).
% 121.44/120.41  -bxxxxx0001(bitIndex3).
% 121.44/120.41  -bxxxxx0001(bitIndex2).
% 121.44/120.41  -bxxxxx0001(bitIndex1).
% 121.44/120.41  bxxxxx0001(bitIndex0).
% 121.44/120.41  all VarNext (v480(VarNext,bitIndex1)<->v4517(VarNext,bitIndex1)).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4518(VarNext)-> (all B (range_5_0(B)-> (v4517(VarNext,B)<->v480(VarCurr,B)))))).
% 121.44/120.41  all VarNext (v4518(VarNext)-> (all B (range_5_0(B)-> (v4517(VarNext,B)<->v4528(VarNext,B))))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_5_0(B)-> (v4528(VarNext,B)<->v4526(VarCurr,B))))).
% 121.44/120.41  all VarCurr (-v4529(VarCurr)-> (all B (range_5_0(B)-> (v4526(VarCurr,B)<->v4495(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4529(VarCurr)-> (all B (range_5_0(B)-> (v4526(VarCurr,B)<->$F)))).
% 121.44/120.41  all VarCurr (-v4529(VarCurr)<->v43(VarCurr)).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4518(VarNext)<->v4519(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4519(VarNext)<->v4520(VarNext)&v4515(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4520(VarNext)<->v4522(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4522(VarNext)<->v4515(VarCurr))).
% 121.44/120.41  all VarCurr (v4515(VarCurr)<->v4326(VarCurr)).
% 121.44/120.41  all VarCurr (v4495(VarCurr,bitIndex1)<->v4500(VarCurr,bitIndex1)).
% 121.44/120.41  all VarCurr (-v4501(VarCurr)-> (all B (range_5_0(B)-> (v4500(VarCurr,B)<->v4502(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4501(VarCurr)-> (all B (range_5_0(B)-> (v4500(VarCurr,B)<->$F)))).
% 121.44/120.41  all VarCurr (-v4503(VarCurr)& -v4505(VarCurr)& -v4509(VarCurr)-> (all B (range_5_0(B)-> (v4502(VarCurr,B)<->v480(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4509(VarCurr)-> (all B (range_5_0(B)-> (v4502(VarCurr,B)<->v4511(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4505(VarCurr)-> (all B (range_5_0(B)-> (v4502(VarCurr,B)<->v4507(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4503(VarCurr)-> (all B (range_5_0(B)-> (v4502(VarCurr,B)<->v480(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4512(VarCurr)<-> (v4513(VarCurr,bitIndex1)<->$T)& (v4513(VarCurr,bitIndex0)<->$T)).
% 121.44/120.41  all VarCurr (v4513(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.44/120.41  all VarCurr (v4513(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.44/120.41  all VarCurr (v4511(VarCurr,bitIndex0)<->$T).
% 121.44/120.41  all VarCurr B (range_5_1(B)-> (v4511(VarCurr,B)<->v4497(VarCurr,B))).
% 121.44/120.41  all B (range_5_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B).
% 121.44/120.41  all VarCurr (v4509(VarCurr)<-> (v4510(VarCurr,bitIndex1)<->$T)& (v4510(VarCurr,bitIndex0)<->$F)).
% 121.44/120.41  all VarCurr (v4510(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.44/120.41  all VarCurr (v4510(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.44/120.41  all VarCurr ((v4507(VarCurr,bitIndex4)<->v480(VarCurr,bitIndex5))& (v4507(VarCurr,bitIndex3)<->v480(VarCurr,bitIndex4))& (v4507(VarCurr,bitIndex2)<->v480(VarCurr,bitIndex3))& (v4507(VarCurr,bitIndex1)<->v480(VarCurr,bitIndex2))& (v4507(VarCurr,bitIndex0)<->v480(VarCurr,bitIndex1))).
% 121.44/120.41  all VarCurr (v4507(VarCurr,bitIndex5)<->$F).
% 121.44/120.41  all VarCurr (v4505(VarCurr)<-> (v4506(VarCurr,bitIndex1)<->$F)& (v4506(VarCurr,bitIndex0)<->$T)).
% 121.44/120.41  all VarCurr (v4506(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.44/120.41  all VarCurr (v4506(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.44/120.41  all VarCurr (v4503(VarCurr)<-> (v4504(VarCurr,bitIndex1)<->$F)& (v4504(VarCurr,bitIndex0)<->$F)).
% 121.44/120.41  all VarCurr (v4504(VarCurr,bitIndex0)<->v554(VarCurr)).
% 121.44/120.41  all VarCurr (v4504(VarCurr,bitIndex1)<->v47(VarCurr)).
% 121.44/120.41  all VarCurr (-v4501(VarCurr)<->v43(VarCurr)).
% 121.44/120.41  all VarCurr (v4497(VarCurr,bitIndex1)<->v4498(VarCurr,bitIndex1)).
% 121.44/120.41  all VarCurr (v4498(VarCurr,bitIndex0)<->$F).
% 121.44/120.41  all VarCurr ((v4498(VarCurr,bitIndex5)<->v480(VarCurr,bitIndex4))& (v4498(VarCurr,bitIndex4)<->v480(VarCurr,bitIndex3))& (v4498(VarCurr,bitIndex3)<->v480(VarCurr,bitIndex2))& (v4498(VarCurr,bitIndex2)<->v480(VarCurr,bitIndex1))& (v4498(VarCurr,bitIndex1)<->v480(VarCurr,bitIndex0))).
% 121.44/120.41  all VarCurr (v554(VarCurr)<->v556(VarCurr)).
% 121.44/120.41  all VarCurr (-v4481(VarCurr)-> (v556(VarCurr)<->$F)).
% 121.44/120.41  all VarCurr (v4481(VarCurr)-> (v556(VarCurr)<->v4490(VarCurr))).
% 121.44/120.41  all VarCurr (-v4483(VarCurr)-> (v4490(VarCurr)<->$F)).
% 121.44/120.41  all VarCurr (v4483(VarCurr)-> (v4490(VarCurr)<->v4491(VarCurr))).
% 121.44/120.41  all VarCurr (-v4486(VarCurr)& -v4270(VarCurr)-> (v4491(VarCurr)<->$T)).
% 121.44/120.41  all VarCurr (v4270(VarCurr)-> (v4491(VarCurr)<->v4493(VarCurr))).
% 121.44/120.41  all VarCurr (v4486(VarCurr)-> (v4491(VarCurr)<->v4492(VarCurr))).
% 121.44/120.41  all VarCurr (-v666(VarCurr)-> (v4493(VarCurr)<->$F)).
% 121.44/120.41  all VarCurr (v666(VarCurr)-> (v4493(VarCurr)<->$T)).
% 121.44/120.41  all VarCurr (-v4421(VarCurr)-> (v4492(VarCurr)<->$F)).
% 121.44/120.41  all VarCurr (v4421(VarCurr)-> (v4492(VarCurr)<->$T)).
% 121.44/120.41  all VarCurr (v4481(VarCurr)<->v4482(VarCurr)&v4489(VarCurr)).
% 121.44/120.41  all VarCurr (-v4489(VarCurr)<->v4274(VarCurr)).
% 121.44/120.41  all VarCurr (v4482(VarCurr)<->v4483(VarCurr)|v4488(VarCurr)).
% 121.44/120.41  all VarCurr (-v4488(VarCurr)<->v4271(VarCurr)).
% 121.44/120.41  all VarCurr (v4483(VarCurr)<->v4484(VarCurr)&v4271(VarCurr)).
% 121.44/120.41  all VarCurr (v4484(VarCurr)<->v4485(VarCurr)|v4487(VarCurr)).
% 121.44/120.41  all VarCurr (v4487(VarCurr)<-> (v37(VarCurr,bitIndex1)<->$T)& (v37(VarCurr,bitIndex0)<->$F)).
% 121.44/120.41  all VarCurr (v4485(VarCurr)<->v4486(VarCurr)|v4270(VarCurr)).
% 121.44/120.41  all VarCurr (v4486(VarCurr)<-> (v37(VarCurr,bitIndex1)<->$F)& (v37(VarCurr,bitIndex0)<->$F)).
% 121.44/120.41  all VarCurr (v4421(VarCurr)<->v4458(VarCurr)|v4423(VarCurr,bitIndex2)).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4466(VarNext)-> (all B (range_2_0(B)-> (v4423(VarNext,B)<->v4423(VarCurr,B)))))).
% 121.44/120.41  all VarNext (v4466(VarNext)-> (all B (range_2_0(B)-> (v4423(VarNext,B)<->v4474(VarNext,B))))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v4474(VarNext,B)<->v4472(VarCurr,B))))).
% 121.44/120.41  all VarCurr (-v4475(VarCurr)-> (all B (range_2_0(B)-> (v4472(VarCurr,B)<->v4425(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4475(VarCurr)-> (all B (range_2_0(B)-> (v4472(VarCurr,B)<->b100(B))))).
% 121.44/120.41  all VarCurr (-v4475(VarCurr)<->v45(VarCurr)).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4466(VarNext)<->v4467(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4467(VarNext)<->v4468(VarNext)&v4326(VarNext))).
% 121.44/120.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4468(VarNext)<->v4333(VarNext))).
% 121.44/120.41  all VarCurr (-v4431(VarCurr)& -v4447(VarCurr)-> (all B (range_2_0(B)-> (v4425(VarCurr,B)<->v4423(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4447(VarCurr)-> (all B (range_2_0(B)-> (v4425(VarCurr,B)<->v4449(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4431(VarCurr)-> (all B (range_2_0(B)-> (v4425(VarCurr,B)<->v4433(VarCurr,B))))).
% 121.44/120.41  all VarCurr (v4459(VarCurr)<->v4460(VarCurr)|v4462(VarCurr)).
% 121.44/120.41  all VarCurr (v4462(VarCurr)<-> (v4463(VarCurr,bitIndex1)<->$T)& (v4463(VarCurr,bitIndex0)<->$T)).
% 121.44/120.41  all VarCurr (v4463(VarCurr,bitIndex0)<->v35(VarCurr)).
% 121.44/120.41  all VarCurr (v4463(VarCurr,bitIndex1)<->v4427(VarCurr)).
% 121.44/120.41  all VarCurr (v4460(VarCurr)<-> (v4461(VarCurr,bitIndex1)<->$F)& (v4461(VarCurr,bitIndex0)<->$F)).
% 121.44/120.41  all VarCurr (v4461(VarCurr,bitIndex0)<->v35(VarCurr)).
% 121.44/120.41  all VarCurr (v4461(VarCurr,bitIndex1)<->v4427(VarCurr)).
% 121.44/120.41  all VarCurr (v4449(VarCurr,bitIndex0)<->v4445(VarCurr)).
% 121.44/120.41  all VarCurr (v4449(VarCurr,bitIndex1)<->v4456(VarCurr)).
% 121.44/120.41  all VarCurr (v4449(VarCurr,bitIndex2)<->v4451(VarCurr)).
% 121.44/120.41  all VarCurr (v4456(VarCurr)<->v4457(VarCurr)&v4458(VarCurr)).
% 121.44/120.41  all VarCurr (v4458(VarCurr)<->v4423(VarCurr,bitIndex0)|v4423(VarCurr,bitIndex1)).
% 121.44/120.41  all VarCurr (v4457(VarCurr)<->v4445(VarCurr)|v4440(VarCurr)).
% 121.44/120.41  all VarCurr (v4451(VarCurr)<->v4452(VarCurr)&v4455(VarCurr)).
% 121.44/120.41  all VarCurr (v4455(VarCurr)<->v4423(VarCurr,bitIndex2)|v4454(VarCurr)).
% 121.44/120.41  all VarCurr (v4452(VarCurr)<->v4442(VarCurr)|v4453(VarCurr)).
% 121.44/120.41  all VarCurr (-v4453(VarCurr)<->v4454(VarCurr)).
% 121.44/120.41  all VarCurr (v4454(VarCurr)<->v4423(VarCurr,bitIndex0)&v4423(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (v4447(VarCurr)<-> (v4448(VarCurr,bitIndex1)<->$T)& (v4448(VarCurr,bitIndex0)<->$F)).
% 121.44/120.42  all VarCurr (v4448(VarCurr,bitIndex0)<->v35(VarCurr)).
% 121.44/120.42  all VarCurr (v4448(VarCurr,bitIndex1)<->v4427(VarCurr)).
% 121.44/120.42  all VarCurr (v4433(VarCurr,bitIndex0)<->v4445(VarCurr)).
% 121.44/120.42  all VarCurr (v4433(VarCurr,bitIndex1)<->v4443(VarCurr)).
% 121.44/120.42  all VarCurr (v4433(VarCurr,bitIndex2)<->v4435(VarCurr)).
% 121.44/120.42  all VarCurr (v4443(VarCurr)<->v4444(VarCurr)&v4446(VarCurr)).
% 121.44/120.42  all VarCurr (v4446(VarCurr)<->v4423(VarCurr,bitIndex0)|v4440(VarCurr)).
% 121.44/120.42  all VarCurr (v4444(VarCurr)<->v4445(VarCurr)|v4423(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (-v4445(VarCurr)<->v4423(VarCurr,bitIndex0)).
% 121.44/120.42  all VarCurr (v4435(VarCurr)<->v4436(VarCurr)&v4441(VarCurr)).
% 121.44/120.42  all VarCurr (v4441(VarCurr)<->v4438(VarCurr)|v4442(VarCurr)).
% 121.44/120.42  all VarCurr (-v4442(VarCurr)<->v4423(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (v4436(VarCurr)<->v4437(VarCurr)|v4423(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (-v4437(VarCurr)<->v4438(VarCurr)).
% 121.44/120.42  all VarCurr (v4438(VarCurr)<->v4423(VarCurr,bitIndex1)|v4439(VarCurr)).
% 121.44/120.42  all VarCurr (v4439(VarCurr)<->v4423(VarCurr,bitIndex0)&v4440(VarCurr)).
% 121.44/120.42  all VarCurr (-v4440(VarCurr)<->v4423(VarCurr,bitIndex1)).
% 121.44/120.42  v4423(constB0,bitIndex2).
% 121.44/120.42  -v4423(constB0,bitIndex1).
% 121.44/120.42  -v4423(constB0,bitIndex0).
% 121.44/120.42  all VarCurr (v4431(VarCurr)<-> (v4432(VarCurr,bitIndex1)<->$F)& (v4432(VarCurr,bitIndex0)<->$T)).
% 121.44/120.42  all VarCurr (v4432(VarCurr,bitIndex0)<->v35(VarCurr)).
% 121.44/120.42  all VarCurr (v4432(VarCurr,bitIndex1)<->v4427(VarCurr)).
% 121.44/120.42  all VarCurr (v4427(VarCurr)<->v4429(VarCurr)).
% 121.44/120.42  all VarCurr (v4429(VarCurr)<->v2157(VarCurr)).
% 121.44/120.42  all VarCurr (v560(VarCurr)<->v562(VarCurr)).
% 121.44/120.42  all VarCurr (v562(VarCurr)<-> (v564(VarCurr,bitIndex2)<->$F)& (v564(VarCurr,bitIndex1)<->$F)& (v564(VarCurr,bitIndex0)<->$F)).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4405(VarNext)-> (all B (range_2_0(B)-> (v564(VarNext,B)<->v564(VarCurr,B)))))).
% 121.44/120.42  all VarNext (v4405(VarNext)-> (all B (range_2_0(B)-> (v564(VarNext,B)<->v4415(VarNext,B))))).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v4415(VarNext,B)<->v4413(VarCurr,B))))).
% 121.44/120.42  all VarCurr (-v4416(VarCurr)-> (all B (range_2_0(B)-> (v4413(VarCurr,B)<->v569(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4416(VarCurr)-> (all B (range_2_0(B)-> (v4413(VarCurr,B)<->$F)))).
% 121.44/120.42  all VarCurr (-v4416(VarCurr)<->v566(VarCurr)).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4405(VarNext)<->v4406(VarNext))).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4406(VarNext)<->v4407(VarNext)&v4402(VarNext))).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4407(VarNext)<->v4409(VarNext))).
% 121.44/120.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4409(VarNext)<->v4402(VarCurr))).
% 121.44/120.42  all VarCurr (v4402(VarCurr)<->v4326(VarCurr)).
% 121.44/120.42  all VarCurr (-v4353(VarCurr)& -v4355(VarCurr)& -v4382(VarCurr)-> (all B (range_2_0(B)-> (v569(VarCurr,B)<->v564(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4382(VarCurr)-> (all B (range_2_0(B)-> (v569(VarCurr,B)<->v4384(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4355(VarCurr)-> (all B (range_2_0(B)-> (v569(VarCurr,B)<->v4357(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4353(VarCurr)-> (all B (range_2_0(B)-> (v569(VarCurr,B)<->v564(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4399(VarCurr)<-> (v4400(VarCurr,bitIndex1)<->$T)& (v4400(VarCurr,bitIndex0)<->$T)).
% 121.44/120.42  all VarCurr (v4400(VarCurr,bitIndex0)<->v660(VarCurr)).
% 121.44/120.42  all VarCurr (v4400(VarCurr,bitIndex1)<->v571(VarCurr)).
% 121.44/120.42  all VarCurr (-v4385(VarCurr)-> (all B (range_2_0(B)-> (v4384(VarCurr,B)<->v4386(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4385(VarCurr)-> (all B (range_2_0(B)-> (v4384(VarCurr,B)<->b100(B))))).
% 121.44/120.42  all VarCurr (v4386(VarCurr,bitIndex0)<->v4396(VarCurr)).
% 121.44/120.42  all VarCurr (v4386(VarCurr,bitIndex1)<->v4394(VarCurr)).
% 121.44/120.42  all VarCurr (v4386(VarCurr,bitIndex2)<->v4388(VarCurr)).
% 121.44/120.42  all VarCurr (v4394(VarCurr)<->v4395(VarCurr)&v4398(VarCurr)).
% 121.44/120.42  all VarCurr (v4398(VarCurr)<->v564(VarCurr,bitIndex0)|v564(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (v4395(VarCurr)<->v4396(VarCurr)|v4397(VarCurr)).
% 121.44/120.42  all VarCurr (-v4397(VarCurr)<->v564(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (-v4396(VarCurr)<->v564(VarCurr,bitIndex0)).
% 121.44/120.42  all VarCurr (v4388(VarCurr)<->v4389(VarCurr)&v4393(VarCurr)).
% 121.44/120.42  all VarCurr (v4393(VarCurr)<->v4391(VarCurr)|v564(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (v4389(VarCurr)<->v4390(VarCurr)|v4392(VarCurr)).
% 121.44/120.42  all VarCurr (-v4392(VarCurr)<->v564(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (-v4390(VarCurr)<->v4391(VarCurr)).
% 121.44/120.42  all VarCurr (v4391(VarCurr)<->v564(VarCurr,bitIndex0)&v564(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (v4385(VarCurr)<-> (v564(VarCurr,bitIndex2)<->$T)& (v564(VarCurr,bitIndex1)<->$F)& (v564(VarCurr,bitIndex0)<->$F)).
% 121.44/120.42  all VarCurr (v4382(VarCurr)<-> (v4383(VarCurr,bitIndex1)<->$T)& (v4383(VarCurr,bitIndex0)<->$F)).
% 121.44/120.42  all VarCurr (v4383(VarCurr,bitIndex0)<->v660(VarCurr)).
% 121.44/120.42  all VarCurr (v4383(VarCurr,bitIndex1)<->v571(VarCurr)).
% 121.44/120.42  all VarCurr (-v4358(VarCurr)-> (all B (range_31_0(B)-> (v4357(VarCurr,B)<->v4359(VarCurr,B))))).
% 121.44/120.42  all VarCurr (v4358(VarCurr)-> (all B (range_31_0(B)-> (v4357(VarCurr,B)<->$F)))).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex4)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex5)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex6)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex7)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex8)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex9)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex10)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex11)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex12)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex13)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex14)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex15)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex16)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex17)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex18)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex19)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex20)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex21)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex22)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex23)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex24)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex25)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex26)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex27)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex28)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex29)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex30)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4359(VarCurr,bitIndex31)<->v4360(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr B (range_3_0(B)-> (v4359(VarCurr,B)<->v4360(VarCurr,B))).
% 121.44/120.42  all VarCurr (v4360(VarCurr,bitIndex0)<->v4380(VarCurr)).
% 121.44/120.42  all VarCurr (v4360(VarCurr,bitIndex1)<->v4378(VarCurr)).
% 121.44/120.42  all VarCurr (v4360(VarCurr,bitIndex2)<->v4374(VarCurr)).
% 121.44/120.42  all VarCurr (v4360(VarCurr,bitIndex3)<->v4362(VarCurr)).
% 121.44/120.42  all VarCurr (v4378(VarCurr)<->v4379(VarCurr)&v4381(VarCurr)).
% 121.44/120.42  all VarCurr (v4381(VarCurr)<->v4366(VarCurr,bitIndex0)|v4370(VarCurr)).
% 121.44/120.42  all VarCurr (v4379(VarCurr)<->v4380(VarCurr)|v4366(VarCurr,bitIndex1)).
% 121.44/120.42  all VarCurr (-v4380(VarCurr)<->v4366(VarCurr,bitIndex0)).
% 121.44/120.42  all VarCurr (v4374(VarCurr)<->v4375(VarCurr)&v4377(VarCurr)).
% 121.44/120.42  all VarCurr (v4377(VarCurr)<->v4368(VarCurr)|v4371(VarCurr)).
% 121.44/120.42  all VarCurr (v4375(VarCurr)<->v4376(VarCurr)|v4366(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (-v4376(VarCurr)<->v4368(VarCurr)).
% 121.44/120.42  all VarCurr (v4362(VarCurr)<->v4363(VarCurr)&v4372(VarCurr)).
% 121.44/120.42  all VarCurr (v4372(VarCurr)<->v4365(VarCurr)|v4373(VarCurr)).
% 121.44/120.42  all VarCurr (-v4373(VarCurr)<->v4366(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (v4363(VarCurr)<->v4364(VarCurr)|v4366(VarCurr,bitIndex3)).
% 121.44/120.42  all VarCurr (-v4364(VarCurr)<->v4365(VarCurr)).
% 121.44/120.42  all VarCurr (v4365(VarCurr)<->v4366(VarCurr,bitIndex2)|v4367(VarCurr)).
% 121.44/120.42  all VarCurr (v4367(VarCurr)<->v4368(VarCurr)&v4371(VarCurr)).
% 121.44/120.42  all VarCurr (-v4371(VarCurr)<->v4366(VarCurr,bitIndex2)).
% 121.44/120.42  all VarCurr (v4368(VarCurr)<->v4366(VarCurr,bitIndex1)|v4369(VarCurr)).
% 121.44/120.42  all VarCurr (v4369(VarCurr)<->v4366(VarCurr,bitIndex0)&v4370(VarCurr)).
% 121.44/120.43  all VarCurr (-v4370(VarCurr)<->v4366(VarCurr,bitIndex1)).
% 121.44/120.43  all VarCurr (-v4366(VarCurr,bitIndex3)).
% 121.44/120.43  all VarCurr B (range_2_0(B)-> (v4366(VarCurr,B)<->v564(VarCurr,B))).
% 121.44/120.43  all VarCurr (v4358(VarCurr)<-> (v564(VarCurr,bitIndex2)<->$F)& (v564(VarCurr,bitIndex1)<->$F)& (v564(VarCurr,bitIndex0)<->$F)).
% 121.44/120.43  all VarCurr (v4355(VarCurr)<-> (v4356(VarCurr,bitIndex1)<->$F)& (v4356(VarCurr,bitIndex0)<->$T)).
% 121.44/120.43  all VarCurr (v4356(VarCurr,bitIndex0)<->v660(VarCurr)).
% 121.44/120.43  all VarCurr (v4356(VarCurr,bitIndex1)<->v571(VarCurr)).
% 121.44/120.43  all B (range_2_0(B)-> (v564(constB0,B)<->$F)).
% 121.44/120.43  all VarCurr (v4353(VarCurr)<-> (v4354(VarCurr,bitIndex1)<->$F)& (v4354(VarCurr,bitIndex0)<->$F)).
% 121.44/120.43  all VarCurr (v4354(VarCurr,bitIndex0)<->v660(VarCurr)).
% 121.44/120.43  all VarCurr (v4354(VarCurr,bitIndex1)<->v571(VarCurr)).
% 121.44/120.43  all VarCurr (v660(VarCurr)<->v662(VarCurr)).
% 121.44/120.43  all VarCurr (v662(VarCurr)<->v664(VarCurr)).
% 121.44/120.43  all VarCurr (-v4347(VarCurr)-> (v664(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v4347(VarCurr)-> (v664(VarCurr)<->v4351(VarCurr))).
% 121.44/120.43  all VarCurr (-v4274(VarCurr)-> (v4351(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v4274(VarCurr)-> (v4351(VarCurr)<->$T)).
% 121.44/120.43  all VarCurr (v4347(VarCurr)<->v4274(VarCurr)|v4348(VarCurr)).
% 121.44/120.43  all VarCurr (v4348(VarCurr)<->v4349(VarCurr)&v4350(VarCurr)).
% 121.44/120.43  all VarCurr (-v4350(VarCurr)<->v4274(VarCurr)).
% 121.44/120.43  all VarCurr (-v4349(VarCurr)<->v4271(VarCurr)).
% 121.44/120.43  all VarCurr (v666(VarCurr)<->v4345(VarCurr)|v668(VarCurr,bitIndex3)).
% 121.44/120.43  all VarCurr (v4345(VarCurr)<->v4319(VarCurr)|v668(VarCurr,bitIndex2)).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4329(VarNext)-> (all B (range_3_0(B)-> (v668(VarNext,B)<->v668(VarCurr,B)))))).
% 121.44/120.43  all VarNext (v4329(VarNext)-> (all B (range_3_0(B)-> (v668(VarNext,B)<->v4339(VarNext,B))))).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v4339(VarNext,B)<->v4337(VarCurr,B))))).
% 121.44/120.43  all VarCurr (-v4340(VarCurr)-> (all B (range_3_0(B)-> (v4337(VarCurr,B)<->v671(VarCurr,B))))).
% 121.44/120.43  all VarCurr (v4340(VarCurr)-> (all B (range_3_0(B)-> (v4337(VarCurr,B)<->b1000(B))))).
% 121.44/120.43  all VarCurr (-v4340(VarCurr)<->v45(VarCurr)).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4329(VarNext)<->v4330(VarNext))).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4330(VarNext)<->v4331(VarNext)&v4326(VarNext))).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4331(VarNext)<->v4333(VarNext))).
% 121.44/120.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4333(VarNext)<->v4326(VarCurr))).
% 121.44/120.43  all VarCurr (v4326(VarCurr)<->v803(VarCurr)).
% 121.44/120.43  all VarCurr (-v4280(VarCurr)& -v4303(VarCurr)-> (all B (range_3_0(B)-> (v671(VarCurr,B)<->v668(VarCurr,B))))).
% 121.44/120.43  all VarCurr (v4303(VarCurr)-> (all B (range_3_0(B)-> (v671(VarCurr,B)<->v4305(VarCurr,B))))).
% 121.44/120.43  all VarCurr (v4280(VarCurr)-> (all B (range_3_0(B)-> (v671(VarCurr,B)<->v4282(VarCurr,B))))).
% 121.44/120.43  all VarCurr (v4320(VarCurr)<->v4321(VarCurr)|v4323(VarCurr)).
% 121.44/120.43  all VarCurr (v4323(VarCurr)<-> (v4324(VarCurr,bitIndex1)<->$T)& (v4324(VarCurr,bitIndex0)<->$T)).
% 121.44/120.43  all VarCurr (v4324(VarCurr,bitIndex0)<->v4263(VarCurr)).
% 121.44/120.43  all VarCurr (v4324(VarCurr,bitIndex1)<->v673(VarCurr)).
% 121.44/120.43  all VarCurr (v4321(VarCurr)<-> (v4322(VarCurr,bitIndex1)<->$F)& (v4322(VarCurr,bitIndex0)<->$F)).
% 121.44/120.43  all VarCurr (v4322(VarCurr,bitIndex0)<->v4263(VarCurr)).
% 121.44/120.43  all VarCurr (v4322(VarCurr,bitIndex1)<->v673(VarCurr)).
% 121.44/120.43  all VarCurr (v4305(VarCurr,bitIndex0)<->v4301(VarCurr)).
% 121.44/120.43  all VarCurr (v4305(VarCurr,bitIndex1)<->v4317(VarCurr)).
% 121.44/120.43  all VarCurr (v4305(VarCurr,bitIndex2)<->v4313(VarCurr)).
% 121.44/120.43  all VarCurr (v4305(VarCurr,bitIndex3)<->v4307(VarCurr)).
% 121.44/120.43  all VarCurr (v4317(VarCurr)<->v4318(VarCurr)&v4319(VarCurr)).
% 121.44/120.43  all VarCurr (v4319(VarCurr)<->v668(VarCurr,bitIndex0)|v668(VarCurr,bitIndex1)).
% 121.44/120.43  all VarCurr (v4318(VarCurr)<->v4301(VarCurr)|v4291(VarCurr)).
% 121.44/120.43  all VarCurr (v4313(VarCurr)<->v4314(VarCurr)&v4316(VarCurr)).
% 121.44/120.43  all VarCurr (v4316(VarCurr)<->v668(VarCurr,bitIndex2)|v4311(VarCurr)).
% 121.44/120.43  all VarCurr (v4314(VarCurr)<->v4292(VarCurr)|v4315(VarCurr)).
% 121.44/120.43  all VarCurr (-v4315(VarCurr)<->v4311(VarCurr)).
% 121.44/120.43  all VarCurr (v4307(VarCurr)<->v4308(VarCurr)&v4312(VarCurr)).
% 121.44/120.43  all VarCurr (v4312(VarCurr)<->v668(VarCurr,bitIndex3)|v4310(VarCurr)).
% 121.44/120.43  all VarCurr (v4308(VarCurr)<->v4294(VarCurr)|v4309(VarCurr)).
% 121.44/120.43  all VarCurr (-v4309(VarCurr)<->v4310(VarCurr)).
% 121.44/120.43  all VarCurr (v4310(VarCurr)<->v668(VarCurr,bitIndex2)&v4311(VarCurr)).
% 121.44/120.43  all VarCurr (v4311(VarCurr)<->v668(VarCurr,bitIndex0)&v668(VarCurr,bitIndex1)).
% 121.44/120.43  all VarCurr (v4303(VarCurr)<-> (v4304(VarCurr,bitIndex1)<->$T)& (v4304(VarCurr,bitIndex0)<->$F)).
% 121.44/120.43  all VarCurr (v4304(VarCurr,bitIndex0)<->v4263(VarCurr)).
% 121.44/120.43  all VarCurr (v4304(VarCurr,bitIndex1)<->v673(VarCurr)).
% 121.44/120.43  all VarCurr (v4282(VarCurr,bitIndex0)<->v4301(VarCurr)).
% 121.44/120.43  all VarCurr (v4282(VarCurr,bitIndex1)<->v4299(VarCurr)).
% 121.44/120.43  all VarCurr (v4282(VarCurr,bitIndex2)<->v4295(VarCurr)).
% 121.44/120.43  all VarCurr (v4282(VarCurr,bitIndex3)<->v4284(VarCurr)).
% 121.44/120.43  all VarCurr (v4299(VarCurr)<->v4300(VarCurr)&v4302(VarCurr)).
% 121.44/120.43  all VarCurr (v4302(VarCurr)<->v668(VarCurr,bitIndex0)|v4291(VarCurr)).
% 121.44/120.43  all VarCurr (v4300(VarCurr)<->v4301(VarCurr)|v668(VarCurr,bitIndex1)).
% 121.44/120.43  all VarCurr (-v4301(VarCurr)<->v668(VarCurr,bitIndex0)).
% 121.44/120.43  all VarCurr (v4295(VarCurr)<->v4296(VarCurr)&v4298(VarCurr)).
% 121.44/120.43  all VarCurr (v4298(VarCurr)<->v4289(VarCurr)|v4292(VarCurr)).
% 121.44/120.43  all VarCurr (v4296(VarCurr)<->v4297(VarCurr)|v668(VarCurr,bitIndex2)).
% 121.44/120.43  all VarCurr (-v4297(VarCurr)<->v4289(VarCurr)).
% 121.44/120.43  all VarCurr (v4284(VarCurr)<->v4285(VarCurr)&v4293(VarCurr)).
% 121.44/120.43  all VarCurr (v4293(VarCurr)<->v4287(VarCurr)|v4294(VarCurr)).
% 121.44/120.43  all VarCurr (-v4294(VarCurr)<->v668(VarCurr,bitIndex3)).
% 121.44/120.43  all VarCurr (v4285(VarCurr)<->v4286(VarCurr)|v668(VarCurr,bitIndex3)).
% 121.44/120.43  all VarCurr (-v4286(VarCurr)<->v4287(VarCurr)).
% 121.44/120.43  all VarCurr (v4287(VarCurr)<->v668(VarCurr,bitIndex2)|v4288(VarCurr)).
% 121.44/120.43  all VarCurr (v4288(VarCurr)<->v4289(VarCurr)&v4292(VarCurr)).
% 121.44/120.43  all VarCurr (-v4292(VarCurr)<->v668(VarCurr,bitIndex2)).
% 121.44/120.43  all VarCurr (v4289(VarCurr)<->v668(VarCurr,bitIndex1)|v4290(VarCurr)).
% 121.44/120.43  all VarCurr (v4290(VarCurr)<->v668(VarCurr,bitIndex0)&v4291(VarCurr)).
% 121.44/120.43  all VarCurr (-v4291(VarCurr)<->v668(VarCurr,bitIndex1)).
% 121.44/120.43  v668(constB0,bitIndex3).
% 121.44/120.43  -v668(constB0,bitIndex2).
% 121.44/120.43  -v668(constB0,bitIndex1).
% 121.44/120.43  -v668(constB0,bitIndex0).
% 121.44/120.43  all VarCurr (v4280(VarCurr)<-> (v4281(VarCurr,bitIndex1)<->$F)& (v4281(VarCurr,bitIndex0)<->$T)).
% 121.44/120.43  all VarCurr (v4281(VarCurr,bitIndex0)<->v4263(VarCurr)).
% 121.44/120.43  all VarCurr (v4281(VarCurr,bitIndex1)<->v673(VarCurr)).
% 121.44/120.43  all VarCurr (v4263(VarCurr)<->v664(VarCurr)|v4265(VarCurr)).
% 121.44/120.43  all VarCurr (-v4267(VarCurr)-> (v4265(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v4267(VarCurr)-> (v4265(VarCurr)<->v4276(VarCurr))).
% 121.44/120.43  all VarCurr (-v4269(VarCurr)-> (v4276(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v4269(VarCurr)-> (v4276(VarCurr)<->v4277(VarCurr))).
% 121.44/120.43  all VarCurr (-v666(VarCurr)-> (v4277(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v666(VarCurr)-> (v4277(VarCurr)<->$T)).
% 121.44/120.43  all VarCurr (v4267(VarCurr)<->v4268(VarCurr)&v4273(VarCurr)).
% 121.44/120.43  all VarCurr (-v4273(VarCurr)<->v4274(VarCurr)).
% 121.44/120.43  all VarCurr (v4274(VarCurr)<->v4275(VarCurr)&v666(VarCurr)).
% 121.44/120.43  all VarCurr (-v4275(VarCurr)<->v560(VarCurr)).
% 121.44/120.43  all VarCurr (v4268(VarCurr)<->v4269(VarCurr)|v4272(VarCurr)).
% 121.44/120.43  all VarCurr (-v4272(VarCurr)<->v4271(VarCurr)).
% 121.44/120.43  all VarCurr (v4269(VarCurr)<->v4270(VarCurr)&v4271(VarCurr)).
% 121.44/120.43  all VarCurr (-v4271(VarCurr)<->v39(VarCurr)).
% 121.44/120.43  all VarCurr (v4270(VarCurr)<-> (v37(VarCurr,bitIndex1)<->$F)& (v37(VarCurr,bitIndex0)<->$T)).
% 121.44/120.43  all VarCurr (v673(VarCurr)<->v675(VarCurr)).
% 121.44/120.43  all VarCurr (v675(VarCurr)<->v677(VarCurr)).
% 121.44/120.43  all VarCurr (v677(VarCurr)<->v679(VarCurr)).
% 121.44/120.43  all VarCurr (-v4258(VarCurr)-> (v679(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v4258(VarCurr)-> (v679(VarCurr)<->v4261(VarCurr))).
% 121.44/120.43  all VarCurr (-v1214(VarCurr)-> (v4261(VarCurr)<->$F)).
% 121.44/120.43  all VarCurr (v1214(VarCurr)-> (v4261(VarCurr)<->$T)).
% 121.44/120.43  all VarCurr (v4258(VarCurr)<->v1214(VarCurr)|v4259(VarCurr)).
% 121.44/120.43  all VarCurr (-v4259(VarCurr)<->v4260(VarCurr)).
% 121.44/120.43  all VarCurr (v4260(VarCurr)<->v1210(VarCurr)|v1214(VarCurr)).
% 121.44/120.43  all VarCurr (v681(VarCurr)<->v683(VarCurr)).
% 121.44/120.43  all VarCurr (v683(VarCurr)<->v685(VarCurr)).
% 121.44/120.43  all VarCurr (v685(VarCurr)<->v687(VarCurr)).
% 121.44/120.43  all VarCurr (v687(VarCurr)<->v689(VarCurr)).
% 121.44/120.43  all VarCurr (v689(VarCurr)<->v691(VarCurr)).
% 121.44/120.43  all VarCurr (v691(VarCurr)<->v693(VarCurr)).
% 121.44/120.43  all VarCurr (v693(VarCurr)<->v695(VarCurr)).
% 121.44/120.43  all VarCurr (v695(VarCurr)<->v697(VarCurr)).
% 121.44/120.43  all VarCurr (v697(VarCurr)<->v699(VarCurr)).
% 121.44/120.43  all VarCurr (v699(VarCurr)<->v701(VarCurr)|v4186(VarCurr)).
% 121.44/120.44  all VarCurr (v4186(VarCurr)<->v4188(VarCurr,bitIndex6)).
% 121.44/120.44  all VarCurr (v4188(VarCurr,bitIndex6)<->v4190(VarCurr,bitIndex6)).
% 121.44/120.44  all VarCurr (v4190(VarCurr,bitIndex6)<->v4192(VarCurr,bitIndex6)).
% 121.44/120.44  all VarNext (v4192(VarNext,bitIndex6)<->v4249(VarNext,bitIndex6)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4250(VarNext)-> (all B (range_7_0(B)-> (v4249(VarNext,B)<->v4192(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4250(VarNext)-> (all B (range_7_0(B)-> (v4249(VarNext,B)<->v4236(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4250(VarNext)<->v4251(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4251(VarNext)<->v4253(VarNext)&v4223(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4253(VarNext)<->v4230(VarNext))).
% 121.44/120.44  all VarCurr (v4196(VarCurr,bitIndex6)<->v4209(VarCurr,bitIndex6)).
% 121.44/120.44  all VarNext (v4192(VarNext,bitIndex5)<->v4241(VarNext,bitIndex5)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4242(VarNext)-> (all B (range_7_0(B)-> (v4241(VarNext,B)<->v4192(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4242(VarNext)-> (all B (range_7_0(B)-> (v4241(VarNext,B)<->v4236(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4242(VarNext)<->v4243(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4243(VarNext)<->v4245(VarNext)&v4223(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4245(VarNext)<->v4230(VarNext))).
% 121.44/120.44  all VarCurr (v4196(VarCurr,bitIndex5)<->v4209(VarCurr,bitIndex5)).
% 121.44/120.44  all VarNext (v4192(VarNext,bitIndex7)<->v4225(VarNext,bitIndex7)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4226(VarNext)-> (all B (range_7_0(B)-> (v4225(VarNext,B)<->v4192(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4226(VarNext)-> (all B (range_7_0(B)-> (v4225(VarNext,B)<->v4236(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_7_0(B)-> (v4236(VarNext,B)<->v4234(VarCurr,B))))).
% 121.44/120.44  all VarCurr (-v4237(VarCurr)-> (all B (range_7_0(B)-> (v4234(VarCurr,B)<->v4196(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4237(VarCurr)-> (all B (range_7_0(B)-> (v4234(VarCurr,B)<->$F)))).
% 121.44/120.44  all VarCurr (-v4237(VarCurr)<->v4194(VarCurr)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4226(VarNext)<->v4227(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4227(VarNext)<->v4228(VarNext)&v4223(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4228(VarNext)<->v4230(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4230(VarNext)<->v4223(VarCurr))).
% 121.44/120.44  all VarCurr (v4223(VarCurr)<->v2549(VarCurr)).
% 121.44/120.44  all VarCurr (v4196(VarCurr,bitIndex7)<->v4209(VarCurr,bitIndex7)).
% 121.44/120.44  all VarCurr (-v4210(VarCurr)& -v4214(VarCurr)& -v4217(VarCurr)-> (all B (range_7_0(B)-> (v4209(VarCurr,B)<->v4192(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4217(VarCurr)-> (all B (range_7_0(B)-> (v4209(VarCurr,B)<->v4219(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4214(VarCurr)-> (all B (range_7_0(B)-> (v4209(VarCurr,B)<->v4216(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4210(VarCurr)-> (all B (range_7_0(B)-> (v4209(VarCurr,B)<->v4192(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4220(VarCurr)<-> (v4221(VarCurr,bitIndex1)<->$T)& (v4221(VarCurr,bitIndex0)<->$T)).
% 121.44/120.44  all VarCurr (v4221(VarCurr,bitIndex0)<->v4204(VarCurr)).
% 121.44/120.44  all VarCurr (v4221(VarCurr,bitIndex1)<->v4198(VarCurr)).
% 121.44/120.44  all VarCurr (v4219(VarCurr,bitIndex0)<->$T).
% 121.44/120.44  all VarCurr ((v4219(VarCurr,bitIndex7)<->v4192(VarCurr,bitIndex6))& (v4219(VarCurr,bitIndex6)<->v4192(VarCurr,bitIndex5))& (v4219(VarCurr,bitIndex5)<->v4192(VarCurr,bitIndex4))& (v4219(VarCurr,bitIndex4)<->v4192(VarCurr,bitIndex3))& (v4219(VarCurr,bitIndex3)<->v4192(VarCurr,bitIndex2))& (v4219(VarCurr,bitIndex2)<->v4192(VarCurr,bitIndex1))& (v4219(VarCurr,bitIndex1)<->v4192(VarCurr,bitIndex0))).
% 121.44/120.44  all VarCurr (v4217(VarCurr)<-> (v4218(VarCurr,bitIndex1)<->$T)& (v4218(VarCurr,bitIndex0)<->$F)).
% 121.44/120.44  all VarCurr (v4218(VarCurr,bitIndex0)<->v4204(VarCurr)).
% 121.44/120.44  all VarCurr (v4218(VarCurr,bitIndex1)<->v4198(VarCurr)).
% 121.44/120.44  all VarCurr ((v4216(VarCurr,bitIndex6)<->v4192(VarCurr,bitIndex7))& (v4216(VarCurr,bitIndex5)<->v4192(VarCurr,bitIndex6))& (v4216(VarCurr,bitIndex4)<->v4192(VarCurr,bitIndex5))& (v4216(VarCurr,bitIndex3)<->v4192(VarCurr,bitIndex4))& (v4216(VarCurr,bitIndex2)<->v4192(VarCurr,bitIndex3))& (v4216(VarCurr,bitIndex1)<->v4192(VarCurr,bitIndex2))& (v4216(VarCurr,bitIndex0)<->v4192(VarCurr,bitIndex1))).
% 121.44/120.44  all VarCurr (v4216(VarCurr,bitIndex7)<->$F).
% 121.44/120.44  all VarCurr (v4214(VarCurr)<-> (v4215(VarCurr,bitIndex1)<->$F)& (v4215(VarCurr,bitIndex0)<->$T)).
% 121.44/120.44  all VarCurr (v4215(VarCurr,bitIndex0)<->v4204(VarCurr)).
% 121.44/120.44  all VarCurr (v4215(VarCurr,bitIndex1)<->v4198(VarCurr)).
% 121.44/120.44  -v4192(constB0,bitIndex7).
% 121.44/120.44  -v4192(constB0,bitIndex6).
% 121.44/120.44  -v4192(constB0,bitIndex5).
% 121.44/120.44  -v4192(constB0,bitIndex4).
% 121.44/120.44  -b0000xxxx(bitIndex7).
% 121.44/120.44  -b0000xxxx(bitIndex6).
% 121.44/120.44  -b0000xxxx(bitIndex5).
% 121.44/120.44  -b0000xxxx(bitIndex4).
% 121.44/120.44  all VarCurr (v4210(VarCurr)<-> (v4211(VarCurr,bitIndex1)<->$F)& (v4211(VarCurr,bitIndex0)<->$F)).
% 121.44/120.44  all VarCurr (v4211(VarCurr,bitIndex0)<->v4204(VarCurr)).
% 121.44/120.44  all VarCurr (v4211(VarCurr,bitIndex1)<->v4198(VarCurr)).
% 121.44/120.44  all VarCurr (v4204(VarCurr)<->v4206(VarCurr)).
% 121.44/120.44  all VarCurr (v4206(VarCurr)<->v4208(VarCurr)).
% 121.44/120.44  all VarCurr (v4208(VarCurr)<->v3263(VarCurr)).
% 121.44/120.44  all VarCurr (v4198(VarCurr)<->v4200(VarCurr)).
% 121.44/120.44  all VarCurr (v4200(VarCurr)<->v4202(VarCurr)).
% 121.44/120.44  all VarCurr (v4202(VarCurr)<->v723(VarCurr)).
% 121.44/120.44  all VarCurr (v4194(VarCurr)<->v711(VarCurr)).
% 121.44/120.44  all VarCurr (v701(VarCurr)<->v703(VarCurr,bitIndex2)).
% 121.44/120.44  all VarCurr (v703(VarCurr,bitIndex2)<->v705(VarCurr,bitIndex2)).
% 121.44/120.44  all VarCurr (v705(VarCurr,bitIndex2)<->v707(VarCurr,bitIndex2)).
% 121.44/120.44  all VarNext (v707(VarNext,bitIndex2)<->v4178(VarNext,bitIndex2)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4179(VarNext)-> (all B (range_3_0(B)-> (v4178(VarNext,B)<->v707(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4179(VarNext)-> (all B (range_3_0(B)-> (v4178(VarNext,B)<->v2564(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4179(VarNext)<->v4180(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4180(VarNext)<->v4182(VarNext)&v2547(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4182(VarNext)<->v2558(VarNext))).
% 121.44/120.44  all VarCurr (v715(VarCurr,bitIndex2)<->v2535(VarCurr,bitIndex2)).
% 121.44/120.44  all VarNext (v707(VarNext,bitIndex3)<->v4170(VarNext,bitIndex3)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4171(VarNext)-> (all B (range_3_0(B)-> (v4170(VarNext,B)<->v707(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4171(VarNext)-> (all B (range_3_0(B)-> (v4170(VarNext,B)<->v2564(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4171(VarNext)<->v4172(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4172(VarNext)<->v4174(VarNext)&v2547(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4174(VarNext)<->v2558(VarNext))).
% 121.44/120.44  all VarCurr (v715(VarCurr,bitIndex3)<->v2535(VarCurr,bitIndex3)).
% 121.44/120.44  all VarCurr (v2530(VarCurr)<->v2532(VarCurr)).
% 121.44/120.44  all VarCurr (v2532(VarCurr)<->v2534(VarCurr)).
% 121.44/120.44  all VarCurr (v2534(VarCurr)<->v703(VarCurr,bitIndex0)&v4168(VarCurr)).
% 121.44/120.44  all VarCurr (-v4168(VarCurr)<->v2577(VarCurr)).
% 121.44/120.44  all VarCurr (v2577(VarCurr)<->v2579(VarCurr)).
% 121.44/120.44  all VarCurr (v2579(VarCurr)<->v2581(VarCurr)).
% 121.44/120.44  all VarCurr (v2581(VarCurr)<->v2583(VarCurr)).
% 121.44/120.44  all VarCurr (v2583(VarCurr)<->v2585(VarCurr)).
% 121.44/120.44  all VarCurr (v2585(VarCurr)<->v4164(VarCurr)|v4165(VarCurr)).
% 121.44/120.44  all VarCurr (-v4165(VarCurr)<->v4166(VarCurr)).
% 121.44/120.44  all VarCurr (v4166(VarCurr)<->v2992(VarCurr)&v3153(VarCurr)).
% 121.44/120.44  all VarCurr (v4164(VarCurr)<->v2587(VarCurr,bitIndex0)|v2810(VarCurr)).
% 121.44/120.44  all VarCurr (v2587(VarCurr,bitIndex0)<->v2770(VarCurr,bitIndex0)).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4152(VarNext)-> (all B (range_1_0(B)-> (v2589(VarNext,B)<->v2589(VarCurr,B)))))).
% 121.44/120.44  all VarNext (v4152(VarNext)-> (all B (range_1_0(B)-> (v2589(VarNext,B)<->v4160(VarNext,B))))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_1_0(B)-> (v4160(VarNext,B)<->v4158(VarCurr,B))))).
% 121.44/120.44  all VarCurr (-v4079(VarCurr)-> (all B (range_1_0(B)-> (v4158(VarCurr,B)<->v2595(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4079(VarCurr)-> (all B (range_1_0(B)-> (v4158(VarCurr,B)<->$F)))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4152(VarNext)<->v4153(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4153(VarNext)<->v4155(VarNext)&v3155(VarNext))).
% 121.44/120.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4155(VarNext)<->v3164(VarNext))).
% 121.44/120.44  all VarCurr (-v4110(VarCurr)& -v4120(VarCurr)& -v4130(VarCurr)-> (all B (range_1_0(B)-> (v2595(VarCurr,B)<->v4141(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4130(VarCurr)-> (all B (range_1_0(B)-> (v2595(VarCurr,B)<->v4131(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4120(VarCurr)-> (all B (range_1_0(B)-> (v2595(VarCurr,B)<->v4121(VarCurr,B))))).
% 121.44/120.44  all VarCurr (v4110(VarCurr)-> (all B (range_1_0(B)-> (v2595(VarCurr,B)<->v4111(VarCurr,B))))).
% 121.44/120.44  all VarCurr (-v4142(VarCurr)& -v4144(VarCurr)& -v4146(VarCurr)-> (all B (range_1_0(B)-> (v4141(VarCurr,B)<->b10(B))))).
% 121.44/120.44  all VarCurr (v4146(VarCurr)-> (all B (range_1_0(B)-> (v4141(VarCurr,B)<->$T)))).
% 121.44/120.44  all VarCurr (v4144(VarCurr)-> (all B (range_1_0(B)-> (v4141(VarCurr,B)<->$F)))).
% 121.44/120.44  all VarCurr (v4142(VarCurr)-> (all B (range_1_0(B)-> (v4141(VarCurr,B)<->b01(B))))).
% 121.44/120.44  all VarCurr (v4148(VarCurr)<-> (v4149(VarCurr,bitIndex1)<->$T)& (v4149(VarCurr,bitIndex0)<->$T)).
% 121.44/120.44  all VarCurr (v4149(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.44  all VarCurr (v4149(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.44  all VarCurr (v4146(VarCurr)<-> (v4147(VarCurr,bitIndex1)<->$T)& (v4147(VarCurr,bitIndex0)<->$F)).
% 121.44/120.44  all VarCurr (v4147(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.44  all VarCurr (v4147(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.44  all VarCurr (v4144(VarCurr)<-> (v4145(VarCurr,bitIndex1)<->$F)& (v4145(VarCurr,bitIndex0)<->$T)).
% 121.44/120.44  all VarCurr (v4145(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.44  all VarCurr (v4145(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.44  all VarCurr (v4142(VarCurr)<-> (v4143(VarCurr,bitIndex1)<->$F)& (v4143(VarCurr,bitIndex0)<->$F)).
% 121.44/120.44  all VarCurr (v4143(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.44  all VarCurr (v4143(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.44  all VarCurr (v4140(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$T)& (v2589(VarCurr,bitIndex0)<->$T)).
% 121.44/120.44  all VarCurr (-v4132(VarCurr)& -v4134(VarCurr)& -v4136(VarCurr)-> (all B (range_1_0(B)-> (v4131(VarCurr,B)<->$T)))).
% 121.44/120.44  all VarCurr (v4136(VarCurr)-> (all B (range_1_0(B)-> (v4131(VarCurr,B)<->b10(B))))).
% 121.44/120.44  all VarCurr (v4134(VarCurr)-> (all B (range_1_0(B)-> (v4131(VarCurr,B)<->b01(B))))).
% 121.44/120.45  all VarCurr (v4132(VarCurr)-> (all B (range_1_0(B)-> (v4131(VarCurr,B)<->$F)))).
% 121.44/120.45  all VarCurr (v4138(VarCurr)<-> (v4139(VarCurr,bitIndex1)<->$T)& (v4139(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4139(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4139(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4136(VarCurr)<-> (v4137(VarCurr,bitIndex1)<->$T)& (v4137(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4137(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4137(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4134(VarCurr)<-> (v4135(VarCurr,bitIndex1)<->$F)& (v4135(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4135(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4135(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4132(VarCurr)<-> (v4133(VarCurr,bitIndex1)<->$F)& (v4133(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4133(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4133(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4130(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$T)& (v2589(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (-v4122(VarCurr)& -v4124(VarCurr)& -v4126(VarCurr)-> (all B (range_1_0(B)-> (v4121(VarCurr,B)<->b10(B))))).
% 121.44/120.45  all VarCurr (v4126(VarCurr)-> (all B (range_1_0(B)-> (v4121(VarCurr,B)<->$T)))).
% 121.44/120.45  all VarCurr (v4124(VarCurr)-> (all B (range_1_0(B)-> (v4121(VarCurr,B)<->$F)))).
% 121.44/120.45  all VarCurr (v4122(VarCurr)-> (all B (range_1_0(B)-> (v4121(VarCurr,B)<->b01(B))))).
% 121.44/120.45  all VarCurr (v4128(VarCurr)<-> (v4129(VarCurr,bitIndex1)<->$T)& (v4129(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4129(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.45  all VarCurr (v4129(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4126(VarCurr)<-> (v4127(VarCurr,bitIndex1)<->$T)& (v4127(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4127(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.45  all VarCurr (v4127(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4124(VarCurr)<-> (v4125(VarCurr,bitIndex1)<->$F)& (v4125(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4125(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.45  all VarCurr (v4125(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4122(VarCurr)<-> (v4123(VarCurr,bitIndex1)<->$F)& (v4123(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4123(VarCurr,bitIndex0)<->v4083(VarCurr)).
% 121.44/120.45  all VarCurr (v4123(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4120(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$F)& (v2589(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (-v4112(VarCurr)& -v4114(VarCurr)& -v4116(VarCurr)-> (all B (range_1_0(B)-> (v4111(VarCurr,B)<->$T)))).
% 121.44/120.45  all VarCurr (v4116(VarCurr)-> (all B (range_1_0(B)-> (v4111(VarCurr,B)<->b10(B))))).
% 121.44/120.45  all VarCurr (v4114(VarCurr)-> (all B (range_1_0(B)-> (v4111(VarCurr,B)<->b01(B))))).
% 121.44/120.45  all VarCurr (v4112(VarCurr)-> (all B (range_1_0(B)-> (v4111(VarCurr,B)<->$F)))).
% 121.44/120.45  all VarCurr (v4118(VarCurr)<-> (v4119(VarCurr,bitIndex1)<->$T)& (v4119(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4119(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4119(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4116(VarCurr)<-> (v4117(VarCurr,bitIndex1)<->$T)& (v4117(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4117(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4117(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4114(VarCurr)<-> (v4115(VarCurr,bitIndex1)<->$F)& (v4115(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4115(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4115(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4112(VarCurr)<-> (v4113(VarCurr,bitIndex1)<->$F)& (v4113(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4113(VarCurr,bitIndex0)<->v3359(VarCurr)).
% 121.44/120.45  all VarCurr (v4113(VarCurr,bitIndex1)<->v2597(VarCurr)).
% 121.44/120.45  all VarCurr (v4110(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$F)& (v2589(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4098(VarNext)-> (v4083(VarNext)<->v4083(VarCurr)))).
% 121.44/120.45  all VarNext (v4098(VarNext)-> (v4083(VarNext)<->v4106(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4106(VarNext)<->v4104(VarCurr))).
% 121.44/120.45  all VarCurr (-v4079(VarCurr)-> (v4104(VarCurr)<->v4085(VarCurr))).
% 121.44/120.45  all VarCurr (v4079(VarCurr)-> (v4104(VarCurr)<->$F)).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4098(VarNext)<->v4099(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4099(VarNext)<->v4101(VarNext)&v3155(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4101(VarNext)<->v3164(VarNext))).
% 121.44/120.45  v4083(constB0)<->$F.
% 121.44/120.45  all VarCurr (v4085(VarCurr)<->v4087(VarCurr)).
% 121.44/120.45  all VarCurr (v4087(VarCurr)<->v4089(VarCurr)).
% 121.44/120.45  all VarCurr (-v4093(VarCurr)-> (v4089(VarCurr)<->$F)).
% 121.44/120.45  all VarCurr (v4093(VarCurr)-> (v4089(VarCurr)<->$T)).
% 121.44/120.45  all VarCurr (v4093(VarCurr)<->v4094(VarCurr)|v4095(VarCurr)).
% 121.44/120.45  all VarCurr (v4095(VarCurr)<-> (v4091(VarCurr,bitIndex2)<->$T)& (v4091(VarCurr,bitIndex1)<->$T)& (v4091(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4094(VarCurr)<-> (v4091(VarCurr,bitIndex2)<->$T)& (v4091(VarCurr,bitIndex1)<->$T)& (v4091(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all B (range_2_0(B)-> (v4091(constB0,B)<->$F)).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4070(VarNext)-> (v3359(VarNext)<->v3359(VarCurr)))).
% 121.44/120.45  all VarNext (v4070(VarNext)-> (v3359(VarNext)<->v4078(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4078(VarNext)<->v4076(VarCurr))).
% 121.44/120.45  all VarCurr (-v4079(VarCurr)-> (v4076(VarCurr)<->v3361(VarCurr))).
% 121.44/120.45  all VarCurr (v4079(VarCurr)-> (v4076(VarCurr)<->$F)).
% 121.44/120.45  all VarCurr (-v4079(VarCurr)<->v2591(VarCurr)).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4070(VarNext)<->v4071(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v4071(VarNext)<->v4072(VarNext)&v3155(VarNext))).
% 121.44/120.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v4072(VarNext)<->v3164(VarNext))).
% 121.44/120.45  v3359(constB0)<->$F.
% 121.44/120.45  all VarCurr (-v4058(VarCurr)& -v4061(VarCurr)-> (v3361(VarCurr)<->v4065(VarCurr))).
% 121.44/120.45  all VarCurr (v4061(VarCurr)-> (v3361(VarCurr)<->$F)).
% 121.44/120.45  all VarCurr (v4058(VarCurr)-> (v3361(VarCurr)<->v4059(VarCurr))).
% 121.44/120.45  all VarCurr (v4065(VarCurr)<->v3387(VarCurr)&v4066(VarCurr)).
% 121.44/120.45  all VarCurr (-v4066(VarCurr)<->v4067(VarCurr)).
% 121.44/120.45  all VarCurr (v4067(VarCurr)<->v3873(VarCurr)|v3963(VarCurr)).
% 121.44/120.45  all VarCurr (v4062(VarCurr)<->v4063(VarCurr)|v4064(VarCurr)).
% 121.44/120.45  all VarCurr (v4064(VarCurr)<-> (v3363(VarCurr,bitIndex1)<->$T)& (v3363(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4063(VarCurr)<-> (v3363(VarCurr,bitIndex1)<->$T)& (v3363(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v4061(VarCurr)<-> (v3363(VarCurr,bitIndex1)<->$F)& (v3363(VarCurr,bitIndex0)<->$T)).
% 121.44/120.45  all VarCurr (v4059(VarCurr)<->v3387(VarCurr)&v4060(VarCurr)).
% 121.44/120.45  all VarCurr (-v4060(VarCurr)<->v3873(VarCurr)).
% 121.44/120.45  all VarCurr (v4058(VarCurr)<-> (v3363(VarCurr,bitIndex1)<->$F)& (v3363(VarCurr,bitIndex0)<->$F)).
% 121.44/120.45  all VarCurr (v3963(VarCurr)<->v3965(VarCurr)).
% 121.44/120.45  all VarCurr (v3965(VarCurr)<->v3967(VarCurr)).
% 121.44/120.45  all VarCurr (v3967(VarCurr)<->v3969(VarCurr)).
% 121.44/120.45  all VarCurr (v3969(VarCurr)<->v3993(VarCurr)|v4026(VarCurr)).
% 121.44/120.45  all VarCurr (v4026(VarCurr)<->v4027(VarCurr)|v4042(VarCurr)).
% 121.44/120.45  all VarCurr (v4042(VarCurr)<->v4043(VarCurr)|v4050(VarCurr)).
% 121.44/120.45  all VarCurr (v4050(VarCurr)<->v4051(VarCurr)|v4054(VarCurr)).
% 121.44/120.45  all VarCurr (v4054(VarCurr)<->v4055(VarCurr)|v4056(VarCurr)).
% 121.44/120.45  all VarCurr (v4056(VarCurr)<->v3998(VarCurr,bitIndex62)|v3998(VarCurr,bitIndex63)).
% 121.44/120.45  all VarCurr (v4055(VarCurr)<->v3998(VarCurr,bitIndex60)|v3998(VarCurr,bitIndex61)).
% 121.44/120.45  all VarCurr (v4051(VarCurr)<->v4052(VarCurr)|v4053(VarCurr)).
% 121.44/120.45  all VarCurr (v4053(VarCurr)<->v3998(VarCurr,bitIndex58)|v3998(VarCurr,bitIndex59)).
% 121.44/120.45  all VarCurr (v4052(VarCurr)<->v3998(VarCurr,bitIndex56)|v3998(VarCurr,bitIndex57)).
% 121.44/120.45  all VarCurr (v4043(VarCurr)<->v4044(VarCurr)|v4047(VarCurr)).
% 121.44/120.45  all VarCurr (v4047(VarCurr)<->v4048(VarCurr)|v4049(VarCurr)).
% 121.44/120.45  all VarCurr (v4049(VarCurr)<->v3998(VarCurr,bitIndex54)|v3998(VarCurr,bitIndex55)).
% 121.44/120.45  all VarCurr (v4048(VarCurr)<->v3998(VarCurr,bitIndex52)|v3998(VarCurr,bitIndex53)).
% 121.44/120.45  all VarCurr (v4044(VarCurr)<->v4045(VarCurr)|v4046(VarCurr)).
% 121.44/120.45  all VarCurr (v4046(VarCurr)<->v3998(VarCurr,bitIndex50)|v3998(VarCurr,bitIndex51)).
% 121.44/120.45  all VarCurr (v4045(VarCurr)<->v3998(VarCurr,bitIndex48)|v3998(VarCurr,bitIndex49)).
% 121.44/120.45  all VarCurr (v4027(VarCurr)<->v4028(VarCurr)|v4035(VarCurr)).
% 121.44/120.45  all VarCurr (v4035(VarCurr)<->v4036(VarCurr)|v4039(VarCurr)).
% 121.44/120.45  all VarCurr (v4039(VarCurr)<->v4040(VarCurr)|v4041(VarCurr)).
% 121.44/120.45  all VarCurr (v4041(VarCurr)<->v3998(VarCurr,bitIndex46)|v3998(VarCurr,bitIndex47)).
% 121.44/120.45  all VarCurr (v4040(VarCurr)<->v3998(VarCurr,bitIndex44)|v3998(VarCurr,bitIndex45)).
% 121.44/120.45  all VarCurr (v4036(VarCurr)<->v4037(VarCurr)|v4038(VarCurr)).
% 121.44/120.45  all VarCurr (v4038(VarCurr)<->v3998(VarCurr,bitIndex42)|v3998(VarCurr,bitIndex43)).
% 121.44/120.45  all VarCurr (v4037(VarCurr)<->v3998(VarCurr,bitIndex40)|v3998(VarCurr,bitIndex41)).
% 121.44/120.45  all VarCurr (v4028(VarCurr)<->v4029(VarCurr)|v4032(VarCurr)).
% 121.44/120.45  all VarCurr (v4032(VarCurr)<->v4033(VarCurr)|v4034(VarCurr)).
% 121.44/120.45  all VarCurr (v4034(VarCurr)<->v3998(VarCurr,bitIndex38)|v3998(VarCurr,bitIndex39)).
% 121.44/120.45  all VarCurr (v4033(VarCurr)<->v3998(VarCurr,bitIndex36)|v3998(VarCurr,bitIndex37)).
% 121.44/120.45  all VarCurr (v4029(VarCurr)<->v4030(VarCurr)|v4031(VarCurr)).
% 121.44/120.45  all VarCurr (v4031(VarCurr)<->v3998(VarCurr,bitIndex34)|v3998(VarCurr,bitIndex35)).
% 121.44/120.45  all VarCurr (v4030(VarCurr)<->v3998(VarCurr,bitIndex32)|v3998(VarCurr,bitIndex33)).
% 121.44/120.45  all VarCurr (v3993(VarCurr)<->v3994(VarCurr)|v4011(VarCurr)).
% 121.44/120.45  all VarCurr (v4011(VarCurr)<->v4012(VarCurr)|v4019(VarCurr)).
% 121.44/120.45  all VarCurr (v4019(VarCurr)<->v4020(VarCurr)|v4023(VarCurr)).
% 121.44/120.45  all VarCurr (v4023(VarCurr)<->v4024(VarCurr)|v4025(VarCurr)).
% 121.44/120.45  all VarCurr (v4025(VarCurr)<->v3998(VarCurr,bitIndex30)|v3998(VarCurr,bitIndex31)).
% 121.44/120.45  all VarCurr (v4024(VarCurr)<->v3998(VarCurr,bitIndex28)|v3998(VarCurr,bitIndex29)).
% 121.44/120.45  all VarCurr (v4020(VarCurr)<->v4021(VarCurr)|v4022(VarCurr)).
% 121.44/120.45  all VarCurr (v4022(VarCurr)<->v3998(VarCurr,bitIndex26)|v3998(VarCurr,bitIndex27)).
% 121.44/120.45  all VarCurr (v4021(VarCurr)<->v3998(VarCurr,bitIndex24)|v3998(VarCurr,bitIndex25)).
% 121.44/120.45  all VarCurr (v4012(VarCurr)<->v4013(VarCurr)|v4016(VarCurr)).
% 121.44/120.45  all VarCurr (v4016(VarCurr)<->v4017(VarCurr)|v4018(VarCurr)).
% 121.44/120.45  all VarCurr (v4018(VarCurr)<->v3998(VarCurr,bitIndex22)|v3998(VarCurr,bitIndex23)).
% 121.44/120.45  all VarCurr (v4017(VarCurr)<->v3998(VarCurr,bitIndex20)|v3998(VarCurr,bitIndex21)).
% 121.44/120.45  all VarCurr (v4013(VarCurr)<->v4014(VarCurr)|v4015(VarCurr)).
% 121.44/120.45  all VarCurr (v4015(VarCurr)<->v3998(VarCurr,bitIndex18)|v3998(VarCurr,bitIndex19)).
% 121.44/120.45  all VarCurr (v4014(VarCurr)<->v3998(VarCurr,bitIndex16)|v3998(VarCurr,bitIndex17)).
% 121.44/120.45  all VarCurr (v3994(VarCurr)<->v3995(VarCurr)|v4004(VarCurr)).
% 121.44/120.46  all VarCurr (v4004(VarCurr)<->v4005(VarCurr)|v4008(VarCurr)).
% 121.44/120.46  all VarCurr (v4008(VarCurr)<->v4009(VarCurr)|v4010(VarCurr)).
% 121.44/120.46  all VarCurr (v4010(VarCurr)<->v3998(VarCurr,bitIndex14)|v3998(VarCurr,bitIndex15)).
% 121.44/120.46  all VarCurr (v4009(VarCurr)<->v3998(VarCurr,bitIndex12)|v3998(VarCurr,bitIndex13)).
% 121.44/120.46  all VarCurr (v4005(VarCurr)<->v4006(VarCurr)|v4007(VarCurr)).
% 121.44/120.46  all VarCurr (v4007(VarCurr)<->v3998(VarCurr,bitIndex10)|v3998(VarCurr,bitIndex11)).
% 121.44/120.46  all VarCurr (v4006(VarCurr)<->v3998(VarCurr,bitIndex8)|v3998(VarCurr,bitIndex9)).
% 121.44/120.46  all VarCurr (v3995(VarCurr)<->v3996(VarCurr)|v4001(VarCurr)).
% 121.44/120.46  all VarCurr (v4001(VarCurr)<->v4002(VarCurr)|v4003(VarCurr)).
% 121.44/120.46  all VarCurr (v4003(VarCurr)<->v3998(VarCurr,bitIndex6)|v3998(VarCurr,bitIndex7)).
% 121.44/120.46  all VarCurr (v4002(VarCurr)<->v3998(VarCurr,bitIndex4)|v3998(VarCurr,bitIndex5)).
% 121.44/120.46  all VarCurr (v3996(VarCurr)<->v3997(VarCurr)|v4000(VarCurr)).
% 121.44/120.46  all VarCurr (v4000(VarCurr)<->v3998(VarCurr,bitIndex2)|v3998(VarCurr,bitIndex3)).
% 121.44/120.46  all VarCurr (v3997(VarCurr)<->v3998(VarCurr,bitIndex0)|v3998(VarCurr,bitIndex1)).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3998(VarCurr,B)<->v3971(VarCurr,B)&v3999(VarCurr,B))).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3999(VarCurr,B)<-> -v3990(VarCurr,B))).
% 121.44/120.46  all B (range_63_0(B)-> (v3990(constB0,B)<->$F)).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3971(VarCurr,B)<->v3973(VarCurr,B)&v3987(VarCurr,B))).
% 121.44/120.46  all B (range_63_0(B)-> (v3987(constB0,B)<->$F)).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3973(VarCurr,B)<->v3975(VarCurr,B)&v3977(VarCurr,B))).
% 121.44/120.46  all B (range_63_0(B)-> (v3975(constB0,B)<->$F)).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 121.44/120.46  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3977(VarCurr,B)<->v3979(VarCurr,B))).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3979(VarCurr,B)<->v3981(VarCurr,B))).
% 121.44/120.46  all VarCurr B (range_63_0(B)-> (v3981(VarCurr,B)<->v3983(VarCurr,B))).
% 121.44/120.46  all B (range_63_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B).
% 121.44/120.46  v3983(constB0,bitIndex63)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex62)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex61)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex60)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex59)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex58)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex57)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex56)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex55)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex54)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex53)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex52)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex51)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex50)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex49)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex48)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex47)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex46)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex45)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex44)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex43)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex42)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex41)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex40)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex39)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex38)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex37)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex36)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex35)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex34)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex33)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex32)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex31)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex30)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex29)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex28)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex27)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex26)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex25)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex24)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex23)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex22)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex21)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex20)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex19)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex18)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex17)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex16)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex15)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex14)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex13)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex12)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex11)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex10)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex9)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex8)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex7)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex6)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex5)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex4)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex3)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex2)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex1)<->$F.
% 121.44/120.46  v3983(constB0,bitIndex0)<->$F.
% 121.44/120.46  all VarCurr (v3873(VarCurr)<->v3875(VarCurr)&v3954(VarCurr)).
% 121.44/120.46  all VarCurr (v3954(VarCurr)<->v3956(VarCurr)).
% 121.44/120.46  all VarCurr (v3956(VarCurr)<->v3958(VarCurr)).
% 121.44/120.46  all VarCurr (v3958(VarCurr)<->v3960(VarCurr)).
% 121.44/120.46  v3960(constB0)<->$F.
% 121.44/120.46  all VarCurr (v3875(VarCurr)<->v3877(VarCurr)).
% 121.44/120.46  all VarCurr (v3877(VarCurr)<->v3879(VarCurr)).
% 121.44/120.46  all VarCurr (v3879(VarCurr)<->v3881(VarCurr)).
% 121.44/120.46  all VarCurr (-v3883(VarCurr)-> (v3881(VarCurr)<->v3952(VarCurr))).
% 121.44/120.46  all VarCurr (v3883(VarCurr)-> (v3881(VarCurr)<->v3948(VarCurr))).
% 121.44/120.46  all VarCurr (v3952(VarCurr)<-> (v3885(VarCurr,bitIndex23)<->v3887(VarCurr,bitIndex23))& (v3885(VarCurr,bitIndex22)<->v3887(VarCurr,bitIndex22))& (v3885(VarCurr,bitIndex21)<->v3887(VarCurr,bitIndex21))& (v3885(VarCurr,bitIndex20)<->v3887(VarCurr,bitIndex20))& (v3885(VarCurr,bitIndex19)<->v3887(VarCurr,bitIndex19))& (v3885(VarCurr,bitIndex18)<->v3887(VarCurr,bitIndex18))& (v3885(VarCurr,bitIndex17)<->v3887(VarCurr,bitIndex17))& (v3885(VarCurr,bitIndex16)<->v3887(VarCurr,bitIndex16))& (v3885(VarCurr,bitIndex15)<->v3887(VarCurr,bitIndex15))& (v3885(VarCurr,bitIndex14)<->v3887(VarCurr,bitIndex14))& (v3885(VarCurr,bitIndex13)<->v3887(VarCurr,bitIndex13))& (v3885(VarCurr,bitIndex12)<->v3887(VarCurr,bitIndex12))& (v3885(VarCurr,bitIndex11)<->v3887(VarCurr,bitIndex11))& (v3885(VarCurr,bitIndex10)<->v3887(VarCurr,bitIndex10))& (v3885(VarCurr,bitIndex9)<->v3887(VarCurr,bitIndex9))& (v3885(VarCurr,bitIndex8)<->v3887(VarCurr,bitIndex8))& (v3885(VarCurr,bitIndex7)<->v3887(VarCurr,bitIndex7))& (v3885(VarCurr,bitIndex6)<->v3887(VarCurr,bitIndex6))& (v3885(VarCurr,bitIndex5)<->v3887(VarCurr,bitIndex5))& (v3885(VarCurr,bitIndex4)<->v3887(VarCurr,bitIndex4))& (v3885(VarCurr,bitIndex3)<->v3887(VarCurr,bitIndex3))& (v3885(VarCurr,bitIndex2)<->v3887(VarCurr,bitIndex2))& (v3885(VarCurr,bitIndex1)<->v3887(VarCurr,bitIndex1))& (v3885(VarCurr,bitIndex0)<->v3887(VarCurr,bitIndex0))).
% 121.44/120.46  all VarCurr (v3948(VarCurr)<->v3949(VarCurr)&v3951(VarCurr)).
% 121.44/120.46  all VarCurr (v3951(VarCurr)<-> (v3936(VarCurr,bitIndex4)<->v3938(VarCurr,bitIndex4))& (v3936(VarCurr,bitIndex3)<->v3938(VarCurr,bitIndex3))& (v3936(VarCurr,bitIndex2)<->v3938(VarCurr,bitIndex2))& (v3936(VarCurr,bitIndex1)<->v3938(VarCurr,bitIndex1))& (v3936(VarCurr,bitIndex0)<->v3938(VarCurr,bitIndex0))).
% 121.44/120.46  all B (range_4_0(B)-> (v3936(constB0,B)<->$F)).
% 121.44/120.46  all VarCurr (v3949(VarCurr)<-> (v3885(VarCurr,bitIndex23)<->v3887(VarCurr,bitIndex23))& (v3885(VarCurr,bitIndex22)<->v3887(VarCurr,bitIndex22))& (v3885(VarCurr,bitIndex21)<->v3887(VarCurr,bitIndex21))& (v3885(VarCurr,bitIndex20)<->v3887(VarCurr,bitIndex20))& (v3885(VarCurr,bitIndex19)<->v3887(VarCurr,bitIndex19))& (v3885(VarCurr,bitIndex18)<->v3887(VarCurr,bitIndex18))& (v3885(VarCurr,bitIndex17)<->v3887(VarCurr,bitIndex17))& (v3885(VarCurr,bitIndex16)<->v3887(VarCurr,bitIndex16))& (v3885(VarCurr,bitIndex15)<->v3887(VarCurr,bitIndex15))& (v3885(VarCurr,bitIndex14)<->v3887(VarCurr,bitIndex14))& (v3885(VarCurr,bitIndex13)<->v3887(VarCurr,bitIndex13))& (v3885(VarCurr,bitIndex12)<->v3887(VarCurr,bitIndex12))& (v3885(VarCurr,bitIndex11)<->v3887(VarCurr,bitIndex11))& (v3885(VarCurr,bitIndex10)<->v3887(VarCurr,bitIndex10))& (v3885(VarCurr,bitIndex9)<->v3887(VarCurr,bitIndex9))& (v3885(VarCurr,bitIndex8)<->v3887(VarCurr,bitIndex8))& (v3885(VarCurr,bitIndex7)<->v3887(VarCurr,bitIndex7))& (v3885(VarCurr,bitIndex6)<->v3887(VarCurr,bitIndex6))& (v3885(VarCurr,bitIndex5)<->v3887(VarCurr,bitIndex5))& (v3885(VarCurr,bitIndex4)<->v3887(VarCurr,bitIndex4))& (v3885(VarCurr,bitIndex3)<->v3887(VarCurr,bitIndex3))& (v3885(VarCurr,bitIndex2)<->v3887(VarCurr,bitIndex2))& (v3885(VarCurr,bitIndex1)<->v3887(VarCurr,bitIndex1))& (v3885(VarCurr,bitIndex0)<->v3887(VarCurr,bitIndex0))).
% 121.44/120.47  all B (range_23_0(B)-> (v3885(constB0,B)<->$F)).
% 121.44/120.47  -b000000000000000000000000(bitIndex23).
% 121.44/120.47  -b000000000000000000000000(bitIndex22).
% 121.44/120.47  -b000000000000000000000000(bitIndex21).
% 121.44/120.47  -b000000000000000000000000(bitIndex20).
% 121.44/120.47  -b000000000000000000000000(bitIndex19).
% 121.44/120.47  -b000000000000000000000000(bitIndex18).
% 121.44/120.47  -b000000000000000000000000(bitIndex17).
% 121.44/120.47  -b000000000000000000000000(bitIndex16).
% 121.44/120.47  -b000000000000000000000000(bitIndex15).
% 121.44/120.47  -b000000000000000000000000(bitIndex14).
% 121.44/120.47  -b000000000000000000000000(bitIndex13).
% 121.44/120.47  -b000000000000000000000000(bitIndex12).
% 121.44/120.47  -b000000000000000000000000(bitIndex11).
% 121.44/120.47  -b000000000000000000000000(bitIndex10).
% 121.44/120.47  -b000000000000000000000000(bitIndex9).
% 121.44/120.47  -b000000000000000000000000(bitIndex8).
% 121.44/120.47  -b000000000000000000000000(bitIndex7).
% 121.44/120.47  -b000000000000000000000000(bitIndex6).
% 121.44/120.47  -b000000000000000000000000(bitIndex5).
% 121.44/120.47  -b000000000000000000000000(bitIndex4).
% 121.44/120.47  -b000000000000000000000000(bitIndex3).
% 121.44/120.47  -b000000000000000000000000(bitIndex2).
% 121.44/120.47  -b000000000000000000000000(bitIndex1).
% 121.44/120.47  -b000000000000000000000000(bitIndex0).
% 121.44/120.47  all VarCurr B (range_4_0(B)-> (v3938(VarCurr,B)<->v3940(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_4_0(B)-> (v3940(VarCurr,B)<->v3942(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_4_0(B)-> (v3942(VarCurr,B)<->v3944(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_4_0(B)-> (v3944(VarCurr,B)<->v3946(VarCurr,B))).
% 121.44/120.47  all B (range_4_0(B)-> (v3946(constB0,B)<->$F)).
% 121.44/120.47  all VarCurr (-v3932(VarCurr)-> (all B (range_23_3(B)-> (v3887(VarCurr,B)<->v3905(VarCurr,B))))).
% 121.44/120.47  all VarCurr (v3932(VarCurr)-> (v3887(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3887(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3887(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3887(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3887(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3887(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3887(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3887(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3887(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3887(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3887(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3887(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3887(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3887(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3887(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3887(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3887(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3887(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3887(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3887(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3887(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))).
% 121.44/120.47  all VarCurr (-v3932(VarCurr)-> (all B (range_2_0(B)-> (v3887(VarCurr,B)<->v3905(VarCurr,B))))).
% 121.44/120.47  all VarCurr (v3932(VarCurr)-> (all B (range_2_0(B)-> (v3887(VarCurr,B)<->v3933(VarCurr,B))))).
% 121.44/120.47  all VarCurr (-v3889(VarCurr)-> (v3933(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3933(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3933(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.47  all VarCurr (v3889(VarCurr)-> (all B (range_2_0(B)-> (v3933(VarCurr,B)<->$F)))).
% 121.44/120.47  all VarCurr (-v3932(VarCurr)<->v3883(VarCurr)).
% 121.44/120.47  all VarCurr B (range_23_3(B)-> (v3905(VarCurr,B)<->v3918(VarCurr,B))).
% 121.44/120.47  all B (range_23_3(B)<->bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B).
% 121.44/120.47  all VarCurr B (range_37_17(B)-> (v3897(VarCurr,B)<->v3899(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_37_17(B)-> (v3899(VarCurr,B)<->v3901(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_37_17(B)-> (v3901(VarCurr,B)<->v3903(VarCurr,B))).
% 121.44/120.47  all VarCurr B (range_37_17(B)-> (v3903(VarCurr,B)<->v3474(VarCurr,B))).
% 121.44/120.47  all B (range_37_17(B)<->bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B).
% 121.44/120.47  all VarCurr B (range_2_0(B)-> (v3905(VarCurr,B)<->v3918(VarCurr,B))).
% 121.44/120.47  all VarCurr (-v3919(VarCurr)& -v3920(VarCurr)& -v3922(VarCurr)& -v3923(VarCurr)& -v3925(VarCurr)& -v3926(VarCurr)& -v3928(VarCurr)-> (v3918(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3918(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3918(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3918(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3918(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3918(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3918(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3918(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3918(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3918(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3918(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3918(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3918(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3918(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3918(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3918(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3918(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3918(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3918(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3918(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3918(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))& (v3918(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3918(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3918(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.47  all VarCurr (v3928(VarCurr)-> (v3918(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3918(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3918(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3918(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3918(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3918(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3918(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3918(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3918(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3918(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3918(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3918(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3918(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3918(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3918(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3918(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3918(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3918(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3918(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3918(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3918(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))& (v3918(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3918(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3918(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.48  all VarCurr (v3926(VarCurr)-> (all B (range_23_0(B)-> (v3918(VarCurr,B)<->v3927(VarCurr,B))))).
% 121.44/120.48  all VarCurr (v3925(VarCurr)-> (v3918(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3918(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3918(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3918(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3918(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3918(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3918(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3918(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3918(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3918(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3918(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3918(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3918(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3918(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3918(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3918(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3918(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3918(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3918(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3918(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3918(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))& (v3918(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3918(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3918(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.48  all VarCurr (v3923(VarCurr)-> (all B (range_23_0(B)-> (v3918(VarCurr,B)<->v3924(VarCurr,B))))).
% 121.44/120.48  all VarCurr (v3922(VarCurr)-> (v3918(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3918(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3918(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3918(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3918(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3918(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3918(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3918(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3918(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3918(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3918(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3918(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3918(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3918(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3918(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3918(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3918(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3918(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3918(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3918(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3918(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))& (v3918(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3918(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3918(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.48  all VarCurr (v3920(VarCurr)-> (all B (range_23_0(B)-> (v3918(VarCurr,B)<->v3921(VarCurr,B))))).
% 121.44/120.48  all B (range_23_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B).
% 121.44/120.48  all VarCurr (v3919(VarCurr)-> (v3918(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3918(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3918(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3918(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3918(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3918(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3918(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3918(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3918(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3918(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3918(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3918(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3918(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3918(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3918(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3918(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3918(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3918(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3918(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3918(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3918(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))& (v3918(VarCurr,bitIndex2)<->v3897(VarCurr,bitIndex16))& (v3918(VarCurr,bitIndex1)<->v3897(VarCurr,bitIndex15))& (v3918(VarCurr,bitIndex0)<->v3897(VarCurr,bitIndex14))).
% 121.44/120.48  all VarCurr (v3928(VarCurr)<->v3929(VarCurr)|v3930(VarCurr)).
% 121.44/120.48  all VarCurr (v3930(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$T)& (v3907(VarCurr,bitIndex1)<->$T)& (v3907(VarCurr,bitIndex0)<->$T)).
% 121.44/120.48  all VarCurr (v3929(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$T)& (v3907(VarCurr,bitIndex1)<->$T)& (v3907(VarCurr,bitIndex0)<->$F)).
% 121.44/120.48  all VarCurr B (range_14_0(B)-> (v3927(VarCurr,B)<->$F)).
% 121.44/120.48  -b000000000000000(bitIndex14).
% 121.44/120.48  -b000000000000000(bitIndex13).
% 121.44/120.48  -b000000000000000(bitIndex12).
% 121.44/120.48  -b000000000000000(bitIndex11).
% 121.44/120.48  -b000000000000000(bitIndex10).
% 121.44/120.48  -b000000000000000(bitIndex9).
% 121.44/120.48  -b000000000000000(bitIndex8).
% 121.44/120.48  -b000000000000000(bitIndex7).
% 121.44/120.48  -b000000000000000(bitIndex6).
% 121.44/120.48  -b000000000000000(bitIndex5).
% 121.44/120.48  -b000000000000000(bitIndex4).
% 121.44/120.48  -b000000000000000(bitIndex3).
% 121.44/120.48  -b000000000000000(bitIndex2).
% 121.44/120.48  -b000000000000000(bitIndex1).
% 121.44/120.48  -b000000000000000(bitIndex0).
% 121.44/120.48  all VarCurr ((v3927(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3927(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3927(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3927(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3927(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3927(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3927(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3927(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3927(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))).
% 121.44/120.48  all VarCurr (v3926(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$T)& (v3907(VarCurr,bitIndex1)<->$F)& (v3907(VarCurr,bitIndex0)<->$T)).
% 121.44/120.48  all VarCurr (v3925(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$T)& (v3907(VarCurr,bitIndex1)<->$F)& (v3907(VarCurr,bitIndex0)<->$F)).
% 121.44/120.48  all VarCurr B (range_8_0(B)-> (v3924(VarCurr,B)<->$F)).
% 121.44/120.48  all B (range_8_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B).
% 121.44/120.48  -b000000000(bitIndex8).
% 121.44/120.48  -b000000000(bitIndex7).
% 121.44/120.48  -b000000000(bitIndex6).
% 121.44/120.48  -b000000000(bitIndex5).
% 121.44/120.48  -b000000000(bitIndex4).
% 121.44/120.48  -b000000000(bitIndex3).
% 121.44/120.48  -b000000000(bitIndex2).
% 121.44/120.48  -b000000000(bitIndex1).
% 121.44/120.48  -b000000000(bitIndex0).
% 121.44/120.48  all VarCurr ((v3924(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3924(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3924(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3924(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3924(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3924(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3924(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3924(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3924(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3924(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3924(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3924(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3924(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3924(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3924(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))).
% 121.44/120.48  all VarCurr (v3923(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$F)& (v3907(VarCurr,bitIndex1)<->$T)& (v3907(VarCurr,bitIndex0)<->$T)).
% 121.44/120.48  all VarCurr (v3922(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$F)& (v3907(VarCurr,bitIndex1)<->$T)& (v3907(VarCurr,bitIndex0)<->$F)).
% 121.44/120.48  all VarCurr B (range_2_0(B)-> (v3921(VarCurr,B)<->$F)).
% 121.44/120.48  all VarCurr ((v3921(VarCurr,bitIndex23)<->v3897(VarCurr,bitIndex37))& (v3921(VarCurr,bitIndex22)<->v3897(VarCurr,bitIndex36))& (v3921(VarCurr,bitIndex21)<->v3897(VarCurr,bitIndex35))& (v3921(VarCurr,bitIndex20)<->v3897(VarCurr,bitIndex34))& (v3921(VarCurr,bitIndex19)<->v3897(VarCurr,bitIndex33))& (v3921(VarCurr,bitIndex18)<->v3897(VarCurr,bitIndex32))& (v3921(VarCurr,bitIndex17)<->v3897(VarCurr,bitIndex31))& (v3921(VarCurr,bitIndex16)<->v3897(VarCurr,bitIndex30))& (v3921(VarCurr,bitIndex15)<->v3897(VarCurr,bitIndex29))& (v3921(VarCurr,bitIndex14)<->v3897(VarCurr,bitIndex28))& (v3921(VarCurr,bitIndex13)<->v3897(VarCurr,bitIndex27))& (v3921(VarCurr,bitIndex12)<->v3897(VarCurr,bitIndex26))& (v3921(VarCurr,bitIndex11)<->v3897(VarCurr,bitIndex25))& (v3921(VarCurr,bitIndex10)<->v3897(VarCurr,bitIndex24))& (v3921(VarCurr,bitIndex9)<->v3897(VarCurr,bitIndex23))& (v3921(VarCurr,bitIndex8)<->v3897(VarCurr,bitIndex22))& (v3921(VarCurr,bitIndex7)<->v3897(VarCurr,bitIndex21))& (v3921(VarCurr,bitIndex6)<->v3897(VarCurr,bitIndex20))& (v3921(VarCurr,bitIndex5)<->v3897(VarCurr,bitIndex19))& (v3921(VarCurr,bitIndex4)<->v3897(VarCurr,bitIndex18))& (v3921(VarCurr,bitIndex3)<->v3897(VarCurr,bitIndex17))).
% 121.44/120.48  all VarCurr (v3920(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$F)& (v3907(VarCurr,bitIndex1)<->$F)& (v3907(VarCurr,bitIndex0)<->$T)).
% 121.44/120.48  all VarCurr (v3919(VarCurr)<-> (v3907(VarCurr,bitIndex2)<->$F)& (v3907(VarCurr,bitIndex1)<->$F)& (v3907(VarCurr,bitIndex0)<->$F)).
% 121.44/120.48  all VarCurr B (range_2_0(B)-> (v3907(VarCurr,B)<->v3909(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_2_0(B)-> (v3909(VarCurr,B)<->v3911(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_2_0(B)-> (v3911(VarCurr,B)<->v3913(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_2_0(B)-> (v3913(VarCurr,B)<->v3915(VarCurr,B))).
% 121.44/120.48  all B (range_2_0(B)-> (v3915(constB0,B)<->$F)).
% 121.44/120.48  all VarCurr B (range_16_14(B)-> (v3897(VarCurr,B)<->v3899(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_16_14(B)-> (v3899(VarCurr,B)<->v3901(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_16_14(B)-> (v3901(VarCurr,B)<->v3903(VarCurr,B))).
% 121.44/120.48  all VarCurr B (range_16_14(B)-> (v3903(VarCurr,B)<->v3474(VarCurr,B))).
% 121.44/120.48  all B (range_16_14(B)<->bitIndex14=B|bitIndex15=B|bitIndex16=B).
% 121.44/120.48  all VarCurr (v3889(VarCurr)<->v3891(VarCurr)).
% 121.44/120.48  all VarCurr (v3891(VarCurr)<->v3893(VarCurr)).
% 121.44/120.48  all VarCurr (v3893(VarCurr)<->v3895(VarCurr)).
% 121.44/120.48  all VarCurr (v3895(VarCurr)<->v3411(VarCurr)).
% 121.44/120.48  all VarCurr (v3883(VarCurr)<->v183(VarCurr)).
% 121.44/120.48  all VarCurr (v3387(VarCurr)<->v3859(VarCurr)&v3861(VarCurr)).
% 121.44/120.48  all VarCurr (-v3861(VarCurr)<->v3862(VarCurr)).
% 121.44/120.48  all VarCurr (v3862(VarCurr)<->v3863(VarCurr)|v3865(VarCurr)).
% 121.44/120.48  all VarCurr (v3865(VarCurr)<->v3866(VarCurr)|v3845(VarCurr)).
% 121.44/120.48  all VarCurr (v3866(VarCurr)<->v3867(VarCurr)|v3548(VarCurr)).
% 121.44/120.48  all VarCurr (v3867(VarCurr)<->v3868(VarCurr)|v3546(VarCurr,bitIndex2)).
% 121.44/120.48  all VarCurr (v3868(VarCurr)<->v3869(VarCurr)|v3546(VarCurr,bitIndex16)).
% 121.44/120.48  all VarCurr (v3869(VarCurr)<->v3546(VarCurr,bitIndex18)|v3546(VarCurr,bitIndex17)).
% 121.44/120.48  -v3546(constB0,bitIndex18).
% 121.44/120.48  -v3546(constB0,bitIndex17).
% 121.44/120.48  -v3546(constB0,bitIndex16).
% 121.44/120.48  -v3546(constB0,bitIndex2).
% 121.44/120.48  -bxx000xxxxxxxxxxxxx0xx(bitIndex18).
% 121.44/120.48  -bxx000xxxxxxxxxxxxx0xx(bitIndex17).
% 121.44/120.48  -bxx000xxxxxxxxxxxxx0xx(bitIndex16).
% 121.44/120.48  -bxx000xxxxxxxxxxxxx0xx(bitIndex2).
% 121.44/120.48  all VarCurr (v3863(VarCurr)<->v3391(VarCurr,bitIndex1)&v3864(VarCurr)).
% 121.44/120.48  all VarCurr (-v3864(VarCurr)<->v3542(VarCurr)).
% 121.44/120.48  all VarCurr (v3859(VarCurr)<->v3389(VarCurr,bitIndex1)&v3860(VarCurr)).
% 121.44/120.48  all VarCurr (-v3860(VarCurr)<->v2587(VarCurr,bitIndex1)).
% 121.44/120.48  all VarCurr (v3845(VarCurr)<->v3856(VarCurr)&v3847(VarCurr)).
% 121.44/120.48  all VarCurr (v3856(VarCurr)<->v3389(VarCurr,bitIndex1)&v3857(VarCurr)).
% 121.44/120.48  all VarCurr (-v3857(VarCurr)<->v2587(VarCurr,bitIndex1)).
% 121.44/120.48  all VarCurr (v3847(VarCurr)<->v3849(VarCurr)).
% 121.44/120.48  all VarCurr (v3849(VarCurr)<->v3851(VarCurr)).
% 121.44/120.48  all VarCurr (v3851(VarCurr)<->v3853(VarCurr)).
% 121.44/120.48  all VarCurr (v3853(VarCurr)<->v3560(VarCurr,bitIndex27)&v3535(VarCurr)).
% 121.44/120.48  all VarCurr (v3548(VarCurr)<->v3842(VarCurr)&v3550(VarCurr)).
% 121.44/120.48  all VarCurr (v3842(VarCurr)<->v3389(VarCurr,bitIndex1)&v3843(VarCurr)).
% 121.44/120.48  all VarCurr (-v3843(VarCurr)<->v2587(VarCurr,bitIndex1)).
% 121.44/120.48  v3389(constB0,bitIndex1)<->$F.
% 121.44/120.48  all VarCurr (v3550(VarCurr)<->v3552(VarCurr)).
% 121.44/120.48  all VarCurr (v3552(VarCurr)<->v3554(VarCurr)).
% 121.44/120.48  all VarCurr (v3554(VarCurr)<->v3556(VarCurr)).
% 121.44/120.48  all VarCurr (v3556(VarCurr)<->v3558(VarCurr)&v3535(VarCurr)).
% 121.44/120.48  all VarCurr (-v3601(VarCurr)-> (v3558(VarCurr)<->$T)).
% 121.44/120.49  all VarCurr (v3601(VarCurr)-> (v3558(VarCurr)<->$F)).
% 121.44/120.49  all VarCurr (v3601(VarCurr)<->v3603(VarCurr)|v3839(VarCurr)).
% 121.44/120.49  all VarCurr (v3839(VarCurr)<->v3836(VarCurr)&v3562(VarCurr,bitIndex26)).
% 121.44/120.49  all VarCurr (v3603(VarCurr)<->v3604(VarCurr)&v3833(VarCurr)).
% 121.44/120.49  all VarCurr (-v3833(VarCurr)<->v3834(VarCurr)).
% 121.44/120.49  all VarCurr (v3834(VarCurr)<->v3835(VarCurr)&v3838(VarCurr)).
% 121.44/120.49  all VarCurr (v3838(VarCurr)<->v3560(VarCurr,bitIndex26)|v3562(VarCurr,bitIndex26)).
% 121.44/120.49  all VarCurr (v3835(VarCurr)<->v3836(VarCurr)|v3837(VarCurr)).
% 121.44/120.49  all VarCurr (-v3837(VarCurr)<->v3562(VarCurr,bitIndex26)).
% 121.44/120.49  all VarCurr (-v3836(VarCurr)<->v3560(VarCurr,bitIndex26)).
% 121.44/120.49  all VarCurr (v3604(VarCurr)<->v3605(VarCurr)|v3832(VarCurr)).
% 121.44/120.49  all VarCurr (v3832(VarCurr)<->v3829(VarCurr)&v3562(VarCurr,bitIndex25)).
% 121.44/120.49  all VarCurr (v3605(VarCurr)<->v3606(VarCurr)&v3826(VarCurr)).
% 121.44/120.49  all VarCurr (-v3826(VarCurr)<->v3827(VarCurr)).
% 121.44/120.49  all VarCurr (v3827(VarCurr)<->v3828(VarCurr)&v3831(VarCurr)).
% 121.44/120.49  all VarCurr (v3831(VarCurr)<->v3560(VarCurr,bitIndex25)|v3562(VarCurr,bitIndex25)).
% 121.44/120.49  all VarCurr (v3828(VarCurr)<->v3829(VarCurr)|v3830(VarCurr)).
% 121.44/120.49  all VarCurr (-v3830(VarCurr)<->v3562(VarCurr,bitIndex25)).
% 121.44/120.49  all VarCurr (-v3829(VarCurr)<->v3560(VarCurr,bitIndex25)).
% 121.44/120.49  all VarCurr (v3606(VarCurr)<->v3607(VarCurr)|v3825(VarCurr)).
% 121.44/120.49  all VarCurr (v3825(VarCurr)<->v3822(VarCurr)&v3562(VarCurr,bitIndex24)).
% 121.44/120.49  all VarCurr (v3607(VarCurr)<->v3608(VarCurr)&v3819(VarCurr)).
% 121.44/120.49  all VarCurr (-v3819(VarCurr)<->v3820(VarCurr)).
% 121.44/120.49  all VarCurr (v3820(VarCurr)<->v3821(VarCurr)&v3824(VarCurr)).
% 121.44/120.49  all VarCurr (v3824(VarCurr)<->v3560(VarCurr,bitIndex24)|v3562(VarCurr,bitIndex24)).
% 121.44/120.49  all VarCurr (v3821(VarCurr)<->v3822(VarCurr)|v3823(VarCurr)).
% 121.44/120.49  all VarCurr (-v3823(VarCurr)<->v3562(VarCurr,bitIndex24)).
% 121.44/120.49  all VarCurr (-v3822(VarCurr)<->v3560(VarCurr,bitIndex24)).
% 121.44/120.49  all VarCurr (v3608(VarCurr)<->v3609(VarCurr)|v3818(VarCurr)).
% 121.44/120.49  all VarCurr (v3818(VarCurr)<->v3815(VarCurr)&v3562(VarCurr,bitIndex23)).
% 121.44/120.49  all VarCurr (v3609(VarCurr)<->v3610(VarCurr)&v3812(VarCurr)).
% 121.44/120.49  all VarCurr (-v3812(VarCurr)<->v3813(VarCurr)).
% 121.44/120.49  all VarCurr (v3813(VarCurr)<->v3814(VarCurr)&v3817(VarCurr)).
% 121.44/120.49  all VarCurr (v3817(VarCurr)<->v3560(VarCurr,bitIndex23)|v3562(VarCurr,bitIndex23)).
% 121.44/120.49  all VarCurr (v3814(VarCurr)<->v3815(VarCurr)|v3816(VarCurr)).
% 121.44/120.49  all VarCurr (-v3816(VarCurr)<->v3562(VarCurr,bitIndex23)).
% 121.44/120.49  all VarCurr (-v3815(VarCurr)<->v3560(VarCurr,bitIndex23)).
% 121.44/120.49  all VarCurr (v3610(VarCurr)<->v3611(VarCurr)|v3811(VarCurr)).
% 121.44/120.49  all VarCurr (v3811(VarCurr)<->v3808(VarCurr)&v3562(VarCurr,bitIndex22)).
% 121.44/120.49  all VarCurr (v3611(VarCurr)<->v3612(VarCurr)&v3805(VarCurr)).
% 121.44/120.49  all VarCurr (-v3805(VarCurr)<->v3806(VarCurr)).
% 121.44/120.49  all VarCurr (v3806(VarCurr)<->v3807(VarCurr)&v3810(VarCurr)).
% 121.44/120.49  all VarCurr (v3810(VarCurr)<->v3560(VarCurr,bitIndex22)|v3562(VarCurr,bitIndex22)).
% 121.44/120.49  all VarCurr (v3807(VarCurr)<->v3808(VarCurr)|v3809(VarCurr)).
% 121.44/120.49  all VarCurr (-v3809(VarCurr)<->v3562(VarCurr,bitIndex22)).
% 121.44/120.49  all VarCurr (-v3808(VarCurr)<->v3560(VarCurr,bitIndex22)).
% 121.44/120.49  all VarCurr (v3612(VarCurr)<->v3613(VarCurr)|v3804(VarCurr)).
% 121.44/120.49  all VarCurr (v3804(VarCurr)<->v3801(VarCurr)&v3562(VarCurr,bitIndex21)).
% 121.44/120.49  all VarCurr (v3613(VarCurr)<->v3614(VarCurr)&v3798(VarCurr)).
% 121.44/120.49  all VarCurr (-v3798(VarCurr)<->v3799(VarCurr)).
% 121.44/120.49  all VarCurr (v3799(VarCurr)<->v3800(VarCurr)&v3803(VarCurr)).
% 121.44/120.49  all VarCurr (v3803(VarCurr)<->v3560(VarCurr,bitIndex21)|v3562(VarCurr,bitIndex21)).
% 121.44/120.49  all VarCurr (v3800(VarCurr)<->v3801(VarCurr)|v3802(VarCurr)).
% 121.44/120.49  all VarCurr (-v3802(VarCurr)<->v3562(VarCurr,bitIndex21)).
% 121.44/120.49  all VarCurr (-v3801(VarCurr)<->v3560(VarCurr,bitIndex21)).
% 121.44/120.49  all VarCurr (v3614(VarCurr)<->v3615(VarCurr)|v3797(VarCurr)).
% 121.44/120.49  all VarCurr (v3797(VarCurr)<->v3794(VarCurr)&v3562(VarCurr,bitIndex20)).
% 121.44/120.49  all VarCurr (v3615(VarCurr)<->v3616(VarCurr)&v3791(VarCurr)).
% 121.44/120.49  all VarCurr (-v3791(VarCurr)<->v3792(VarCurr)).
% 121.44/120.49  all VarCurr (v3792(VarCurr)<->v3793(VarCurr)&v3796(VarCurr)).
% 121.44/120.49  all VarCurr (v3796(VarCurr)<->v3560(VarCurr,bitIndex20)|v3562(VarCurr,bitIndex20)).
% 121.44/120.49  all VarCurr (v3793(VarCurr)<->v3794(VarCurr)|v3795(VarCurr)).
% 121.44/120.49  all VarCurr (-v3795(VarCurr)<->v3562(VarCurr,bitIndex20)).
% 121.44/120.49  all VarCurr (-v3794(VarCurr)<->v3560(VarCurr,bitIndex20)).
% 121.44/120.49  all VarCurr (v3616(VarCurr)<->v3617(VarCurr)|v3790(VarCurr)).
% 121.44/120.49  all VarCurr (v3790(VarCurr)<->v3787(VarCurr)&v3562(VarCurr,bitIndex19)).
% 121.44/120.49  all VarCurr (v3617(VarCurr)<->v3618(VarCurr)&v3784(VarCurr)).
% 121.44/120.49  all VarCurr (-v3784(VarCurr)<->v3785(VarCurr)).
% 121.44/120.49  all VarCurr (v3785(VarCurr)<->v3786(VarCurr)&v3789(VarCurr)).
% 121.44/120.49  all VarCurr (v3789(VarCurr)<->v3560(VarCurr,bitIndex19)|v3562(VarCurr,bitIndex19)).
% 121.44/120.49  all VarCurr (v3786(VarCurr)<->v3787(VarCurr)|v3788(VarCurr)).
% 121.44/120.49  all VarCurr (-v3788(VarCurr)<->v3562(VarCurr,bitIndex19)).
% 121.44/120.49  all VarCurr (-v3787(VarCurr)<->v3560(VarCurr,bitIndex19)).
% 121.44/120.49  all VarCurr (v3618(VarCurr)<->v3619(VarCurr)|v3783(VarCurr)).
% 121.44/120.49  all VarCurr (v3783(VarCurr)<->v3780(VarCurr)&v3562(VarCurr,bitIndex18)).
% 121.44/120.49  all VarCurr (v3619(VarCurr)<->v3620(VarCurr)&v3777(VarCurr)).
% 121.44/120.49  all VarCurr (-v3777(VarCurr)<->v3778(VarCurr)).
% 121.44/120.49  all VarCurr (v3778(VarCurr)<->v3779(VarCurr)&v3782(VarCurr)).
% 121.44/120.49  all VarCurr (v3782(VarCurr)<->v3560(VarCurr,bitIndex18)|v3562(VarCurr,bitIndex18)).
% 121.44/120.49  all VarCurr (v3779(VarCurr)<->v3780(VarCurr)|v3781(VarCurr)).
% 121.44/120.49  all VarCurr (-v3781(VarCurr)<->v3562(VarCurr,bitIndex18)).
% 121.44/120.49  all VarCurr (-v3780(VarCurr)<->v3560(VarCurr,bitIndex18)).
% 121.44/120.49  all VarCurr (v3620(VarCurr)<->v3621(VarCurr)|v3776(VarCurr)).
% 121.44/120.49  all VarCurr (v3776(VarCurr)<->v3773(VarCurr)&v3562(VarCurr,bitIndex17)).
% 121.44/120.49  all VarCurr (v3621(VarCurr)<->v3622(VarCurr)&v3770(VarCurr)).
% 121.44/120.49  all VarCurr (-v3770(VarCurr)<->v3771(VarCurr)).
% 121.44/120.49  all VarCurr (v3771(VarCurr)<->v3772(VarCurr)&v3775(VarCurr)).
% 121.44/120.49  all VarCurr (v3775(VarCurr)<->v3560(VarCurr,bitIndex17)|v3562(VarCurr,bitIndex17)).
% 121.44/120.49  all VarCurr (v3772(VarCurr)<->v3773(VarCurr)|v3774(VarCurr)).
% 121.44/120.49  all VarCurr (-v3774(VarCurr)<->v3562(VarCurr,bitIndex17)).
% 121.44/120.49  all VarCurr (-v3773(VarCurr)<->v3560(VarCurr,bitIndex17)).
% 121.44/120.49  all VarCurr (v3622(VarCurr)<->v3623(VarCurr)|v3769(VarCurr)).
% 121.44/120.49  all VarCurr (v3769(VarCurr)<->v3766(VarCurr)&v3562(VarCurr,bitIndex16)).
% 121.44/120.49  all VarCurr (v3623(VarCurr)<->v3624(VarCurr)&v3763(VarCurr)).
% 121.44/120.49  all VarCurr (-v3763(VarCurr)<->v3764(VarCurr)).
% 121.44/120.49  all VarCurr (v3764(VarCurr)<->v3765(VarCurr)&v3768(VarCurr)).
% 121.44/120.49  all VarCurr (v3768(VarCurr)<->v3560(VarCurr,bitIndex16)|v3562(VarCurr,bitIndex16)).
% 121.44/120.49  all VarCurr (v3765(VarCurr)<->v3766(VarCurr)|v3767(VarCurr)).
% 121.44/120.49  all VarCurr (-v3767(VarCurr)<->v3562(VarCurr,bitIndex16)).
% 121.44/120.49  all VarCurr (-v3766(VarCurr)<->v3560(VarCurr,bitIndex16)).
% 121.44/120.49  all VarCurr (v3624(VarCurr)<->v3625(VarCurr)|v3762(VarCurr)).
% 121.44/120.49  all VarCurr (v3762(VarCurr)<->v3759(VarCurr)&v3562(VarCurr,bitIndex15)).
% 121.44/120.49  all VarCurr (v3625(VarCurr)<->v3626(VarCurr)&v3756(VarCurr)).
% 121.44/120.49  all VarCurr (-v3756(VarCurr)<->v3757(VarCurr)).
% 121.44/120.49  all VarCurr (v3757(VarCurr)<->v3758(VarCurr)&v3761(VarCurr)).
% 121.44/120.49  all VarCurr (v3761(VarCurr)<->v3560(VarCurr,bitIndex15)|v3562(VarCurr,bitIndex15)).
% 121.44/120.49  all VarCurr (v3758(VarCurr)<->v3759(VarCurr)|v3760(VarCurr)).
% 121.44/120.49  all VarCurr (-v3760(VarCurr)<->v3562(VarCurr,bitIndex15)).
% 121.44/120.49  all VarCurr (-v3759(VarCurr)<->v3560(VarCurr,bitIndex15)).
% 121.44/120.49  all VarCurr (v3626(VarCurr)<->v3627(VarCurr)|v3755(VarCurr)).
% 121.44/120.49  all VarCurr (v3755(VarCurr)<->v3752(VarCurr)&v3562(VarCurr,bitIndex14)).
% 121.44/120.49  all VarCurr (v3627(VarCurr)<->v3628(VarCurr)&v3749(VarCurr)).
% 121.44/120.49  all VarCurr (-v3749(VarCurr)<->v3750(VarCurr)).
% 121.44/120.49  all VarCurr (v3750(VarCurr)<->v3751(VarCurr)&v3754(VarCurr)).
% 121.44/120.49  all VarCurr (v3754(VarCurr)<->v3560(VarCurr,bitIndex14)|v3562(VarCurr,bitIndex14)).
% 121.44/120.49  all VarCurr (v3751(VarCurr)<->v3752(VarCurr)|v3753(VarCurr)).
% 121.44/120.49  all VarCurr (-v3753(VarCurr)<->v3562(VarCurr,bitIndex14)).
% 121.44/120.49  all VarCurr (-v3752(VarCurr)<->v3560(VarCurr,bitIndex14)).
% 121.44/120.49  all VarCurr (v3628(VarCurr)<->v3629(VarCurr)|v3748(VarCurr)).
% 121.44/120.49  all VarCurr (v3748(VarCurr)<->v3745(VarCurr)&v3562(VarCurr,bitIndex13)).
% 121.44/120.49  all VarCurr (v3629(VarCurr)<->v3630(VarCurr)&v3742(VarCurr)).
% 121.44/120.49  all VarCurr (-v3742(VarCurr)<->v3743(VarCurr)).
% 121.44/120.49  all VarCurr (v3743(VarCurr)<->v3744(VarCurr)&v3747(VarCurr)).
% 121.44/120.49  all VarCurr (v3747(VarCurr)<->v3560(VarCurr,bitIndex13)|v3562(VarCurr,bitIndex13)).
% 121.44/120.49  all VarCurr (v3744(VarCurr)<->v3745(VarCurr)|v3746(VarCurr)).
% 121.44/120.49  all VarCurr (-v3746(VarCurr)<->v3562(VarCurr,bitIndex13)).
% 121.44/120.49  all VarCurr (-v3745(VarCurr)<->v3560(VarCurr,bitIndex13)).
% 121.44/120.49  all VarCurr (v3630(VarCurr)<->v3631(VarCurr)|v3741(VarCurr)).
% 121.44/120.49  all VarCurr (v3741(VarCurr)<->v3738(VarCurr)&v3562(VarCurr,bitIndex12)).
% 121.44/120.50  all VarCurr (v3631(VarCurr)<->v3632(VarCurr)&v3735(VarCurr)).
% 121.44/120.50  all VarCurr (-v3735(VarCurr)<->v3736(VarCurr)).
% 121.44/120.50  all VarCurr (v3736(VarCurr)<->v3737(VarCurr)&v3740(VarCurr)).
% 121.44/120.50  all VarCurr (v3740(VarCurr)<->v3560(VarCurr,bitIndex12)|v3562(VarCurr,bitIndex12)).
% 121.44/120.50  all VarCurr (v3737(VarCurr)<->v3738(VarCurr)|v3739(VarCurr)).
% 121.44/120.50  all VarCurr (-v3739(VarCurr)<->v3562(VarCurr,bitIndex12)).
% 121.44/120.50  all VarCurr (-v3738(VarCurr)<->v3560(VarCurr,bitIndex12)).
% 121.44/120.50  all VarCurr (v3632(VarCurr)<->v3633(VarCurr)|v3734(VarCurr)).
% 121.44/120.50  all VarCurr (v3734(VarCurr)<->v3731(VarCurr)&v3562(VarCurr,bitIndex11)).
% 121.44/120.50  all VarCurr (v3633(VarCurr)<->v3634(VarCurr)&v3728(VarCurr)).
% 121.44/120.50  all VarCurr (-v3728(VarCurr)<->v3729(VarCurr)).
% 121.44/120.50  all VarCurr (v3729(VarCurr)<->v3730(VarCurr)&v3733(VarCurr)).
% 121.44/120.50  all VarCurr (v3733(VarCurr)<->v3560(VarCurr,bitIndex11)|v3562(VarCurr,bitIndex11)).
% 121.44/120.50  all VarCurr (v3730(VarCurr)<->v3731(VarCurr)|v3732(VarCurr)).
% 121.44/120.50  all VarCurr (-v3732(VarCurr)<->v3562(VarCurr,bitIndex11)).
% 121.44/120.50  all VarCurr (-v3731(VarCurr)<->v3560(VarCurr,bitIndex11)).
% 121.44/120.50  all VarCurr (v3634(VarCurr)<->v3635(VarCurr)|v3727(VarCurr)).
% 121.44/120.50  all VarCurr (v3727(VarCurr)<->v3724(VarCurr)&v3562(VarCurr,bitIndex10)).
% 121.44/120.50  all VarCurr (v3635(VarCurr)<->v3636(VarCurr)&v3721(VarCurr)).
% 121.44/120.50  all VarCurr (-v3721(VarCurr)<->v3722(VarCurr)).
% 121.44/120.50  all VarCurr (v3722(VarCurr)<->v3723(VarCurr)&v3726(VarCurr)).
% 121.44/120.50  all VarCurr (v3726(VarCurr)<->v3560(VarCurr,bitIndex10)|v3562(VarCurr,bitIndex10)).
% 121.44/120.50  all VarCurr (v3723(VarCurr)<->v3724(VarCurr)|v3725(VarCurr)).
% 121.44/120.50  all VarCurr (-v3725(VarCurr)<->v3562(VarCurr,bitIndex10)).
% 121.44/120.50  all VarCurr (-v3724(VarCurr)<->v3560(VarCurr,bitIndex10)).
% 121.44/120.50  all VarCurr (v3636(VarCurr)<->v3637(VarCurr)|v3720(VarCurr)).
% 121.44/120.50  all VarCurr (v3720(VarCurr)<->v3717(VarCurr)&v3562(VarCurr,bitIndex9)).
% 121.44/120.50  all VarCurr (v3637(VarCurr)<->v3638(VarCurr)&v3714(VarCurr)).
% 121.44/120.50  all VarCurr (-v3714(VarCurr)<->v3715(VarCurr)).
% 121.44/120.50  all VarCurr (v3715(VarCurr)<->v3716(VarCurr)&v3719(VarCurr)).
% 121.44/120.50  all VarCurr (v3719(VarCurr)<->v3560(VarCurr,bitIndex9)|v3562(VarCurr,bitIndex9)).
% 121.44/120.50  all VarCurr (v3716(VarCurr)<->v3717(VarCurr)|v3718(VarCurr)).
% 121.44/120.50  all VarCurr (-v3718(VarCurr)<->v3562(VarCurr,bitIndex9)).
% 121.44/120.50  all VarCurr (-v3717(VarCurr)<->v3560(VarCurr,bitIndex9)).
% 121.44/120.50  all VarCurr (v3638(VarCurr)<->v3639(VarCurr)|v3713(VarCurr)).
% 121.44/120.50  all VarCurr (v3713(VarCurr)<->v3710(VarCurr)&v3562(VarCurr,bitIndex8)).
% 121.44/120.50  all VarCurr (v3639(VarCurr)<->v3640(VarCurr)&v3707(VarCurr)).
% 121.44/120.50  all VarCurr (-v3707(VarCurr)<->v3708(VarCurr)).
% 121.44/120.50  all VarCurr (v3708(VarCurr)<->v3709(VarCurr)&v3712(VarCurr)).
% 121.44/120.50  all VarCurr (v3712(VarCurr)<->v3560(VarCurr,bitIndex8)|v3562(VarCurr,bitIndex8)).
% 121.44/120.50  all VarCurr (v3709(VarCurr)<->v3710(VarCurr)|v3711(VarCurr)).
% 121.44/120.50  all VarCurr (-v3711(VarCurr)<->v3562(VarCurr,bitIndex8)).
% 121.44/120.50  all VarCurr (-v3710(VarCurr)<->v3560(VarCurr,bitIndex8)).
% 121.44/120.50  all VarCurr (v3640(VarCurr)<->v3641(VarCurr)|v3706(VarCurr)).
% 121.44/120.50  all VarCurr (v3706(VarCurr)<->v3703(VarCurr)&v3562(VarCurr,bitIndex7)).
% 121.44/120.50  all VarCurr (v3641(VarCurr)<->v3642(VarCurr)&v3700(VarCurr)).
% 121.44/120.50  all VarCurr (-v3700(VarCurr)<->v3701(VarCurr)).
% 121.44/120.50  all VarCurr (v3701(VarCurr)<->v3702(VarCurr)&v3705(VarCurr)).
% 121.44/120.50  all VarCurr (v3705(VarCurr)<->v3560(VarCurr,bitIndex7)|v3562(VarCurr,bitIndex7)).
% 121.44/120.50  all VarCurr (v3702(VarCurr)<->v3703(VarCurr)|v3704(VarCurr)).
% 121.44/120.50  all VarCurr (-v3704(VarCurr)<->v3562(VarCurr,bitIndex7)).
% 121.44/120.50  all VarCurr (-v3703(VarCurr)<->v3560(VarCurr,bitIndex7)).
% 121.44/120.50  all VarCurr (v3642(VarCurr)<->v3643(VarCurr)|v3699(VarCurr)).
% 121.44/120.50  all VarCurr (v3699(VarCurr)<->v3696(VarCurr)&v3562(VarCurr,bitIndex6)).
% 121.44/120.50  all VarCurr (v3643(VarCurr)<->v3644(VarCurr)&v3693(VarCurr)).
% 121.44/120.50  all VarCurr (-v3693(VarCurr)<->v3694(VarCurr)).
% 121.44/120.50  all VarCurr (v3694(VarCurr)<->v3695(VarCurr)&v3698(VarCurr)).
% 121.44/120.50  all VarCurr (v3698(VarCurr)<->v3560(VarCurr,bitIndex6)|v3562(VarCurr,bitIndex6)).
% 121.44/120.50  all VarCurr (v3695(VarCurr)<->v3696(VarCurr)|v3697(VarCurr)).
% 121.44/120.50  all VarCurr (-v3697(VarCurr)<->v3562(VarCurr,bitIndex6)).
% 121.44/120.50  all VarCurr (-v3696(VarCurr)<->v3560(VarCurr,bitIndex6)).
% 121.44/120.50  all VarCurr (v3644(VarCurr)<->v3645(VarCurr)|v3692(VarCurr)).
% 121.44/120.50  all VarCurr (v3692(VarCurr)<->v3689(VarCurr)&v3562(VarCurr,bitIndex5)).
% 121.44/120.50  all VarCurr (v3645(VarCurr)<->v3646(VarCurr)&v3686(VarCurr)).
% 121.44/120.50  all VarCurr (-v3686(VarCurr)<->v3687(VarCurr)).
% 121.44/120.50  all VarCurr (v3687(VarCurr)<->v3688(VarCurr)&v3691(VarCurr)).
% 121.44/120.50  all VarCurr (v3691(VarCurr)<->v3560(VarCurr,bitIndex5)|v3562(VarCurr,bitIndex5)).
% 121.44/120.50  all VarCurr (v3688(VarCurr)<->v3689(VarCurr)|v3690(VarCurr)).
% 121.44/120.50  all VarCurr (-v3690(VarCurr)<->v3562(VarCurr,bitIndex5)).
% 121.44/120.50  all VarCurr (-v3689(VarCurr)<->v3560(VarCurr,bitIndex5)).
% 121.44/120.50  all VarCurr (v3646(VarCurr)<->v3647(VarCurr)|v3685(VarCurr)).
% 121.44/120.50  all VarCurr (v3685(VarCurr)<->v3682(VarCurr)&v3562(VarCurr,bitIndex4)).
% 121.44/120.50  all VarCurr (v3647(VarCurr)<->v3648(VarCurr)&v3679(VarCurr)).
% 121.44/120.50  all VarCurr (-v3679(VarCurr)<->v3680(VarCurr)).
% 121.44/120.50  all VarCurr (v3680(VarCurr)<->v3681(VarCurr)&v3684(VarCurr)).
% 121.44/120.50  all VarCurr (v3684(VarCurr)<->v3560(VarCurr,bitIndex4)|v3562(VarCurr,bitIndex4)).
% 121.44/120.50  all VarCurr (v3681(VarCurr)<->v3682(VarCurr)|v3683(VarCurr)).
% 121.44/120.50  all VarCurr (-v3683(VarCurr)<->v3562(VarCurr,bitIndex4)).
% 121.44/120.50  all VarCurr (-v3682(VarCurr)<->v3560(VarCurr,bitIndex4)).
% 121.44/120.50  all VarCurr (v3648(VarCurr)<->v3649(VarCurr)|v3678(VarCurr)).
% 121.44/120.50  all VarCurr (v3678(VarCurr)<->v3675(VarCurr)&v3562(VarCurr,bitIndex3)).
% 121.44/120.50  all VarCurr (v3649(VarCurr)<->v3650(VarCurr)&v3672(VarCurr)).
% 121.44/120.50  all VarCurr (-v3672(VarCurr)<->v3673(VarCurr)).
% 121.44/120.50  all VarCurr (v3673(VarCurr)<->v3674(VarCurr)&v3677(VarCurr)).
% 121.44/120.50  all VarCurr (v3677(VarCurr)<->v3560(VarCurr,bitIndex3)|v3562(VarCurr,bitIndex3)).
% 121.44/120.50  all VarCurr (v3674(VarCurr)<->v3675(VarCurr)|v3676(VarCurr)).
% 121.44/120.50  all VarCurr (-v3676(VarCurr)<->v3562(VarCurr,bitIndex3)).
% 121.44/120.50  all VarCurr (-v3675(VarCurr)<->v3560(VarCurr,bitIndex3)).
% 121.44/120.50  all VarCurr (v3650(VarCurr)<->v3651(VarCurr)|v3671(VarCurr)).
% 121.44/120.50  all VarCurr (v3671(VarCurr)<->v3668(VarCurr)&v3562(VarCurr,bitIndex2)).
% 121.44/120.50  all VarCurr (v3651(VarCurr)<->v3652(VarCurr)&v3665(VarCurr)).
% 121.44/120.50  all VarCurr (-v3665(VarCurr)<->v3666(VarCurr)).
% 121.44/120.50  all VarCurr (v3666(VarCurr)<->v3667(VarCurr)&v3670(VarCurr)).
% 121.44/120.50  all VarCurr (v3670(VarCurr)<->v3560(VarCurr,bitIndex2)|v3562(VarCurr,bitIndex2)).
% 121.44/120.50  all VarCurr (v3667(VarCurr)<->v3668(VarCurr)|v3669(VarCurr)).
% 121.44/120.50  all VarCurr (-v3669(VarCurr)<->v3562(VarCurr,bitIndex2)).
% 121.44/120.50  all VarCurr (-v3668(VarCurr)<->v3560(VarCurr,bitIndex2)).
% 121.44/120.50  all VarCurr (v3652(VarCurr)<->v3653(VarCurr)|v3664(VarCurr)).
% 121.44/120.50  all VarCurr (v3664(VarCurr)<->v3661(VarCurr)&v3562(VarCurr,bitIndex1)).
% 121.44/120.50  all VarCurr (v3653(VarCurr)<->v3654(VarCurr)&v3658(VarCurr)).
% 121.44/120.50  all VarCurr (-v3658(VarCurr)<->v3659(VarCurr)).
% 121.44/120.50  all VarCurr (v3659(VarCurr)<->v3660(VarCurr)&v3663(VarCurr)).
% 121.44/120.50  all VarCurr (v3663(VarCurr)<->v3560(VarCurr,bitIndex1)|v3562(VarCurr,bitIndex1)).
% 121.44/120.50  all VarCurr (v3660(VarCurr)<->v3661(VarCurr)|v3662(VarCurr)).
% 121.44/120.50  all VarCurr (-v3662(VarCurr)<->v3562(VarCurr,bitIndex1)).
% 121.44/120.50  all VarCurr (-v3661(VarCurr)<->v3560(VarCurr,bitIndex1)).
% 121.44/120.50  all VarCurr (v3654(VarCurr)<->v3655(VarCurr)&v3562(VarCurr,bitIndex0)).
% 121.44/120.50  all VarCurr (-v3655(VarCurr)<->v3560(VarCurr,bitIndex0)).
% 121.44/120.50  -v3560(constB0,bitIndex27).
% 121.44/120.50  -v3560(constB0,bitIndex26).
% 121.44/120.50  -v3560(constB0,bitIndex25).
% 121.44/120.50  -v3560(constB0,bitIndex24).
% 121.44/120.50  -v3560(constB0,bitIndex23).
% 121.44/120.50  -v3560(constB0,bitIndex22).
% 121.44/120.50  -v3560(constB0,bitIndex21).
% 121.44/120.50  -v3560(constB0,bitIndex20).
% 121.44/120.50  -v3560(constB0,bitIndex19).
% 121.44/120.50  -v3560(constB0,bitIndex18).
% 121.44/120.50  -v3560(constB0,bitIndex17).
% 121.44/120.50  -v3560(constB0,bitIndex16).
% 121.44/120.50  -v3560(constB0,bitIndex15).
% 121.44/120.51  -v3560(constB0,bitIndex14).
% 121.44/120.51  -v3560(constB0,bitIndex13).
% 121.44/120.51  -v3560(constB0,bitIndex12).
% 121.44/120.51  -v3560(constB0,bitIndex11).
% 121.44/120.51  -v3560(constB0,bitIndex10).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex27).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex26).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex25).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex24).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex23).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex22).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex21).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex20).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex19).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex18).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex17).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex16).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex15).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex14).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex13).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex12).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex11).
% 121.44/120.51  -b000000000000000000xxxxxxxxxx(bitIndex10).
% 121.44/120.51  all VarCurr (-v3584(VarCurr)& -v3585(VarCurr)& -v3586(VarCurr)& -v3587(VarCurr)& -v3588(VarCurr)& -v3589(VarCurr)& -v3590(VarCurr)& -v3591(VarCurr)& -v3592(VarCurr)& -v3593(VarCurr)& -v3594(VarCurr)& -v3595(VarCurr)& -v3596(VarCurr)& -v3597(VarCurr)& -v3598(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b010000000000000000000000000(B))))).
% 121.54/120.51  -b010000000000000000000000000(bitIndex26).
% 121.54/120.51  b010000000000000000000000000(bitIndex25).
% 121.54/120.51  -b010000000000000000000000000(bitIndex24).
% 121.54/120.51  -b010000000000000000000000000(bitIndex23).
% 121.54/120.51  -b010000000000000000000000000(bitIndex22).
% 121.54/120.51  -b010000000000000000000000000(bitIndex21).
% 121.54/120.51  -b010000000000000000000000000(bitIndex20).
% 121.54/120.51  -b010000000000000000000000000(bitIndex19).
% 121.54/120.51  -b010000000000000000000000000(bitIndex18).
% 121.54/120.51  -b010000000000000000000000000(bitIndex17).
% 121.54/120.51  -b010000000000000000000000000(bitIndex16).
% 121.54/120.51  -b010000000000000000000000000(bitIndex15).
% 121.54/120.51  -b010000000000000000000000000(bitIndex14).
% 121.54/120.51  -b010000000000000000000000000(bitIndex13).
% 121.54/120.51  -b010000000000000000000000000(bitIndex12).
% 121.54/120.51  -b010000000000000000000000000(bitIndex11).
% 121.54/120.51  -b010000000000000000000000000(bitIndex10).
% 121.54/120.51  -b010000000000000000000000000(bitIndex9).
% 121.54/120.51  -b010000000000000000000000000(bitIndex8).
% 121.54/120.51  -b010000000000000000000000000(bitIndex7).
% 121.54/120.51  -b010000000000000000000000000(bitIndex6).
% 121.54/120.51  -b010000000000000000000000000(bitIndex5).
% 121.54/120.51  -b010000000000000000000000000(bitIndex4).
% 121.54/120.51  -b010000000000000000000000000(bitIndex3).
% 121.54/120.51  -b010000000000000000000000000(bitIndex2).
% 121.54/120.51  -b010000000000000000000000000(bitIndex1).
% 121.54/120.51  -b010000000000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3598(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b001000000000000000000000000(B))))).
% 121.54/120.51  -b001000000000000000000000000(bitIndex26).
% 121.54/120.51  -b001000000000000000000000000(bitIndex25).
% 121.54/120.51  b001000000000000000000000000(bitIndex24).
% 121.54/120.51  -b001000000000000000000000000(bitIndex23).
% 121.54/120.51  -b001000000000000000000000000(bitIndex22).
% 121.54/120.51  -b001000000000000000000000000(bitIndex21).
% 121.54/120.51  -b001000000000000000000000000(bitIndex20).
% 121.54/120.51  -b001000000000000000000000000(bitIndex19).
% 121.54/120.51  -b001000000000000000000000000(bitIndex18).
% 121.54/120.51  -b001000000000000000000000000(bitIndex17).
% 121.54/120.51  -b001000000000000000000000000(bitIndex16).
% 121.54/120.51  -b001000000000000000000000000(bitIndex15).
% 121.54/120.51  -b001000000000000000000000000(bitIndex14).
% 121.54/120.51  -b001000000000000000000000000(bitIndex13).
% 121.54/120.51  -b001000000000000000000000000(bitIndex12).
% 121.54/120.51  -b001000000000000000000000000(bitIndex11).
% 121.54/120.51  -b001000000000000000000000000(bitIndex10).
% 121.54/120.51  -b001000000000000000000000000(bitIndex9).
% 121.54/120.51  -b001000000000000000000000000(bitIndex8).
% 121.54/120.51  -b001000000000000000000000000(bitIndex7).
% 121.54/120.51  -b001000000000000000000000000(bitIndex6).
% 121.54/120.51  -b001000000000000000000000000(bitIndex5).
% 121.54/120.51  -b001000000000000000000000000(bitIndex4).
% 121.54/120.51  -b001000000000000000000000000(bitIndex3).
% 121.54/120.51  -b001000000000000000000000000(bitIndex2).
% 121.54/120.51  -b001000000000000000000000000(bitIndex1).
% 121.54/120.51  -b001000000000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3597(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000100000000000000000000000(B))))).
% 121.54/120.51  -b000100000000000000000000000(bitIndex26).
% 121.54/120.51  -b000100000000000000000000000(bitIndex25).
% 121.54/120.51  -b000100000000000000000000000(bitIndex24).
% 121.54/120.51  b000100000000000000000000000(bitIndex23).
% 121.54/120.51  -b000100000000000000000000000(bitIndex22).
% 121.54/120.51  -b000100000000000000000000000(bitIndex21).
% 121.54/120.51  -b000100000000000000000000000(bitIndex20).
% 121.54/120.51  -b000100000000000000000000000(bitIndex19).
% 121.54/120.51  -b000100000000000000000000000(bitIndex18).
% 121.54/120.51  -b000100000000000000000000000(bitIndex17).
% 121.54/120.51  -b000100000000000000000000000(bitIndex16).
% 121.54/120.51  -b000100000000000000000000000(bitIndex15).
% 121.54/120.51  -b000100000000000000000000000(bitIndex14).
% 121.54/120.51  -b000100000000000000000000000(bitIndex13).
% 121.54/120.51  -b000100000000000000000000000(bitIndex12).
% 121.54/120.51  -b000100000000000000000000000(bitIndex11).
% 121.54/120.51  -b000100000000000000000000000(bitIndex10).
% 121.54/120.51  -b000100000000000000000000000(bitIndex9).
% 121.54/120.51  -b000100000000000000000000000(bitIndex8).
% 121.54/120.51  -b000100000000000000000000000(bitIndex7).
% 121.54/120.51  -b000100000000000000000000000(bitIndex6).
% 121.54/120.51  -b000100000000000000000000000(bitIndex5).
% 121.54/120.51  -b000100000000000000000000000(bitIndex4).
% 121.54/120.51  -b000100000000000000000000000(bitIndex3).
% 121.54/120.51  -b000100000000000000000000000(bitIndex2).
% 121.54/120.51  -b000100000000000000000000000(bitIndex1).
% 121.54/120.51  -b000100000000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3596(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000010000000000000000000000(B))))).
% 121.54/120.51  -b000010000000000000000000000(bitIndex26).
% 121.54/120.51  -b000010000000000000000000000(bitIndex25).
% 121.54/120.51  -b000010000000000000000000000(bitIndex24).
% 121.54/120.51  -b000010000000000000000000000(bitIndex23).
% 121.54/120.51  b000010000000000000000000000(bitIndex22).
% 121.54/120.51  -b000010000000000000000000000(bitIndex21).
% 121.54/120.51  -b000010000000000000000000000(bitIndex20).
% 121.54/120.51  -b000010000000000000000000000(bitIndex19).
% 121.54/120.51  -b000010000000000000000000000(bitIndex18).
% 121.54/120.51  -b000010000000000000000000000(bitIndex17).
% 121.54/120.51  -b000010000000000000000000000(bitIndex16).
% 121.54/120.51  -b000010000000000000000000000(bitIndex15).
% 121.54/120.51  -b000010000000000000000000000(bitIndex14).
% 121.54/120.51  -b000010000000000000000000000(bitIndex13).
% 121.54/120.51  -b000010000000000000000000000(bitIndex12).
% 121.54/120.51  -b000010000000000000000000000(bitIndex11).
% 121.54/120.51  -b000010000000000000000000000(bitIndex10).
% 121.54/120.51  -b000010000000000000000000000(bitIndex9).
% 121.54/120.51  -b000010000000000000000000000(bitIndex8).
% 121.54/120.51  -b000010000000000000000000000(bitIndex7).
% 121.54/120.51  -b000010000000000000000000000(bitIndex6).
% 121.54/120.51  -b000010000000000000000000000(bitIndex5).
% 121.54/120.51  -b000010000000000000000000000(bitIndex4).
% 121.54/120.51  -b000010000000000000000000000(bitIndex3).
% 121.54/120.51  -b000010000000000000000000000(bitIndex2).
% 121.54/120.51  -b000010000000000000000000000(bitIndex1).
% 121.54/120.51  -b000010000000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3595(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000001000000000000000000000(B))))).
% 121.54/120.51  -b000001000000000000000000000(bitIndex26).
% 121.54/120.51  -b000001000000000000000000000(bitIndex25).
% 121.54/120.51  -b000001000000000000000000000(bitIndex24).
% 121.54/120.51  -b000001000000000000000000000(bitIndex23).
% 121.54/120.51  -b000001000000000000000000000(bitIndex22).
% 121.54/120.51  b000001000000000000000000000(bitIndex21).
% 121.54/120.51  -b000001000000000000000000000(bitIndex20).
% 121.54/120.51  -b000001000000000000000000000(bitIndex19).
% 121.54/120.51  -b000001000000000000000000000(bitIndex18).
% 121.54/120.51  -b000001000000000000000000000(bitIndex17).
% 121.54/120.51  -b000001000000000000000000000(bitIndex16).
% 121.54/120.51  -b000001000000000000000000000(bitIndex15).
% 121.54/120.51  -b000001000000000000000000000(bitIndex14).
% 121.54/120.51  -b000001000000000000000000000(bitIndex13).
% 121.54/120.51  -b000001000000000000000000000(bitIndex12).
% 121.54/120.51  -b000001000000000000000000000(bitIndex11).
% 121.54/120.51  -b000001000000000000000000000(bitIndex10).
% 121.54/120.51  -b000001000000000000000000000(bitIndex9).
% 121.54/120.51  -b000001000000000000000000000(bitIndex8).
% 121.54/120.51  -b000001000000000000000000000(bitIndex7).
% 121.54/120.51  -b000001000000000000000000000(bitIndex6).
% 121.54/120.51  -b000001000000000000000000000(bitIndex5).
% 121.54/120.51  -b000001000000000000000000000(bitIndex4).
% 121.54/120.51  -b000001000000000000000000000(bitIndex3).
% 121.54/120.51  -b000001000000000000000000000(bitIndex2).
% 121.54/120.51  -b000001000000000000000000000(bitIndex1).
% 121.54/120.51  -b000001000000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3594(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000100000000000000000000(B))))).
% 121.54/120.51  -b000000100000000000000000000(bitIndex26).
% 121.54/120.51  -b000000100000000000000000000(bitIndex25).
% 121.54/120.51  -b000000100000000000000000000(bitIndex24).
% 121.54/120.51  -b000000100000000000000000000(bitIndex23).
% 121.54/120.51  -b000000100000000000000000000(bitIndex22).
% 121.54/120.51  -b000000100000000000000000000(bitIndex21).
% 121.54/120.51  b000000100000000000000000000(bitIndex20).
% 121.54/120.51  -b000000100000000000000000000(bitIndex19).
% 121.54/120.51  -b000000100000000000000000000(bitIndex18).
% 121.54/120.51  -b000000100000000000000000000(bitIndex17).
% 121.54/120.51  -b000000100000000000000000000(bitIndex16).
% 121.54/120.51  -b000000100000000000000000000(bitIndex15).
% 121.54/120.51  -b000000100000000000000000000(bitIndex14).
% 121.54/120.51  -b000000100000000000000000000(bitIndex13).
% 121.54/120.51  -b000000100000000000000000000(bitIndex12).
% 121.54/120.51  -b000000100000000000000000000(bitIndex11).
% 121.54/120.51  -b000000100000000000000000000(bitIndex10).
% 121.54/120.51  -b000000100000000000000000000(bitIndex9).
% 121.54/120.51  -b000000100000000000000000000(bitIndex8).
% 121.54/120.51  -b000000100000000000000000000(bitIndex7).
% 121.54/120.51  -b000000100000000000000000000(bitIndex6).
% 121.54/120.51  -b000000100000000000000000000(bitIndex5).
% 121.54/120.51  -b000000100000000000000000000(bitIndex4).
% 121.54/120.51  -b000000100000000000000000000(bitIndex3).
% 121.54/120.51  -b000000100000000000000000000(bitIndex2).
% 121.54/120.51  -b000000100000000000000000000(bitIndex1).
% 121.54/120.51  -b000000100000000000000000000(bitIndex0).
% 121.54/120.51  all VarCurr (v3593(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000010000000000000000000(B))))).
% 121.54/120.51  -b000000010000000000000000000(bitIndex26).
% 121.54/120.51  -b000000010000000000000000000(bitIndex25).
% 121.54/120.51  -b000000010000000000000000000(bitIndex24).
% 121.54/120.51  -b000000010000000000000000000(bitIndex23).
% 121.54/120.51  -b000000010000000000000000000(bitIndex22).
% 121.54/120.51  -b000000010000000000000000000(bitIndex21).
% 121.54/120.51  -b000000010000000000000000000(bitIndex20).
% 121.54/120.51  b000000010000000000000000000(bitIndex19).
% 121.54/120.51  -b000000010000000000000000000(bitIndex18).
% 121.54/120.52  -b000000010000000000000000000(bitIndex17).
% 121.54/120.52  -b000000010000000000000000000(bitIndex16).
% 121.54/120.52  -b000000010000000000000000000(bitIndex15).
% 121.54/120.52  -b000000010000000000000000000(bitIndex14).
% 121.54/120.52  -b000000010000000000000000000(bitIndex13).
% 121.54/120.52  -b000000010000000000000000000(bitIndex12).
% 121.54/120.52  -b000000010000000000000000000(bitIndex11).
% 121.54/120.52  -b000000010000000000000000000(bitIndex10).
% 121.54/120.52  -b000000010000000000000000000(bitIndex9).
% 121.54/120.52  -b000000010000000000000000000(bitIndex8).
% 121.54/120.52  -b000000010000000000000000000(bitIndex7).
% 121.54/120.52  -b000000010000000000000000000(bitIndex6).
% 121.54/120.52  -b000000010000000000000000000(bitIndex5).
% 121.54/120.52  -b000000010000000000000000000(bitIndex4).
% 121.54/120.52  -b000000010000000000000000000(bitIndex3).
% 121.54/120.52  -b000000010000000000000000000(bitIndex2).
% 121.54/120.52  -b000000010000000000000000000(bitIndex1).
% 121.54/120.52  -b000000010000000000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3592(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000001000000000000000000(B))))).
% 121.54/120.52  -b000000001000000000000000000(bitIndex26).
% 121.54/120.52  -b000000001000000000000000000(bitIndex25).
% 121.54/120.52  -b000000001000000000000000000(bitIndex24).
% 121.54/120.52  -b000000001000000000000000000(bitIndex23).
% 121.54/120.52  -b000000001000000000000000000(bitIndex22).
% 121.54/120.52  -b000000001000000000000000000(bitIndex21).
% 121.54/120.52  -b000000001000000000000000000(bitIndex20).
% 121.54/120.52  -b000000001000000000000000000(bitIndex19).
% 121.54/120.52  b000000001000000000000000000(bitIndex18).
% 121.54/120.52  -b000000001000000000000000000(bitIndex17).
% 121.54/120.52  -b000000001000000000000000000(bitIndex16).
% 121.54/120.52  -b000000001000000000000000000(bitIndex15).
% 121.54/120.52  -b000000001000000000000000000(bitIndex14).
% 121.54/120.52  -b000000001000000000000000000(bitIndex13).
% 121.54/120.52  -b000000001000000000000000000(bitIndex12).
% 121.54/120.52  -b000000001000000000000000000(bitIndex11).
% 121.54/120.52  -b000000001000000000000000000(bitIndex10).
% 121.54/120.52  -b000000001000000000000000000(bitIndex9).
% 121.54/120.52  -b000000001000000000000000000(bitIndex8).
% 121.54/120.52  -b000000001000000000000000000(bitIndex7).
% 121.54/120.52  -b000000001000000000000000000(bitIndex6).
% 121.54/120.52  -b000000001000000000000000000(bitIndex5).
% 121.54/120.52  -b000000001000000000000000000(bitIndex4).
% 121.54/120.52  -b000000001000000000000000000(bitIndex3).
% 121.54/120.52  -b000000001000000000000000000(bitIndex2).
% 121.54/120.52  -b000000001000000000000000000(bitIndex1).
% 121.54/120.52  -b000000001000000000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3591(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000100000000000000000(B))))).
% 121.54/120.52  -b000000000100000000000000000(bitIndex26).
% 121.54/120.52  -b000000000100000000000000000(bitIndex25).
% 121.54/120.52  -b000000000100000000000000000(bitIndex24).
% 121.54/120.52  -b000000000100000000000000000(bitIndex23).
% 121.54/120.52  -b000000000100000000000000000(bitIndex22).
% 121.54/120.52  -b000000000100000000000000000(bitIndex21).
% 121.54/120.52  -b000000000100000000000000000(bitIndex20).
% 121.54/120.52  -b000000000100000000000000000(bitIndex19).
% 121.54/120.52  -b000000000100000000000000000(bitIndex18).
% 121.54/120.52  b000000000100000000000000000(bitIndex17).
% 121.54/120.52  -b000000000100000000000000000(bitIndex16).
% 121.54/120.52  -b000000000100000000000000000(bitIndex15).
% 121.54/120.52  -b000000000100000000000000000(bitIndex14).
% 121.54/120.52  -b000000000100000000000000000(bitIndex13).
% 121.54/120.52  -b000000000100000000000000000(bitIndex12).
% 121.54/120.52  -b000000000100000000000000000(bitIndex11).
% 121.54/120.52  -b000000000100000000000000000(bitIndex10).
% 121.54/120.52  -b000000000100000000000000000(bitIndex9).
% 121.54/120.52  -b000000000100000000000000000(bitIndex8).
% 121.54/120.52  -b000000000100000000000000000(bitIndex7).
% 121.54/120.52  -b000000000100000000000000000(bitIndex6).
% 121.54/120.52  -b000000000100000000000000000(bitIndex5).
% 121.54/120.52  -b000000000100000000000000000(bitIndex4).
% 121.54/120.52  -b000000000100000000000000000(bitIndex3).
% 121.54/120.52  -b000000000100000000000000000(bitIndex2).
% 121.54/120.52  -b000000000100000000000000000(bitIndex1).
% 121.54/120.52  -b000000000100000000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3590(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000010000000000000000(B))))).
% 121.54/120.52  -b000000000010000000000000000(bitIndex26).
% 121.54/120.52  -b000000000010000000000000000(bitIndex25).
% 121.54/120.52  -b000000000010000000000000000(bitIndex24).
% 121.54/120.52  -b000000000010000000000000000(bitIndex23).
% 121.54/120.52  -b000000000010000000000000000(bitIndex22).
% 121.54/120.52  -b000000000010000000000000000(bitIndex21).
% 121.54/120.52  -b000000000010000000000000000(bitIndex20).
% 121.54/120.52  -b000000000010000000000000000(bitIndex19).
% 121.54/120.52  -b000000000010000000000000000(bitIndex18).
% 121.54/120.52  -b000000000010000000000000000(bitIndex17).
% 121.54/120.52  b000000000010000000000000000(bitIndex16).
% 121.54/120.52  -b000000000010000000000000000(bitIndex15).
% 121.54/120.52  -b000000000010000000000000000(bitIndex14).
% 121.54/120.52  -b000000000010000000000000000(bitIndex13).
% 121.54/120.52  -b000000000010000000000000000(bitIndex12).
% 121.54/120.52  -b000000000010000000000000000(bitIndex11).
% 121.54/120.52  -b000000000010000000000000000(bitIndex10).
% 121.54/120.52  -b000000000010000000000000000(bitIndex9).
% 121.54/120.52  -b000000000010000000000000000(bitIndex8).
% 121.54/120.52  -b000000000010000000000000000(bitIndex7).
% 121.54/120.52  -b000000000010000000000000000(bitIndex6).
% 121.54/120.52  -b000000000010000000000000000(bitIndex5).
% 121.54/120.52  -b000000000010000000000000000(bitIndex4).
% 121.54/120.52  -b000000000010000000000000000(bitIndex3).
% 121.54/120.52  -b000000000010000000000000000(bitIndex2).
% 121.54/120.52  -b000000000010000000000000000(bitIndex1).
% 121.54/120.52  -b000000000010000000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3589(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000001000000000000000(B))))).
% 121.54/120.52  -b000000000001000000000000000(bitIndex26).
% 121.54/120.52  -b000000000001000000000000000(bitIndex25).
% 121.54/120.52  -b000000000001000000000000000(bitIndex24).
% 121.54/120.52  -b000000000001000000000000000(bitIndex23).
% 121.54/120.52  -b000000000001000000000000000(bitIndex22).
% 121.54/120.52  -b000000000001000000000000000(bitIndex21).
% 121.54/120.52  -b000000000001000000000000000(bitIndex20).
% 121.54/120.52  -b000000000001000000000000000(bitIndex19).
% 121.54/120.52  -b000000000001000000000000000(bitIndex18).
% 121.54/120.52  -b000000000001000000000000000(bitIndex17).
% 121.54/120.52  -b000000000001000000000000000(bitIndex16).
% 121.54/120.52  b000000000001000000000000000(bitIndex15).
% 121.54/120.52  -b000000000001000000000000000(bitIndex14).
% 121.54/120.52  -b000000000001000000000000000(bitIndex13).
% 121.54/120.52  -b000000000001000000000000000(bitIndex12).
% 121.54/120.52  -b000000000001000000000000000(bitIndex11).
% 121.54/120.52  -b000000000001000000000000000(bitIndex10).
% 121.54/120.52  -b000000000001000000000000000(bitIndex9).
% 121.54/120.52  -b000000000001000000000000000(bitIndex8).
% 121.54/120.52  -b000000000001000000000000000(bitIndex7).
% 121.54/120.52  -b000000000001000000000000000(bitIndex6).
% 121.54/120.52  -b000000000001000000000000000(bitIndex5).
% 121.54/120.52  -b000000000001000000000000000(bitIndex4).
% 121.54/120.52  -b000000000001000000000000000(bitIndex3).
% 121.54/120.52  -b000000000001000000000000000(bitIndex2).
% 121.54/120.52  -b000000000001000000000000000(bitIndex1).
% 121.54/120.52  -b000000000001000000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3588(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000000100000000000000(B))))).
% 121.54/120.52  -b000000000000100000000000000(bitIndex26).
% 121.54/120.52  -b000000000000100000000000000(bitIndex25).
% 121.54/120.52  -b000000000000100000000000000(bitIndex24).
% 121.54/120.52  -b000000000000100000000000000(bitIndex23).
% 121.54/120.52  -b000000000000100000000000000(bitIndex22).
% 121.54/120.52  -b000000000000100000000000000(bitIndex21).
% 121.54/120.52  -b000000000000100000000000000(bitIndex20).
% 121.54/120.52  -b000000000000100000000000000(bitIndex19).
% 121.54/120.52  -b000000000000100000000000000(bitIndex18).
% 121.54/120.52  -b000000000000100000000000000(bitIndex17).
% 121.54/120.52  -b000000000000100000000000000(bitIndex16).
% 121.54/120.52  -b000000000000100000000000000(bitIndex15).
% 121.54/120.52  b000000000000100000000000000(bitIndex14).
% 121.54/120.52  -b000000000000100000000000000(bitIndex13).
% 121.54/120.52  -b000000000000100000000000000(bitIndex12).
% 121.54/120.52  -b000000000000100000000000000(bitIndex11).
% 121.54/120.52  -b000000000000100000000000000(bitIndex10).
% 121.54/120.52  -b000000000000100000000000000(bitIndex9).
% 121.54/120.52  -b000000000000100000000000000(bitIndex8).
% 121.54/120.52  -b000000000000100000000000000(bitIndex7).
% 121.54/120.52  -b000000000000100000000000000(bitIndex6).
% 121.54/120.52  -b000000000000100000000000000(bitIndex5).
% 121.54/120.52  -b000000000000100000000000000(bitIndex4).
% 121.54/120.52  -b000000000000100000000000000(bitIndex3).
% 121.54/120.52  -b000000000000100000000000000(bitIndex2).
% 121.54/120.52  -b000000000000100000000000000(bitIndex1).
% 121.54/120.52  -b000000000000100000000000000(bitIndex0).
% 121.54/120.52  all VarCurr (v3587(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000000010000000000000(B))))).
% 121.54/120.52  -b000000000000010000000000000(bitIndex26).
% 121.54/120.52  -b000000000000010000000000000(bitIndex25).
% 121.54/120.52  -b000000000000010000000000000(bitIndex24).
% 121.54/120.52  -b000000000000010000000000000(bitIndex23).
% 121.54/120.52  -b000000000000010000000000000(bitIndex22).
% 121.54/120.52  -b000000000000010000000000000(bitIndex21).
% 121.54/120.52  -b000000000000010000000000000(bitIndex20).
% 121.54/120.52  -b000000000000010000000000000(bitIndex19).
% 121.54/120.52  -b000000000000010000000000000(bitIndex18).
% 121.54/120.52  -b000000000000010000000000000(bitIndex17).
% 121.54/120.52  -b000000000000010000000000000(bitIndex16).
% 121.54/120.52  -b000000000000010000000000000(bitIndex15).
% 121.54/120.52  -b000000000000010000000000000(bitIndex14).
% 121.54/120.52  b000000000000010000000000000(bitIndex13).
% 121.54/120.52  -b000000000000010000000000000(bitIndex12).
% 121.54/120.52  -b000000000000010000000000000(bitIndex11).
% 121.54/120.52  -b000000000000010000000000000(bitIndex10).
% 121.54/120.52  -b000000000000010000000000000(bitIndex9).
% 121.54/120.52  -b000000000000010000000000000(bitIndex8).
% 121.54/120.52  -b000000000000010000000000000(bitIndex7).
% 121.54/120.52  -b000000000000010000000000000(bitIndex6).
% 121.54/120.52  -b000000000000010000000000000(bitIndex5).
% 121.54/120.52  -b000000000000010000000000000(bitIndex4).
% 121.54/120.53  -b000000000000010000000000000(bitIndex3).
% 121.54/120.53  -b000000000000010000000000000(bitIndex2).
% 121.54/120.53  -b000000000000010000000000000(bitIndex1).
% 121.54/120.53  -b000000000000010000000000000(bitIndex0).
% 121.54/120.53  all VarCurr (v3586(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000000001000000000000(B))))).
% 121.54/120.53  -b000000000000001000000000000(bitIndex26).
% 121.54/120.53  -b000000000000001000000000000(bitIndex25).
% 121.54/120.53  -b000000000000001000000000000(bitIndex24).
% 121.54/120.53  -b000000000000001000000000000(bitIndex23).
% 121.54/120.53  -b000000000000001000000000000(bitIndex22).
% 121.54/120.53  -b000000000000001000000000000(bitIndex21).
% 121.54/120.53  -b000000000000001000000000000(bitIndex20).
% 121.54/120.53  -b000000000000001000000000000(bitIndex19).
% 121.54/120.53  -b000000000000001000000000000(bitIndex18).
% 121.54/120.53  -b000000000000001000000000000(bitIndex17).
% 121.54/120.53  -b000000000000001000000000000(bitIndex16).
% 121.54/120.53  -b000000000000001000000000000(bitIndex15).
% 121.54/120.53  -b000000000000001000000000000(bitIndex14).
% 121.54/120.53  -b000000000000001000000000000(bitIndex13).
% 121.54/120.53  b000000000000001000000000000(bitIndex12).
% 121.54/120.53  -b000000000000001000000000000(bitIndex11).
% 121.54/120.53  -b000000000000001000000000000(bitIndex10).
% 121.54/120.53  -b000000000000001000000000000(bitIndex9).
% 121.54/120.53  -b000000000000001000000000000(bitIndex8).
% 121.54/120.53  -b000000000000001000000000000(bitIndex7).
% 121.54/120.53  -b000000000000001000000000000(bitIndex6).
% 121.54/120.53  -b000000000000001000000000000(bitIndex5).
% 121.54/120.53  -b000000000000001000000000000(bitIndex4).
% 121.54/120.53  -b000000000000001000000000000(bitIndex3).
% 121.54/120.53  -b000000000000001000000000000(bitIndex2).
% 121.54/120.53  -b000000000000001000000000000(bitIndex1).
% 121.54/120.53  -b000000000000001000000000000(bitIndex0).
% 121.54/120.53  all VarCurr (v3585(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000000000100000000000(B))))).
% 121.54/120.53  -b000000000000000100000000000(bitIndex26).
% 121.54/120.53  -b000000000000000100000000000(bitIndex25).
% 121.54/120.53  -b000000000000000100000000000(bitIndex24).
% 121.54/120.53  -b000000000000000100000000000(bitIndex23).
% 121.54/120.53  -b000000000000000100000000000(bitIndex22).
% 121.54/120.53  -b000000000000000100000000000(bitIndex21).
% 121.54/120.53  -b000000000000000100000000000(bitIndex20).
% 121.54/120.53  -b000000000000000100000000000(bitIndex19).
% 121.54/120.53  -b000000000000000100000000000(bitIndex18).
% 121.54/120.53  -b000000000000000100000000000(bitIndex17).
% 121.54/120.53  -b000000000000000100000000000(bitIndex16).
% 121.54/120.53  -b000000000000000100000000000(bitIndex15).
% 121.54/120.53  -b000000000000000100000000000(bitIndex14).
% 121.54/120.53  -b000000000000000100000000000(bitIndex13).
% 121.54/120.53  -b000000000000000100000000000(bitIndex12).
% 121.54/120.53  b000000000000000100000000000(bitIndex11).
% 121.54/120.53  -b000000000000000100000000000(bitIndex10).
% 121.54/120.53  -b000000000000000100000000000(bitIndex9).
% 121.54/120.53  -b000000000000000100000000000(bitIndex8).
% 121.54/120.53  -b000000000000000100000000000(bitIndex7).
% 121.54/120.53  -b000000000000000100000000000(bitIndex6).
% 121.54/120.53  -b000000000000000100000000000(bitIndex5).
% 121.54/120.53  -b000000000000000100000000000(bitIndex4).
% 121.54/120.53  -b000000000000000100000000000(bitIndex3).
% 121.54/120.53  -b000000000000000100000000000(bitIndex2).
% 121.54/120.53  -b000000000000000100000000000(bitIndex1).
% 121.54/120.53  -b000000000000000100000000000(bitIndex0).
% 121.54/120.53  all VarCurr (v3584(VarCurr)-> (all B (range_26_0(B)-> (v3562(VarCurr,B)<->b000000000000000010000000000(B))))).
% 121.54/120.53  all B (range_26_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B).
% 121.54/120.53  -b000000000000000010000000000(bitIndex26).
% 121.54/120.53  -b000000000000000010000000000(bitIndex25).
% 121.54/120.53  -b000000000000000010000000000(bitIndex24).
% 121.54/120.53  -b000000000000000010000000000(bitIndex23).
% 121.54/120.53  -b000000000000000010000000000(bitIndex22).
% 121.54/120.53  -b000000000000000010000000000(bitIndex21).
% 121.54/120.53  -b000000000000000010000000000(bitIndex20).
% 121.54/120.53  -b000000000000000010000000000(bitIndex19).
% 121.54/120.53  -b000000000000000010000000000(bitIndex18).
% 121.54/120.53  -b000000000000000010000000000(bitIndex17).
% 121.54/120.53  -b000000000000000010000000000(bitIndex16).
% 121.54/120.53  -b000000000000000010000000000(bitIndex15).
% 121.54/120.53  -b000000000000000010000000000(bitIndex14).
% 121.54/120.53  -b000000000000000010000000000(bitIndex13).
% 121.54/120.53  -b000000000000000010000000000(bitIndex12).
% 121.54/120.53  -b000000000000000010000000000(bitIndex11).
% 121.54/120.53  b000000000000000010000000000(bitIndex10).
% 121.54/120.53  -b000000000000000010000000000(bitIndex9).
% 121.54/120.53  -b000000000000000010000000000(bitIndex8).
% 121.54/120.53  -b000000000000000010000000000(bitIndex7).
% 121.54/120.53  -b000000000000000010000000000(bitIndex6).
% 121.54/120.53  -b000000000000000010000000000(bitIndex5).
% 121.54/120.53  -b000000000000000010000000000(bitIndex4).
% 121.54/120.53  -b000000000000000010000000000(bitIndex3).
% 121.54/120.53  -b000000000000000010000000000(bitIndex2).
% 121.54/120.53  -b000000000000000010000000000(bitIndex1).
% 121.54/120.53  -b000000000000000010000000000(bitIndex0).
% 121.54/120.53  all VarCurr (v3599(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  b1111(bitIndex3).
% 121.54/120.53  b1111(bitIndex2).
% 121.54/120.53  b1111(bitIndex1).
% 121.54/120.53  b1111(bitIndex0).
% 121.54/120.53  all VarCurr (v3598(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  b1110(bitIndex3).
% 121.54/120.53  b1110(bitIndex2).
% 121.54/120.53  b1110(bitIndex1).
% 121.54/120.53  -b1110(bitIndex0).
% 121.54/120.53  all VarCurr (v3597(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3596(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3595(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3594(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3593(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3592(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$T)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3591(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3590(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3589(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3588(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$T)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3587(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3586(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$T)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all VarCurr (v3585(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$T)).
% 121.54/120.53  all VarCurr (v3584(VarCurr)<-> (v3564(VarCurr,bitIndex3)<->$F)& (v3564(VarCurr,bitIndex2)<->$F)& (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$F)).
% 121.54/120.53  all B (range_3_0(B)-> (v3564(constB0,B)<->$F)).
% 121.54/120.53  all VarCurr (v3542(VarCurr)<->v3544(VarCurr)).
% 121.54/120.53  all VarCurr (v3544(VarCurr)<->v183(VarCurr)).
% 121.54/120.53  all VarCurr (v3391(VarCurr,bitIndex1)<->v3393(VarCurr,bitIndex1)).
% 121.54/120.53  all VarCurr (v3393(VarCurr,bitIndex1)<->v3395(VarCurr,bitIndex1)).
% 121.54/120.53  all VarCurr (v3395(VarCurr,bitIndex1)<->v3397(VarCurr,bitIndex1)).
% 121.54/120.53  all VarCurr (v3397(VarCurr,bitIndex1)<->v3536(VarCurr,bitIndex1)).
% 121.54/120.53  all VarCurr (v3536(VarCurr,bitIndex0)<->v3540(VarCurr)).
% 121.54/120.53  all VarCurr (v3536(VarCurr,bitIndex1)<->v3537(VarCurr)).
% 121.54/120.53  all VarCurr (-v3540(VarCurr)<->v3399(VarCurr,bitIndex0)).
% 121.54/120.53  all VarCurr (-v3537(VarCurr)<->v3538(VarCurr)).
% 121.54/120.53  all VarCurr (v3538(VarCurr)<->v3399(VarCurr,bitIndex1)&v3539(VarCurr)).
% 121.54/120.53  all VarCurr (-v3539(VarCurr)<->v3535(VarCurr)).
% 121.54/120.53  all VarCurr (v3535(VarCurr)<->v183(VarCurr)).
% 121.54/120.53  all VarCurr (-v3478(VarCurr)& -v3490(VarCurr)& -v3499(VarCurr)& -v3507(VarCurr)& -v3514(VarCurr)& -v3520(VarCurr)& -v3525(VarCurr)& -v3529(VarCurr)& -v3532(VarCurr)& -v3533(VarCurr)-> (v3399(VarCurr,bitIndex1)<->$F)).
% 121.54/120.53  all VarCurr (v3533(VarCurr)-> (v3399(VarCurr,bitIndex1)<->$T)).
% 121.54/120.53  all VarCurr (v3532(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3474(VarCurr,bitIndex29))).
% 121.54/120.54  all VarCurr (v3529(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3530(VarCurr))).
% 121.54/120.54  all VarCurr (v3525(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3526(VarCurr))).
% 121.54/120.54  all VarCurr (v3520(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3521(VarCurr))).
% 121.54/120.54  all VarCurr (v3514(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3515(VarCurr))).
% 121.54/120.54  all VarCurr (v3507(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3508(VarCurr))).
% 121.54/120.54  all VarCurr (v3499(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3500(VarCurr))).
% 121.54/120.54  all VarCurr (v3490(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3491(VarCurr))).
% 121.54/120.54  all VarCurr (v3478(VarCurr)-> (v3399(VarCurr,bitIndex1)<->v3479(VarCurr))).
% 121.54/120.54  all VarCurr (v3533(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$T)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$T)).
% 121.54/120.54  all VarCurr (v3532(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$T)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$F)).
% 121.54/120.54  all VarCurr (v3530(VarCurr)<->v3474(VarCurr,bitIndex28)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3529(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$T)& (v3401(VarCurr,bitIndex1)<->$T)& (v3401(VarCurr,bitIndex0)<->$T)).
% 121.54/120.54  -b0111(bitIndex3).
% 121.54/120.54  b0111(bitIndex2).
% 121.54/120.54  b0111(bitIndex1).
% 121.54/120.54  b0111(bitIndex0).
% 121.54/120.54  all VarCurr (v3526(VarCurr)<->v3528(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3528(VarCurr)<->v3474(VarCurr,bitIndex27)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3525(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$T)& (v3401(VarCurr,bitIndex1)<->$T)& (v3401(VarCurr,bitIndex0)<->$F)).
% 121.54/120.54  -b0110(bitIndex3).
% 121.54/120.54  b0110(bitIndex2).
% 121.54/120.54  b0110(bitIndex1).
% 121.54/120.54  -b0110(bitIndex0).
% 121.54/120.54  all VarCurr (v3521(VarCurr)<->v3523(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3523(VarCurr)<->v3524(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3524(VarCurr)<->v3474(VarCurr,bitIndex26)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3520(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$T)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$T)).
% 121.54/120.54  all VarCurr (v3515(VarCurr)<->v3517(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3517(VarCurr)<->v3518(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3518(VarCurr)<->v3519(VarCurr)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3519(VarCurr)<->v3474(VarCurr,bitIndex25)&v3474(VarCurr,bitIndex26)).
% 121.54/120.54  all VarCurr (v3514(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$T)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$F)).
% 121.54/120.54  all VarCurr (v3508(VarCurr)<->v3510(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3510(VarCurr)<->v3511(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3511(VarCurr)<->v3512(VarCurr)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3512(VarCurr)<->v3513(VarCurr)&v3474(VarCurr,bitIndex26)).
% 121.54/120.54  all VarCurr (v3513(VarCurr)<->v3474(VarCurr,bitIndex24)&v3474(VarCurr,bitIndex25)).
% 121.54/120.54  all VarCurr (v3507(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$T)& (v3401(VarCurr,bitIndex0)<->$T)).
% 121.54/120.54  all VarCurr (v3500(VarCurr)<->v3502(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3502(VarCurr)<->v3503(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3503(VarCurr)<->v3504(VarCurr)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3504(VarCurr)<->v3505(VarCurr)&v3474(VarCurr,bitIndex26)).
% 121.54/120.54  all VarCurr (v3505(VarCurr)<->v3506(VarCurr)&v3474(VarCurr,bitIndex25)).
% 121.54/120.54  all VarCurr (v3506(VarCurr)<->v3474(VarCurr,bitIndex23)&v3474(VarCurr,bitIndex24)).
% 121.54/120.54  all VarCurr (v3499(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$T)& (v3401(VarCurr,bitIndex0)<->$F)).
% 121.54/120.54  all VarCurr (v3491(VarCurr)<->v3493(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3493(VarCurr)<->v3494(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3494(VarCurr)<->v3495(VarCurr)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3495(VarCurr)<->v3496(VarCurr)&v3474(VarCurr,bitIndex26)).
% 121.54/120.54  all VarCurr (v3496(VarCurr)<->v3497(VarCurr)&v3474(VarCurr,bitIndex25)).
% 121.54/120.54  all VarCurr (v3497(VarCurr)<->v3498(VarCurr)&v3474(VarCurr,bitIndex24)).
% 121.54/120.54  all VarCurr (v3498(VarCurr)<->v3474(VarCurr,bitIndex22)&v3474(VarCurr,bitIndex23)).
% 121.54/120.54  all VarCurr (v3490(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$T)).
% 121.54/120.54  all VarCurr (v3479(VarCurr)<->v3481(VarCurr)&v3474(VarCurr,bitIndex29)).
% 121.54/120.54  all VarCurr (v3481(VarCurr)<->v3482(VarCurr)&v3474(VarCurr,bitIndex28)).
% 121.54/120.54  all VarCurr (v3482(VarCurr)<->v3483(VarCurr)&v3474(VarCurr,bitIndex27)).
% 121.54/120.54  all VarCurr (v3483(VarCurr)<->v3484(VarCurr)&v3474(VarCurr,bitIndex26)).
% 121.54/120.54  all VarCurr (v3484(VarCurr)<->v3485(VarCurr)&v3474(VarCurr,bitIndex25)).
% 121.54/120.54  all VarCurr (v3485(VarCurr)<->v3486(VarCurr)&v3474(VarCurr,bitIndex24)).
% 121.54/120.54  all VarCurr (v3486(VarCurr)<->v3487(VarCurr)&v3474(VarCurr,bitIndex23)).
% 121.54/120.54  all VarCurr (v3487(VarCurr)<->v3474(VarCurr,bitIndex21)&v3474(VarCurr,bitIndex22)).
% 121.54/120.54  -v3474(constB0,bitIndex37).
% 121.54/120.54  -v3474(constB0,bitIndex36).
% 121.54/120.54  -v3474(constB0,bitIndex35).
% 121.54/120.54  -v3474(constB0,bitIndex34).
% 121.54/120.54  -v3474(constB0,bitIndex33).
% 121.54/120.54  -v3474(constB0,bitIndex32).
% 121.54/120.54  -v3474(constB0,bitIndex31).
% 121.54/120.54  -v3474(constB0,bitIndex30).
% 121.54/120.54  -v3474(constB0,bitIndex29).
% 121.54/120.54  -v3474(constB0,bitIndex28).
% 121.54/120.54  -v3474(constB0,bitIndex27).
% 121.54/120.54  -v3474(constB0,bitIndex26).
% 121.54/120.54  -v3474(constB0,bitIndex25).
% 121.54/120.54  -v3474(constB0,bitIndex24).
% 121.54/120.54  -v3474(constB0,bitIndex23).
% 121.54/120.54  -v3474(constB0,bitIndex22).
% 121.54/120.54  -v3474(constB0,bitIndex21).
% 121.54/120.54  -v3474(constB0,bitIndex20).
% 121.54/120.54  -v3474(constB0,bitIndex19).
% 121.54/120.54  -v3474(constB0,bitIndex18).
% 121.54/120.54  -v3474(constB0,bitIndex17).
% 121.54/120.54  -v3474(constB0,bitIndex16).
% 121.54/120.54  -v3474(constB0,bitIndex15).
% 121.54/120.54  -v3474(constB0,bitIndex14).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex37).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex36).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex35).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex34).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex33).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex32).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex31).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex30).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex29).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex28).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex27).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex26).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex25).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex24).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex23).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex22).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex21).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex20).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex19).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex18).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex17).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex16).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex15).
% 121.54/120.54  -bxxxxxxxxxxxxxxxxxxxxxxxx000000000000000000000000xxxxxxxxxxxxxx(bitIndex14).
% 121.54/120.54  all VarCurr (v3478(VarCurr)<-> (v3401(VarCurr,bitIndex3)<->$F)& (v3401(VarCurr,bitIndex2)<->$F)& (v3401(VarCurr,bitIndex1)<->$F)& (v3401(VarCurr,bitIndex0)<->$F)).
% 121.54/120.54  all VarCurr (-v3403(VarCurr)-> (all B (range_3_0(B)-> (v3401(VarCurr,B)<->v3427(VarCurr,B))))).
% 121.54/120.54  all VarCurr (v3403(VarCurr)-> (all B (range_3_0(B)-> (v3401(VarCurr,B)<->v3453(VarCurr,B))))).
% 121.54/120.54  all VarCurr (v3453(VarCurr,bitIndex0)<->v3471(VarCurr)).
% 121.54/120.54  all VarCurr (v3453(VarCurr,bitIndex1)<->v3469(VarCurr)).
% 121.54/120.54  all VarCurr (v3453(VarCurr,bitIndex2)<->v3464(VarCurr)).
% 121.54/120.54  all VarCurr (v3453(VarCurr,bitIndex3)<->v3455(VarCurr)).
% 121.54/120.54  all VarCurr (v3469(VarCurr)<->v3470(VarCurr)&v3472(VarCurr)).
% 121.54/120.54  all VarCurr (v3472(VarCurr)<->v3427(VarCurr,bitIndex0)|v3461(VarCurr)).
% 121.54/120.54  all VarCurr (v3470(VarCurr)<->v3471(VarCurr)|v3427(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (-v3471(VarCurr)<->v3427(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3464(VarCurr)<->v3465(VarCurr)&v3468(VarCurr)).
% 121.54/120.54  all VarCurr (v3468(VarCurr)<->v3459(VarCurr)|v3427(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3465(VarCurr)<->v3466(VarCurr)|v3467(VarCurr)).
% 121.54/120.54  all VarCurr (-v3467(VarCurr)<->v3427(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (-v3466(VarCurr)<->v3459(VarCurr)).
% 121.54/120.54  all VarCurr (v3455(VarCurr)<->v3456(VarCurr)&v3463(VarCurr)).
% 121.54/120.54  all VarCurr (v3463(VarCurr)<->v3458(VarCurr)|v3427(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3456(VarCurr)<->v3457(VarCurr)|v3462(VarCurr)).
% 121.54/120.54  all VarCurr (-v3462(VarCurr)<->v3427(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (-v3457(VarCurr)<->v3458(VarCurr)).
% 121.54/120.54  all VarCurr (v3458(VarCurr)<->v3459(VarCurr)&v3427(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3459(VarCurr)<->v3427(VarCurr,bitIndex1)|v3460(VarCurr)).
% 121.54/120.54  all VarCurr (v3460(VarCurr)<->v3427(VarCurr,bitIndex0)&v3461(VarCurr)).
% 121.54/120.54  all VarCurr (-v3461(VarCurr)<->v3427(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3427(VarCurr,bitIndex3)<->v3429(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3429(VarCurr,bitIndex3)<->v3431(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3431(VarCurr,bitIndex3)<->v3433(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3433(VarCurr,bitIndex3)<->v3435(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3435(VarCurr,bitIndex3)<->v3437(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3437(VarCurr,bitIndex3)<->v3439(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3439(VarCurr,bitIndex3)<->v3441(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3441(VarCurr,bitIndex3)<->v3443(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3443(VarCurr,bitIndex3)<->v3421(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3421(VarCurr,bitIndex3)<->v3423(VarCurr,bitIndex3)).
% 121.54/120.54  all VarCurr (v3423(VarCurr,bitIndex3)<->v3451(VarCurr)).
% 121.54/120.54  all VarCurr (v3427(VarCurr,bitIndex2)<->v3429(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3429(VarCurr,bitIndex2)<->v3431(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3431(VarCurr,bitIndex2)<->v3433(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3433(VarCurr,bitIndex2)<->v3435(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3435(VarCurr,bitIndex2)<->v3437(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3437(VarCurr,bitIndex2)<->v3439(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3439(VarCurr,bitIndex2)<->v3441(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3441(VarCurr,bitIndex2)<->v3443(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3443(VarCurr,bitIndex2)<->v3421(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3421(VarCurr,bitIndex2)<->v3423(VarCurr,bitIndex2)).
% 121.54/120.54  all VarCurr (v3423(VarCurr,bitIndex2)<->v3449(VarCurr)).
% 121.54/120.54  all VarCurr (v3427(VarCurr,bitIndex1)<->v3429(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3429(VarCurr,bitIndex1)<->v3431(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3431(VarCurr,bitIndex1)<->v3433(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3433(VarCurr,bitIndex1)<->v3435(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3435(VarCurr,bitIndex1)<->v3437(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3437(VarCurr,bitIndex1)<->v3439(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3439(VarCurr,bitIndex1)<->v3441(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3441(VarCurr,bitIndex1)<->v3443(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3443(VarCurr,bitIndex1)<->v3421(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3421(VarCurr,bitIndex1)<->v3423(VarCurr,bitIndex1)).
% 121.54/120.54  all VarCurr (v3423(VarCurr,bitIndex1)<->v3447(VarCurr)).
% 121.54/120.54  all VarCurr (v3427(VarCurr,bitIndex0)<->v3429(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3429(VarCurr,bitIndex0)<->v3431(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3431(VarCurr,bitIndex0)<->v3433(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3433(VarCurr,bitIndex0)<->v3435(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3435(VarCurr,bitIndex0)<->v3437(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3437(VarCurr,bitIndex0)<->v3439(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3439(VarCurr,bitIndex0)<->v3441(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3441(VarCurr,bitIndex0)<->v3443(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3443(VarCurr,bitIndex0)<->v3421(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3421(VarCurr,bitIndex0)<->v3423(VarCurr,bitIndex0)).
% 121.54/120.54  all VarCurr (v3423(VarCurr,bitIndex0)<->v3445(VarCurr)).
% 121.54/120.54  all VarCurr (v3403(VarCurr)<->v3405(VarCurr)).
% 121.54/120.54  all VarCurr (v3405(VarCurr)<->v3407(VarCurr)).
% 121.54/120.54  all VarCurr (v3407(VarCurr)<->v3409(VarCurr)).
% 121.54/120.55  all VarCurr (v3409(VarCurr)<->v3411(VarCurr)).
% 121.54/120.55  all VarCurr (v3411(VarCurr)<->v3413(VarCurr)).
% 121.54/120.55  all VarCurr (v3413(VarCurr)<->v3415(VarCurr)).
% 121.54/120.55  all VarCurr (v3415(VarCurr)<->v3417(VarCurr)).
% 121.54/120.55  all VarCurr (v3417(VarCurr)<->v3419(VarCurr)).
% 121.54/120.55  all VarCurr (v3419(VarCurr)<->v3421(VarCurr,bitIndex8)).
% 121.54/120.55  all VarCurr (v3421(VarCurr,bitIndex8)<->v3423(VarCurr,bitIndex8)).
% 121.54/120.55  all VarCurr (v3423(VarCurr,bitIndex8)<->v3425(VarCurr)).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3363(VarCurr,B)<->v3365(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3365(VarCurr,B)<->v3367(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3367(VarCurr,B)<->v3369(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3369(VarCurr,B)<->v3371(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3371(VarCurr,B)<->v3373(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3373(VarCurr,B)<->v3375(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3375(VarCurr,B)<->v3377(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3377(VarCurr,B)<->v3379(VarCurr,B))).
% 121.54/120.55  all VarCurr B (range_1_0(B)-> (v3379(VarCurr,B)<->v3381(VarCurr,B))).
% 121.54/120.55  all VarCurr ((v3381(VarCurr,bitIndex1)<->v193(VarCurr,bitIndex9))& (v3381(VarCurr,bitIndex0)<->v193(VarCurr,bitIndex8))).
% 121.54/120.55  all VarCurr B (range_9_8(B)-> (v193(VarCurr,B)<->v195(VarCurr,B))).
% 121.54/120.55  all B (range_9_8(B)<->bitIndex8=B|bitIndex9=B).
% 121.54/120.55  all VarCurr (v195(VarCurr,bitIndex9)<->v3385(VarCurr)).
% 121.54/120.55  all VarCurr (v195(VarCurr,bitIndex8)<->v3383(VarCurr)).
% 121.54/120.55  all VarCurr (v2597(VarCurr)<->v2599(VarCurr)).
% 121.54/120.55  all VarCurr (v2599(VarCurr)<->v2601(VarCurr)).
% 121.54/120.55  all VarCurr (v2601(VarCurr)<->v2603(VarCurr)).
% 121.54/120.55  all VarCurr (v2603(VarCurr)<->v2605(VarCurr)).
% 121.54/120.55  all VarCurr (-v3346(VarCurr)-> (v2605(VarCurr)<->v3347(VarCurr))).
% 121.54/120.55  all VarCurr (v3346(VarCurr)-> (v2605(VarCurr)<->$F)).
% 121.54/120.55  all VarCurr (-v3348(VarCurr)& -v3349(VarCurr)& -v3352(VarCurr)& -v3353(VarCurr)& -v3354(VarCurr)-> (v3347(VarCurr)<->v3239(VarCurr,bitIndex1))).
% 121.54/120.55  all VarCurr (v3354(VarCurr)-> (v3347(VarCurr)<->v3239(VarCurr,bitIndex2))).
% 121.54/120.55  all VarCurr (v3353(VarCurr)-> (v3347(VarCurr)<->v3239(VarCurr,bitIndex3))).
% 121.54/120.55  all VarCurr (v3352(VarCurr)-> (v3347(VarCurr)<->v3239(VarCurr,bitIndex2))).
% 121.54/120.55  all VarCurr (v3349(VarCurr)-> (v3347(VarCurr)<->v3239(VarCurr,bitIndex3))).
% 121.54/120.55  all VarCurr (v3348(VarCurr)-> (v3347(VarCurr)<->$F)).
% 121.54/120.55  all VarCurr (v3357(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$T)& (v2758(VarCurr,bitIndex1)<->$T)& (v2758(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3354(VarCurr)<->v3355(VarCurr)|v3356(VarCurr)).
% 121.54/120.55  all VarCurr (v3356(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$T)& (v2758(VarCurr,bitIndex1)<->$T)& (v2758(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  b110(bitIndex2).
% 121.54/120.55  b110(bitIndex1).
% 121.54/120.55  -b110(bitIndex0).
% 121.54/120.55  all VarCurr (v3355(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$T)& (v2758(VarCurr,bitIndex1)<->$F)& (v2758(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  b101(bitIndex2).
% 121.54/120.55  -b101(bitIndex1).
% 121.54/120.55  b101(bitIndex0).
% 121.54/120.55  all VarCurr (v3353(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$T)& (v2758(VarCurr,bitIndex1)<->$F)& (v2758(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  all VarCurr (v3352(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$F)& (v2758(VarCurr,bitIndex1)<->$T)& (v2758(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3349(VarCurr)<->v3350(VarCurr)|v3351(VarCurr)).
% 121.54/120.55  all VarCurr (v3351(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$F)& (v2758(VarCurr,bitIndex1)<->$T)& (v2758(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  all VarCurr (v3350(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$F)& (v2758(VarCurr,bitIndex1)<->$F)& (v2758(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3348(VarCurr)<-> (v2758(VarCurr,bitIndex2)<->$F)& (v2758(VarCurr,bitIndex1)<->$F)& (v2758(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  all VarCurr (-v3346(VarCurr)<->v2607(VarCurr)).
% 121.54/120.55  all VarCurr (v3239(VarCurr,bitIndex2)<->v3241(VarCurr,bitIndex2)).
% 121.54/120.55  all VarCurr (v3241(VarCurr,bitIndex2)<->v3243(VarCurr,bitIndex2)).
% 121.54/120.55  all VarCurr (v3239(VarCurr,bitIndex3)<->v3241(VarCurr,bitIndex3)).
% 121.54/120.55  all VarCurr (v3241(VarCurr,bitIndex3)<->v3243(VarCurr,bitIndex3)).
% 121.54/120.55  all VarNext (v3243(VarNext,bitIndex3)<->v3336(VarNext,bitIndex3)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3337(VarNext)-> (all B (range_3_0(B)-> (v3336(VarNext,B)<->v3243(VarCurr,B)))))).
% 121.54/120.55  all VarNext (v3337(VarNext)-> (all B (range_3_0(B)-> (v3336(VarNext,B)<->v3293(VarNext,B))))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3337(VarNext)<->v3338(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3338(VarNext)<->v3340(VarNext)&v3280(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3340(VarNext)<->v3287(VarNext))).
% 121.54/120.55  all VarCurr (v3247(VarCurr,bitIndex3)<->v3268(VarCurr,bitIndex3)).
% 121.54/120.55  all VarCurr (v3257(VarCurr)<->v3259(VarCurr)).
% 121.54/120.55  all VarCurr (v3259(VarCurr)<->v3261(VarCurr)).
% 121.54/120.55  all VarCurr (v3261(VarCurr)<->v3263(VarCurr)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3321(VarNext)-> (v3263(VarNext)<->v3263(VarCurr)))).
% 121.54/120.55  all VarNext (v3321(VarNext)-> (v3263(VarNext)<->v3331(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3331(VarNext)<->v3329(VarCurr))).
% 121.54/120.55  all VarCurr (-v3332(VarCurr)-> (v3329(VarCurr)<->v3267(VarCurr))).
% 121.54/120.55  all VarCurr (v3332(VarCurr)-> (v3329(VarCurr)<->$F)).
% 121.54/120.55  all VarCurr (-v3332(VarCurr)<->v3265(VarCurr)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3321(VarNext)<->v3322(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3322(VarNext)<->v3323(VarNext)&v3318(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3323(VarNext)<->v3325(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3325(VarNext)<->v3318(VarCurr))).
% 121.54/120.55  all VarCurr (v3318(VarCurr)<->v2549(VarCurr)).
% 121.54/120.55  all VarCurr (-v3306(VarCurr)& -v3310(VarCurr)-> (v3267(VarCurr)<->v3316(VarCurr))).
% 121.54/120.55  all VarCurr (v3310(VarCurr)-> (v3267(VarCurr)<->v3311(VarCurr))).
% 121.54/120.55  all VarCurr (v3306(VarCurr)-> (v3267(VarCurr)<->v3308(VarCurr))).
% 121.54/120.55  all VarCurr (-v3316(VarCurr)<->v2607(VarCurr)).
% 121.54/120.55  all VarCurr (v3313(VarCurr)<->v3314(VarCurr)|v3315(VarCurr)).
% 121.54/120.55  all VarCurr (v3315(VarCurr)<-> (v3307(VarCurr,bitIndex1)<->$T)& (v3307(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3314(VarCurr)<-> (v3307(VarCurr,bitIndex1)<->$T)& (v3307(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  all VarCurr (v3311(VarCurr)<->v3239(VarCurr,bitIndex1)&v3312(VarCurr)).
% 121.54/120.55  all VarCurr (-v3312(VarCurr)<->v2607(VarCurr)).
% 121.54/120.55  all VarCurr (v3310(VarCurr)<-> (v3307(VarCurr,bitIndex1)<->$F)& (v3307(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3308(VarCurr)<->v3239(VarCurr,bitIndex0)&v3309(VarCurr)).
% 121.54/120.55  all VarCurr (-v3309(VarCurr)<->v2607(VarCurr)).
% 121.54/120.55  all VarCurr (v3306(VarCurr)<-> (v3307(VarCurr,bitIndex1)<->$F)& (v3307(VarCurr,bitIndex0)<->$F)).
% 121.54/120.55  all VarCurr (v3307(VarCurr,bitIndex0)<->v3263(VarCurr)).
% 121.54/120.55  all VarCurr (v3307(VarCurr,bitIndex1)<->v3255(VarCurr)).
% 121.54/120.55  v3263(constB0)<->$F.
% 121.54/120.55  all VarCurr (v3239(VarCurr,bitIndex1)<->v3241(VarCurr,bitIndex1)).
% 121.54/120.55  all VarCurr (v3241(VarCurr,bitIndex1)<->v3243(VarCurr,bitIndex1)).
% 121.54/120.55  all VarNext (v3243(VarNext,bitIndex1)<->v3298(VarNext,bitIndex1)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3299(VarNext)-> (all B (range_3_0(B)-> (v3298(VarNext,B)<->v3243(VarCurr,B)))))).
% 121.54/120.55  all VarNext (v3299(VarNext)-> (all B (range_3_0(B)-> (v3298(VarNext,B)<->v3293(VarNext,B))))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3299(VarNext)<->v3300(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3300(VarNext)<->v3302(VarNext)&v3280(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3302(VarNext)<->v3287(VarNext))).
% 121.54/120.55  all VarCurr (v3247(VarCurr,bitIndex1)<->v3268(VarCurr,bitIndex1)).
% 121.54/120.55  all VarNext (v3243(VarNext,bitIndex2)<->v3282(VarNext,bitIndex2)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3283(VarNext)-> (all B (range_3_0(B)-> (v3282(VarNext,B)<->v3243(VarCurr,B)))))).
% 121.54/120.55  all VarNext (v3283(VarNext)-> (all B (range_3_0(B)-> (v3282(VarNext,B)<->v3293(VarNext,B))))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v3293(VarNext,B)<->v3291(VarCurr,B))))).
% 121.54/120.55  all VarCurr (-v3294(VarCurr)-> (all B (range_3_0(B)-> (v3291(VarCurr,B)<->v3247(VarCurr,B))))).
% 121.54/120.55  all VarCurr (v3294(VarCurr)-> (all B (range_3_0(B)-> (v3291(VarCurr,B)<->$F)))).
% 121.54/120.55  all VarCurr (-v3294(VarCurr)<->v3245(VarCurr)).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3283(VarNext)<->v3284(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3284(VarNext)<->v3285(VarNext)&v3280(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3285(VarNext)<->v3287(VarNext))).
% 121.54/120.55  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3287(VarNext)<->v3280(VarCurr))).
% 121.54/120.55  all VarCurr (v3280(VarCurr)<->v2549(VarCurr)).
% 121.54/120.55  all VarCurr (v3247(VarCurr,bitIndex2)<->v3268(VarCurr,bitIndex2)).
% 121.54/120.55  all VarCurr (-v3269(VarCurr)& -v3271(VarCurr)& -v3274(VarCurr)-> (all B (range_3_0(B)-> (v3268(VarCurr,B)<->v3243(VarCurr,B))))).
% 121.54/120.55  all VarCurr (v3274(VarCurr)-> (all B (range_3_0(B)-> (v3268(VarCurr,B)<->v3276(VarCurr,B))))).
% 121.54/120.55  all VarCurr (v3271(VarCurr)-> (all B (range_3_0(B)-> (v3268(VarCurr,B)<->v3273(VarCurr,B))))).
% 121.54/120.55  all VarCurr (v3269(VarCurr)-> (all B (range_3_0(B)-> (v3268(VarCurr,B)<->v3243(VarCurr,B))))).
% 121.54/120.55  all VarCurr (v3277(VarCurr)<-> (v3278(VarCurr,bitIndex1)<->$T)& (v3278(VarCurr,bitIndex0)<->$T)).
% 121.54/120.55  all VarCurr (v3278(VarCurr,bitIndex0)<->v3257(VarCurr)).
% 121.54/120.55  all VarCurr (v3278(VarCurr,bitIndex1)<->v3249(VarCurr)).
% 121.54/120.55  all VarCurr (v3276(VarCurr,bitIndex0)<->$T).
% 121.54/120.55  all VarCurr ((v3276(VarCurr,bitIndex3)<->v3243(VarCurr,bitIndex2))& (v3276(VarCurr,bitIndex2)<->v3243(VarCurr,bitIndex1))& (v3276(VarCurr,bitIndex1)<->v3243(VarCurr,bitIndex0))).
% 121.54/120.56  all VarCurr (v3274(VarCurr)<-> (v3275(VarCurr,bitIndex1)<->$T)& (v3275(VarCurr,bitIndex0)<->$F)).
% 121.54/120.56  all VarCurr (v3275(VarCurr,bitIndex0)<->v3257(VarCurr)).
% 121.54/120.56  all VarCurr (v3275(VarCurr,bitIndex1)<->v3249(VarCurr)).
% 121.54/120.56  all VarCurr ((v3273(VarCurr,bitIndex2)<->v3243(VarCurr,bitIndex3))& (v3273(VarCurr,bitIndex1)<->v3243(VarCurr,bitIndex2))& (v3273(VarCurr,bitIndex0)<->v3243(VarCurr,bitIndex1))).
% 121.54/120.56  all VarCurr (v3273(VarCurr,bitIndex3)<->$F).
% 121.54/120.56  all VarCurr (v3271(VarCurr)<-> (v3272(VarCurr,bitIndex1)<->$F)& (v3272(VarCurr,bitIndex0)<->$T)).
% 121.54/120.56  all VarCurr (v3272(VarCurr,bitIndex0)<->v3257(VarCurr)).
% 121.54/120.56  all VarCurr (v3272(VarCurr,bitIndex1)<->v3249(VarCurr)).
% 121.54/120.56  all VarCurr (v3269(VarCurr)<-> (v3270(VarCurr,bitIndex1)<->$F)& (v3270(VarCurr,bitIndex0)<->$F)).
% 121.54/120.56  all VarCurr (v3270(VarCurr,bitIndex0)<->v3257(VarCurr)).
% 121.54/120.56  all VarCurr (v3270(VarCurr,bitIndex1)<->v3249(VarCurr)).
% 121.54/120.56  all VarCurr (v3239(VarCurr,bitIndex0)<->v3241(VarCurr,bitIndex0)).
% 121.54/120.56  all VarCurr (v3241(VarCurr,bitIndex0)<->v3243(VarCurr,bitIndex0)).
% 121.54/120.56  all B (range_3_0(B)-> (v3243(constB0,B)<->$F)).
% 121.54/120.56  all VarCurr (v3265(VarCurr)<->v711(VarCurr)).
% 121.54/120.56  all VarCurr (v3249(VarCurr)<->v3251(VarCurr)).
% 121.54/120.56  all VarCurr (v3251(VarCurr)<->v3253(VarCurr)).
% 121.54/120.56  all VarCurr (v3253(VarCurr)<->v3255(VarCurr)).
% 121.54/120.56  all VarCurr (v3255(VarCurr)<->v2758(VarCurr,bitIndex2)).
% 121.54/120.56  all VarCurr (v3245(VarCurr)<->v711(VarCurr)).
% 121.54/120.56  all VarCurr B (range_2_0(B)-> (v2758(VarCurr,B)<->v2760(VarCurr,B))).
% 121.54/120.56  all VarCurr B (range_2_0(B)-> (v2760(VarCurr,B)<->v2762(VarCurr,B))).
% 121.54/120.56  all VarCurr B (range_2_0(B)-> (v2762(VarCurr,B)<->v2764(VarCurr,B))).
% 121.54/120.56  all VarCurr B (range_2_0(B)-> (v2764(VarCurr,B)<->v2766(VarCurr,B))).
% 121.54/120.56  all VarCurr ((v2766(VarCurr,bitIndex2)<->v2768(VarCurr,bitIndex3))& (v2766(VarCurr,bitIndex1)<->v2768(VarCurr,bitIndex2))& (v2766(VarCurr,bitIndex0)<->v2768(VarCurr,bitIndex1))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3226(VarNext)-> (v2768(VarNext,bitIndex3)<->v2768(VarCurr,bitIndex3)))).
% 121.54/120.56  all VarNext (v3226(VarNext)-> (v2768(VarNext,bitIndex3)<->v3234(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3234(VarNext)<->v3232(VarCurr))).
% 121.54/120.56  all VarCurr (-v3235(VarCurr)-> (v3232(VarCurr)<->v2796(VarCurr,bitIndex2))).
% 121.54/120.56  all VarCurr (v3235(VarCurr)-> (v3232(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (-v3235(VarCurr)<->v2591(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3226(VarNext)<->v3227(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3227(VarNext)<->v3228(VarNext)&v3155(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3228(VarNext)<->v3164(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3206(VarNext)-> (v2768(VarNext,bitIndex2)<->v2768(VarCurr,bitIndex2)))).
% 121.54/120.56  all VarNext (v3206(VarNext)-> (v2768(VarNext,bitIndex2)<->v3221(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3221(VarNext)<->v3219(VarCurr))).
% 121.54/120.56  all VarCurr (-v3215(VarCurr)-> (v3219(VarCurr)<->v2796(VarCurr,bitIndex1))).
% 121.54/120.56  all VarCurr (v3215(VarCurr)-> (v3219(VarCurr)<->$F)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3206(VarNext)<->v3207(VarNext)&v3214(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3214(VarNext)<->v3212(VarCurr))).
% 121.54/120.56  all VarCurr (v3212(VarCurr)<->v3215(VarCurr)|v3216(VarCurr)).
% 121.54/120.56  all VarCurr (v3216(VarCurr)<->v3217(VarCurr)&v3218(VarCurr)).
% 121.54/120.56  all VarCurr (-v3218(VarCurr)<->v3215(VarCurr)).
% 121.54/120.56  all VarCurr (-v3217(VarCurr)<->v2587(VarCurr,bitIndex2)).
% 121.54/120.56  all VarCurr (-v3215(VarCurr)<->v2591(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3207(VarNext)<->v3208(VarNext)&v3155(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3208(VarNext)<->v3164(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3186(VarNext)-> (v2768(VarNext,bitIndex1)<->v2768(VarCurr,bitIndex1)))).
% 121.54/120.56  all VarNext (v3186(VarNext)-> (v2768(VarNext,bitIndex1)<->v3201(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3201(VarNext)<->v3199(VarCurr))).
% 121.54/120.56  all VarCurr (-v3195(VarCurr)-> (v3199(VarCurr)<->v2796(VarCurr,bitIndex0))).
% 121.54/120.56  all VarCurr (v3195(VarCurr)-> (v3199(VarCurr)<->$F)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3186(VarNext)<->v3187(VarNext)&v3194(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3194(VarNext)<->v3192(VarCurr))).
% 121.54/120.56  all VarCurr (v3192(VarCurr)<->v3195(VarCurr)|v3196(VarCurr)).
% 121.54/120.56  all VarCurr (v3196(VarCurr)<->v3197(VarCurr)&v3198(VarCurr)).
% 121.54/120.56  all VarCurr (-v3198(VarCurr)<->v3195(VarCurr)).
% 121.54/120.56  all VarCurr (-v3197(VarCurr)<->v2587(VarCurr,bitIndex1)).
% 121.54/120.56  all VarCurr (-v3195(VarCurr)<->v2591(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3187(VarNext)<->v3188(VarNext)&v3155(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3188(VarNext)<->v3164(VarNext))).
% 121.54/120.56  all VarCurr (v2796(VarCurr,bitIndex2)<->v2768(VarCurr,bitIndex2)&v3183(VarCurr)).
% 121.54/120.56  all VarCurr (-v3183(VarCurr)<->v2587(VarCurr,bitIndex2)).
% 121.54/120.56  all VarCurr (v2796(VarCurr,bitIndex1)<->v2768(VarCurr,bitIndex1)&v3181(VarCurr)).
% 121.54/120.56  all VarCurr (-v3181(VarCurr)<->v2587(VarCurr,bitIndex1)).
% 121.54/120.56  all VarCurr (v2587(VarCurr,bitIndex2)<->v2770(VarCurr,bitIndex2)).
% 121.54/120.56  all VarCurr (v2796(VarCurr,bitIndex0)<->v2768(VarCurr,bitIndex0)).
% 121.54/120.56  all VarCurr (v2768(VarCurr,bitIndex0)<->v2798(VarCurr)).
% 121.54/120.56  v2768(constB0,bitIndex3)<->$F.
% 121.54/120.56  v2768(constB0,bitIndex2)<->$F.
% 121.54/120.56  v2768(constB0,bitIndex1)<->$F.
% 121.54/120.56  all VarCurr (v2798(VarCurr)<->v3175(VarCurr)&v3179(VarCurr)).
% 121.54/120.56  all VarCurr (v3179(VarCurr)<->v2992(VarCurr)&v3153(VarCurr)).
% 121.54/120.56  all VarCurr (v3175(VarCurr)<->v3176(VarCurr)&v3178(VarCurr)).
% 121.54/120.56  all VarCurr (-v3178(VarCurr)<->v2587(VarCurr,bitIndex0)).
% 121.54/120.56  all VarCurr (v3176(VarCurr)<->v2800(VarCurr)&v3177(VarCurr)).
% 121.54/120.56  all VarCurr (-v3177(VarCurr)<->v2810(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3160(VarNext)-> (v3153(VarNext)<->v3153(VarCurr)))).
% 121.54/120.56  all VarNext (v3160(VarNext)-> (v3153(VarNext)<->v3170(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3170(VarNext)<->v3168(VarCurr))).
% 121.54/120.56  all VarCurr (-v3171(VarCurr)-> (v3168(VarCurr)<->v2992(VarCurr))).
% 121.54/120.56  all VarCurr (v3171(VarCurr)-> (v3168(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3171(VarCurr)<-> (v2591(VarCurr)<->$F)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3160(VarNext)<->v3161(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3161(VarNext)<->v3162(VarNext)&v3155(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3162(VarNext)<->v3164(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3164(VarNext)<->v3155(VarCurr))).
% 121.54/120.56  v3153(constB0)<->$F.
% 121.54/120.56  all VarCurr (v3155(VarCurr)<->v3157(VarCurr)).
% 121.54/120.56  all VarCurr (v3157(VarCurr)<->v2551(VarCurr)).
% 121.54/120.56  all VarCurr (v2992(VarCurr)<->v2994(VarCurr)).
% 121.54/120.56  all VarCurr (v2994(VarCurr)<->v2996(VarCurr)).
% 121.54/120.56  all VarCurr (v2996(VarCurr)<->v2998(VarCurr)).
% 121.54/120.56  all VarCurr (v2998(VarCurr)<->v3000(VarCurr)).
% 121.54/120.56  all VarCurr (v3000(VarCurr)<->v3002(VarCurr)).
% 121.54/120.56  all VarCurr (-v3004(VarCurr)-> (v3002(VarCurr)<->$T)).
% 121.54/120.56  all VarCurr (v3004(VarCurr)-> (v3002(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3004(VarCurr)<->v3006(VarCurr)).
% 121.54/120.56  all VarCurr (v3006(VarCurr)<->v3149(VarCurr)|v3125(VarCurr)).
% 121.54/120.56  all VarCurr (v3149(VarCurr)<->v3150(VarCurr)|v3086(VarCurr)).
% 121.54/120.56  all VarCurr (v3150(VarCurr)<->v3008(VarCurr)|v3062(VarCurr)).
% 121.54/120.56  all VarCurr (v3125(VarCurr)<->v3127(VarCurr)).
% 121.54/120.56  all VarCurr (v3127(VarCurr)<->v3129(VarCurr)).
% 121.54/120.56  all VarCurr (v3129(VarCurr)<->v3131(VarCurr)).
% 121.54/120.56  all VarCurr (v3131(VarCurr)<->v3133(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3136(VarNext)-> (v3133(VarNext)<->v3133(VarCurr)))).
% 121.54/120.56  all VarNext (v3136(VarNext)-> (v3133(VarNext)<->v3144(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3144(VarNext)<->v3142(VarCurr))).
% 121.54/120.56  all VarCurr (-v3056(VarCurr)-> (v3142(VarCurr)<->v3145(VarCurr))).
% 121.54/120.56  all VarCurr (v3056(VarCurr)-> (v3142(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3145(VarCurr)<->v3020(VarCurr)&v3096(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3136(VarNext)<->v3137(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3137(VarNext)<->v3139(VarNext)&v3042(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3139(VarNext)<->v3049(VarNext))).
% 121.54/120.56  v3133(constB0)<->$F.
% 121.54/120.56  all VarCurr (v3086(VarCurr)<->v3088(VarCurr)).
% 121.54/120.56  all VarCurr (v3088(VarCurr)<->v3090(VarCurr)).
% 121.54/120.56  all VarCurr (v3090(VarCurr)<->v3092(VarCurr)).
% 121.54/120.56  all VarCurr (v3092(VarCurr)<->v3094(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3111(VarNext)-> (v3094(VarNext)<->v3094(VarCurr)))).
% 121.54/120.56  all VarNext (v3111(VarNext)-> (v3094(VarNext)<->v3119(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3119(VarNext)<->v3117(VarCurr))).
% 121.54/120.56  all VarCurr (-v3056(VarCurr)-> (v3117(VarCurr)<->v3120(VarCurr))).
% 121.54/120.56  all VarCurr (v3056(VarCurr)-> (v3117(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3120(VarCurr)<->v3121(VarCurr)&v3096(VarCurr)).
% 121.54/120.56  all VarCurr (-v3121(VarCurr)<->v3020(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3111(VarNext)<->v3112(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3112(VarNext)<->v3114(VarNext)&v3042(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3114(VarNext)<->v3049(VarNext))).
% 121.54/120.56  v3094(constB0)<->$F.
% 121.54/120.56  all VarCurr (v3096(VarCurr)<->v3098(VarCurr)).
% 121.54/120.56  all VarCurr (v3098(VarCurr)<->v3100(VarCurr)).
% 121.54/120.56  all VarCurr (v3100(VarCurr)<->v3102(VarCurr)).
% 121.54/120.56  all VarCurr (v3102(VarCurr)<->v3104(VarCurr)&v3108(VarCurr)).
% 121.54/120.56  all VarCurr (-v3108(VarCurr)<->v3106(VarCurr)).
% 121.54/120.56  v3106(constB0)<->$F.
% 121.54/120.56  v3104(constB0)<->$F.
% 121.54/120.56  all VarCurr (v3062(VarCurr)<->v3064(VarCurr)).
% 121.54/120.56  all VarCurr (v3064(VarCurr)<->v3066(VarCurr)).
% 121.54/120.56  all VarCurr (v3066(VarCurr)<->v3068(VarCurr)).
% 121.54/120.56  all VarCurr (v3068(VarCurr)<->v3070(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3073(VarNext)-> (v3070(VarNext)<->v3070(VarCurr)))).
% 121.54/120.56  all VarNext (v3073(VarNext)-> (v3070(VarNext)<->v3081(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3081(VarNext)<->v3079(VarCurr))).
% 121.54/120.56  all VarCurr (-v3056(VarCurr)-> (v3079(VarCurr)<->v3082(VarCurr))).
% 121.54/120.56  all VarCurr (v3056(VarCurr)-> (v3079(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3082(VarCurr)<->v3020(VarCurr)&v3028(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3073(VarNext)<->v3074(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3074(VarNext)<->v3076(VarNext)&v3042(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3076(VarNext)<->v3049(VarNext))).
% 121.54/120.56  v3070(constB0)<->$F.
% 121.54/120.56  all VarCurr (v3008(VarCurr)<->v3010(VarCurr)).
% 121.54/120.56  all VarCurr (v3010(VarCurr)<->v3012(VarCurr)).
% 121.54/120.56  all VarCurr (v3012(VarCurr)<->v3014(VarCurr)).
% 121.54/120.56  all VarCurr (v3014(VarCurr)<->v3016(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3045(VarNext)-> (v3016(VarNext)<->v3016(VarCurr)))).
% 121.54/120.56  all VarNext (v3045(VarNext)-> (v3016(VarNext)<->v3055(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3055(VarNext)<->v3053(VarCurr))).
% 121.54/120.56  all VarCurr (-v3056(VarCurr)-> (v3053(VarCurr)<->v3057(VarCurr))).
% 121.54/120.56  all VarCurr (v3056(VarCurr)-> (v3053(VarCurr)<->$F)).
% 121.54/120.56  all VarCurr (v3057(VarCurr)<->v3058(VarCurr)&v3028(VarCurr)).
% 121.54/120.56  all VarCurr (-v3058(VarCurr)<->v3020(VarCurr)).
% 121.54/120.56  all VarCurr (-v3056(VarCurr)<->v3018(VarCurr)).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3045(VarNext)<->v3046(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3046(VarNext)<->v3047(VarNext)&v3042(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3047(VarNext)<->v3049(VarNext))).
% 121.54/120.56  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3049(VarNext)<->v3042(VarCurr))).
% 121.54/120.57  v3016(constB0)<->$F.
% 121.54/120.57  all VarCurr (v3042(VarCurr)<->v2944(VarCurr)).
% 121.54/120.57  all VarCurr (v3028(VarCurr)<->v3030(VarCurr)).
% 121.54/120.57  all VarCurr (v3030(VarCurr)<->v3032(VarCurr)).
% 121.54/120.57  all VarCurr (v3032(VarCurr)<->v3034(VarCurr)).
% 121.54/120.57  all VarCurr (v3034(VarCurr)<->v3036(VarCurr)&v3040(VarCurr)).
% 121.54/120.57  all VarCurr (-v3040(VarCurr)<->v3038(VarCurr)).
% 121.54/120.57  v3038(constB0)<->$F.
% 121.54/120.57  v3036(constB0)<->$F.
% 121.54/120.57  all VarCurr (v3020(VarCurr)<->v3022(VarCurr)).
% 121.54/120.57  all VarCurr (v3022(VarCurr)<->v3024(VarCurr)).
% 121.54/120.57  all VarCurr (v3024(VarCurr)<->v3026(VarCurr)).
% 121.54/120.57  all VarCurr (v3026(VarCurr)<->v2913(VarCurr)).
% 121.54/120.57  all VarCurr (v3018(VarCurr)<->v2856(VarCurr)).
% 121.54/120.57  all VarCurr (v2810(VarCurr)<->v2812(VarCurr)).
% 121.54/120.57  all VarCurr (v2812(VarCurr)<->v2814(VarCurr)).
% 121.54/120.57  all VarCurr (v2814(VarCurr)<->v2816(VarCurr)).
% 121.54/120.57  all VarCurr (v2816(VarCurr)<->v2818(VarCurr)).
% 121.54/120.57  all VarCurr (v2818(VarCurr)<->v2820(VarCurr)).
% 121.54/120.57  all VarCurr (v2820(VarCurr)<->v2822(VarCurr)).
% 121.54/120.57  all VarCurr (v2822(VarCurr)<->v2824(VarCurr)).
% 121.54/120.57  all VarCurr (v2824(VarCurr)<->v2826(VarCurr)).
% 121.54/120.57  all VarCurr (v2826(VarCurr)<->v2828(VarCurr)).
% 121.54/120.57  all VarCurr (v2828(VarCurr)<->v193(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v193(VarCurr,bitIndex12)<->v195(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v195(VarCurr,bitIndex12)<->v2830(VarCurr)).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2947(VarNext)-> (v2830(VarNext)<->v2830(VarCurr)))).
% 121.54/120.57  all VarNext (v2947(VarNext)-> (v2830(VarNext)<->v2982(VarNext))).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2982(VarNext)<->v2980(VarCurr))).
% 121.54/120.57  all VarCurr (-v2832(VarCurr)-> (v2980(VarCurr)<->v2983(VarCurr))).
% 121.54/120.57  all VarCurr (v2832(VarCurr)-> (v2980(VarCurr)<->v2859(VarCurr))).
% 121.54/120.57  all VarCurr (-v2960(VarCurr)-> (v2983(VarCurr)<->v2932(VarCurr))).
% 121.54/120.57  all VarCurr (v2960(VarCurr)-> (v2983(VarCurr)<->v2984(VarCurr))).
% 121.54/120.57  all VarCurr (-v2963(VarCurr)& -v2965(VarCurr)-> (v2984(VarCurr)<->v2988(VarCurr))).
% 121.54/120.57  all VarCurr (v2965(VarCurr)-> (v2984(VarCurr)<->v2987(VarCurr))).
% 121.54/120.57  all VarCurr (v2963(VarCurr)-> (v2984(VarCurr)<->v2985(VarCurr))).
% 121.54/120.57  all VarCurr (-v2973(VarCurr)-> (v2988(VarCurr)<->v2932(VarCurr))).
% 121.54/120.57  all VarCurr (v2973(VarCurr)-> (v2988(VarCurr)<->$T)).
% 121.54/120.57  all VarCurr (-v2967(VarCurr)-> (v2987(VarCurr)<->v2932(VarCurr))).
% 121.54/120.57  all VarCurr (v2967(VarCurr)-> (v2987(VarCurr)<->$F)).
% 121.54/120.57  all VarCurr (-v2986(VarCurr)-> (v2985(VarCurr)<->$F)).
% 121.54/120.57  all VarCurr (v2986(VarCurr)-> (v2985(VarCurr)<->$T)).
% 121.54/120.57  all VarCurr (v2986(VarCurr)<-> (v2867(VarCurr)<->$T)).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2947(VarNext)<->v2948(VarNext)&v2957(VarNext))).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2957(VarNext)<->v2955(VarCurr))).
% 121.54/120.57  all VarCurr (v2955(VarCurr)<->v2832(VarCurr)|v2958(VarCurr)).
% 121.54/120.57  all VarCurr (v2958(VarCurr)<->v2959(VarCurr)&v2979(VarCurr)).
% 121.54/120.57  all VarCurr (-v2979(VarCurr)<->v2832(VarCurr)).
% 121.54/120.57  all VarCurr (v2959(VarCurr)<->v2960(VarCurr)|v2977(VarCurr)).
% 121.54/120.57  all VarCurr (v2977(VarCurr)<->v2895(VarCurr)&v2978(VarCurr)).
% 121.54/120.57  all VarCurr (-v2978(VarCurr)<->v2897(VarCurr)).
% 121.54/120.57  all VarCurr (v2960(VarCurr)<->v2961(VarCurr)&v2897(VarCurr)).
% 121.54/120.57  all VarCurr (v2961(VarCurr)<->v2962(VarCurr)|v2971(VarCurr)).
% 121.54/120.57  all VarCurr (v2971(VarCurr)<->v2972(VarCurr)&v2976(VarCurr)).
% 121.54/120.57  all VarCurr (v2976(VarCurr)<-> (v2964(VarCurr,bitIndex2)<->$F)& (v2964(VarCurr,bitIndex1)<->$F)& (v2964(VarCurr,bitIndex0)<->$T)).
% 121.54/120.57  all VarCurr (v2972(VarCurr)<->v2973(VarCurr)|v2974(VarCurr)).
% 121.54/120.57  all VarCurr (v2974(VarCurr)<->v2895(VarCurr)&v2975(VarCurr)).
% 121.54/120.57  all VarCurr (-v2975(VarCurr)<->v2973(VarCurr)).
% 121.54/120.57  all VarCurr (v2973(VarCurr)<-> (v2867(VarCurr)<->$T)).
% 121.54/120.57  all VarCurr (v2962(VarCurr)<->v2963(VarCurr)|v2965(VarCurr)).
% 121.54/120.57  all VarCurr (v2965(VarCurr)<->v2966(VarCurr)&v2970(VarCurr)).
% 121.54/120.57  all VarCurr (v2970(VarCurr)<-> (v2964(VarCurr,bitIndex2)<->$F)& (v2964(VarCurr,bitIndex1)<->$T)& (v2964(VarCurr,bitIndex0)<->$F)).
% 121.54/120.57  all VarCurr (v2966(VarCurr)<->v2967(VarCurr)|v2968(VarCurr)).
% 121.54/120.57  all VarCurr (v2968(VarCurr)<->v2895(VarCurr)&v2969(VarCurr)).
% 121.54/120.57  all VarCurr (-v2969(VarCurr)<->v2967(VarCurr)).
% 121.54/120.57  all VarCurr (v2967(VarCurr)<-> (v2867(VarCurr)<->$T)).
% 121.54/120.57  all VarCurr (v2963(VarCurr)<-> (v2964(VarCurr,bitIndex2)<->$T)& (v2964(VarCurr,bitIndex1)<->$F)& (v2964(VarCurr,bitIndex0)<->$F)).
% 121.54/120.57  all VarCurr (v2964(VarCurr,bitIndex0)<->v2865(VarCurr)).
% 121.54/120.57  all VarCurr (v2964(VarCurr,bitIndex1)<->v2863(VarCurr)).
% 121.54/120.57  all VarCurr (v2964(VarCurr,bitIndex2)<->v2861(VarCurr)).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2948(VarNext)<->v2949(VarNext)&v2934(VarNext))).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2949(VarNext)<->v2951(VarNext))).
% 121.54/120.57  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2951(VarNext)<->v2934(VarCurr))).
% 121.54/120.57  all VarCurr (v2934(VarCurr)<->v2936(VarCurr)).
% 121.54/120.57  all VarCurr (v2936(VarCurr)<->v2938(VarCurr)).
% 121.54/120.57  all VarCurr (v2938(VarCurr)<->v2940(VarCurr)).
% 121.54/120.57  all VarCurr (v2940(VarCurr)<->v2942(VarCurr)).
% 121.54/120.57  all VarCurr (v2942(VarCurr)<->v2944(VarCurr)).
% 121.54/120.57  all VarCurr (v2944(VarCurr)<->v2551(VarCurr)).
% 121.54/120.57  all VarCurr (v2932(VarCurr)<->$F).
% 121.54/120.57  all VarCurr (v2897(VarCurr)<->v2899(VarCurr)).
% 121.54/120.57  all VarCurr (v2899(VarCurr)<->v2901(VarCurr)).
% 121.54/120.57  all VarCurr (v2901(VarCurr)<->v2903(VarCurr)).
% 121.54/120.57  all VarCurr (v2903(VarCurr)<->v2905(VarCurr)&v2911(VarCurr)).
% 121.54/120.57  all VarCurr (v2911(VarCurr)<->v2913(VarCurr)).
% 121.54/120.57  all VarCurr (v2913(VarCurr)<->v2915(VarCurr)).
% 121.54/120.57  all VarCurr (v2915(VarCurr)<->v2917(VarCurr)).
% 121.54/120.57  all VarCurr (v2917(VarCurr)<->v2919(VarCurr)).
% 121.54/120.57  all VarCurr (v2919(VarCurr)<->v2921(VarCurr)).
% 121.54/120.57  all VarCurr (v2921(VarCurr)<->v2923(VarCurr)).
% 121.54/120.57  all VarCurr (v2923(VarCurr)<->v2925(VarCurr)).
% 121.54/120.57  all VarCurr (v2925(VarCurr)<->v2927(VarCurr)).
% 121.54/120.57  all VarCurr (v2927(VarCurr)<->v2929(VarCurr)).
% 121.54/120.57  v2929(constB0)<->$F.
% 121.54/120.57  all VarCurr (v2905(VarCurr)<->v2907(VarCurr)).
% 121.54/120.57  all VarCurr (v2907(VarCurr)<->v2909(VarCurr)).
% 121.54/120.57  v2909(constB0)<->$F.
% 121.54/120.57  all VarCurr (v2895(VarCurr)<->$F).
% 121.54/120.57  all VarCurr (v2867(VarCurr)<->v2869(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2869(VarCurr,bitIndex12)<->v2871(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2871(VarCurr,bitIndex12)<->v2873(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2873(VarCurr,bitIndex12)<->v2875(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2875(VarCurr,bitIndex12)<->v2877(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2877(VarCurr,bitIndex12)<->v2879(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2879(VarCurr,bitIndex12)<->v2881(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2881(VarCurr,bitIndex12)<->v2883(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2883(VarCurr,bitIndex12)<->v2885(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2885(VarCurr,bitIndex12)<->v2887(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2887(VarCurr,bitIndex12)<->v2889(VarCurr,bitIndex12)).
% 121.54/120.57  all VarCurr (v2889(VarCurr,bitIndex12)<->v2891(VarCurr,bitIndex12)).
% 121.54/120.57  -v2891(constB0,bitIndex12).
% 121.54/120.57  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx(bitIndex12).
% 121.54/120.57  all VarCurr (v2865(VarCurr)<->$F).
% 121.54/120.57  all VarCurr (v2863(VarCurr)<->$F).
% 121.54/120.57  all VarCurr (v2861(VarCurr)<->$T).
% 121.54/120.57  all VarCurr (v2859(VarCurr)<->$F).
% 121.54/120.57  all VarCurr (v2832(VarCurr)<->v2834(VarCurr)).
% 121.54/120.57  all VarCurr (-v2834(VarCurr)<->v2836(VarCurr)).
% 121.54/120.57  all VarCurr (v2836(VarCurr)<->v2838(VarCurr)).
% 121.54/120.57  all VarCurr (v2838(VarCurr)<->v2840(VarCurr)).
% 121.54/120.57  all VarCurr (v2840(VarCurr)<->v2842(VarCurr)).
% 121.54/120.57  all VarCurr (v2842(VarCurr)<->v2844(VarCurr)).
% 121.54/120.57  all VarCurr (v2844(VarCurr)<->v2846(VarCurr)).
% 121.54/120.57  all VarCurr (v2846(VarCurr)<->v2848(VarCurr)).
% 121.54/120.57  all VarCurr (v2848(VarCurr)<->v2850(VarCurr)).
% 121.54/120.57  all VarCurr (v2850(VarCurr)<->v2852(VarCurr)).
% 121.54/120.57  all VarCurr (v2852(VarCurr)<->v2854(VarCurr)).
% 121.54/120.57  all VarCurr (v2854(VarCurr)<->v2856(VarCurr)).
% 121.54/120.57  all VarCurr (v2856(VarCurr)<->v713(VarCurr)).
% 121.54/120.57  all VarCurr (v2800(VarCurr)<->v2802(VarCurr)).
% 121.54/120.57  all VarCurr (v2802(VarCurr)<->v2804(VarCurr)).
% 121.54/120.57  all VarCurr (v2804(VarCurr)<->v2806(VarCurr)).
% 121.54/120.57  all VarCurr (v2806(VarCurr)<->v2808(VarCurr)).
% 121.54/120.57  all VarCurr (v2808(VarCurr)<->v703(VarCurr,bitIndex0)).
% 121.54/120.57  all VarCurr (v2587(VarCurr,bitIndex1)<->v2770(VarCurr,bitIndex1)).
% 121.54/120.57  all VarCurr (-v2771(VarCurr)& -v2779(VarCurr)-> (all B (range_2_0(B)-> (v2770(VarCurr,B)<->v2787(VarCurr,B))))).
% 121.54/120.57  all VarCurr (v2779(VarCurr)-> (all B (range_2_0(B)-> (v2770(VarCurr,B)<->v2780(VarCurr,B))))).
% 121.54/120.57  all VarCurr (v2771(VarCurr)-> (all B (range_2_0(B)-> (v2770(VarCurr,B)<->v2774(VarCurr,B))))).
% 121.54/120.57  all VarCurr (-v2788(VarCurr)-> (all B (range_2_0(B)-> (v2787(VarCurr,B)<->$T)))).
% 121.54/120.57  all VarCurr (v2788(VarCurr)-> (all B (range_2_0(B)-> (v2787(VarCurr,B)<->$F)))).
% 121.54/120.57  all VarCurr (v2789(VarCurr)<->v2791(VarCurr)|v2794(VarCurr)).
% 121.54/120.57  all VarCurr (v2794(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2791(VarCurr)<->v2792(VarCurr)|v2793(VarCurr)).
% 121.54/120.58  all VarCurr (v2793(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2792(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2788(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2786(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$T)& (v2589(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (-v2781(VarCurr)& -v2784(VarCurr)-> (all B (range_2_0(B)-> (v2780(VarCurr,B)<->b011(B))))).
% 121.54/120.58  all VarCurr (v2784(VarCurr)-> (all B (range_2_0(B)-> (v2780(VarCurr,B)<->$T)))).
% 121.54/120.58  all VarCurr (v2781(VarCurr)-> (all B (range_2_0(B)-> (v2780(VarCurr,B)<->$F)))).
% 121.54/120.58  all VarCurr (v2785(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2784(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2781(VarCurr)<->v2782(VarCurr)|v2783(VarCurr)).
% 121.54/120.58  all VarCurr (v2783(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2782(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2779(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$T)& (v2589(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (-v2775(VarCurr)& -v2776(VarCurr)& -v2777(VarCurr)-> (all B (range_2_0(B)-> (v2774(VarCurr,B)<->$T)))).
% 121.54/120.58  all VarCurr (v2777(VarCurr)-> (all B (range_2_0(B)-> (v2774(VarCurr,B)<->b011(B))))).
% 121.54/120.58  -b011(bitIndex2).
% 121.54/120.58  b011(bitIndex1).
% 121.54/120.58  b011(bitIndex0).
% 121.54/120.58  all VarCurr (v2776(VarCurr)-> (all B (range_2_0(B)-> (v2774(VarCurr,B)<->$T)))).
% 121.54/120.58  all VarCurr (v2775(VarCurr)-> (all B (range_2_0(B)-> (v2774(VarCurr,B)<->$F)))).
% 121.54/120.58  all VarCurr (v2778(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2777(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$T)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2776(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2775(VarCurr)<-> (v2595(VarCurr,bitIndex1)<->$F)& (v2595(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2771(VarCurr)<->v2772(VarCurr)|v2773(VarCurr)).
% 121.54/120.58  all VarCurr (v2773(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$F)& (v2589(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2772(VarCurr)<-> (v2589(VarCurr,bitIndex1)<->$F)& (v2589(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all B (range_1_0(B)-> (v2589(constB0,B)<->$F)).
% 121.54/120.58  all VarCurr (v2607(VarCurr)<->v2609(VarCurr)).
% 121.54/120.58  all VarCurr (v2609(VarCurr)<->v2611(VarCurr)).
% 121.54/120.58  all VarCurr (v2611(VarCurr)<->v2613(VarCurr)).
% 121.54/120.58  all VarCurr (v2613(VarCurr)<->v2615(VarCurr)).
% 121.54/120.58  all VarCurr (v2615(VarCurr)<->v2617(VarCurr)).
% 121.54/120.58  all VarCurr (v2617(VarCurr)<->v2619(VarCurr)).
% 121.54/120.58  all VarCurr (v2619(VarCurr)<->v2621(VarCurr)).
% 121.54/120.58  all VarCurr (v2621(VarCurr)<->v2623(VarCurr)).
% 121.54/120.58  all VarCurr (v2623(VarCurr)<->v2625(VarCurr)).
% 121.54/120.58  all VarCurr (v2625(VarCurr)<->v2627(VarCurr)).
% 121.54/120.58  all VarCurr (v2627(VarCurr)<->v2629(VarCurr)).
% 121.54/120.58  all VarCurr (v2629(VarCurr)<->v2631(VarCurr,bitIndex2)).
% 121.54/120.58  all VarNext (v2631(VarNext,bitIndex2)<->v2742(VarNext,bitIndex2)).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2743(VarNext)-> (all B (range_3_0(B)-> (v2742(VarNext,B)<->v2631(VarCurr,B)))))).
% 121.54/120.58  all VarNext (v2743(VarNext)-> (all B (range_3_0(B)-> (v2742(VarNext,B)<->v2753(VarNext,B))))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v2753(VarNext,B)<->v2751(VarCurr,B))))).
% 121.54/120.58  all VarCurr (-v2754(VarCurr)-> (all B (range_3_0(B)-> (v2751(VarCurr,B)<->v2641(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2754(VarCurr)-> (all B (range_3_0(B)-> (v2751(VarCurr,B)<->$F)))).
% 121.54/120.58  all VarCurr (-v2754(VarCurr)<->v2633(VarCurr)).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2743(VarNext)<->v2744(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2744(VarNext)<->v2745(VarNext)&v2734(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2745(VarNext)<->v2747(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2747(VarNext)<->v2734(VarCurr))).
% 121.54/120.58  all VarCurr (v2734(VarCurr)<->v2736(VarCurr)).
% 121.54/120.58  all VarCurr (v2736(VarCurr)<->v2738(VarCurr)).
% 121.54/120.58  all VarCurr (v2738(VarCurr)<->v2740(VarCurr)).
% 121.54/120.58  all VarCurr (v2740(VarCurr)<->v1(VarCurr)).
% 121.54/120.58  all VarCurr (v2641(VarCurr,bitIndex2)<->v2719(VarCurr,bitIndex2)).
% 121.54/120.58  all VarCurr (-v2720(VarCurr)-> (all B (range_3_0(B)-> (v2719(VarCurr,B)<->v2721(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2720(VarCurr)-> (all B (range_3_0(B)-> (v2719(VarCurr,B)<->$F)))).
% 121.54/120.58  all VarCurr (-v2722(VarCurr)& -v2724(VarCurr)& -v2728(VarCurr)-> (all B (range_3_0(B)-> (v2721(VarCurr,B)<->v2631(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2728(VarCurr)-> (all B (range_3_0(B)-> (v2721(VarCurr,B)<->v2730(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2724(VarCurr)-> (all B (range_3_0(B)-> (v2721(VarCurr,B)<->v2726(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2722(VarCurr)-> (all B (range_3_0(B)-> (v2721(VarCurr,B)<->v2631(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2731(VarCurr)<-> (v2732(VarCurr,bitIndex1)<->$T)& (v2732(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2732(VarCurr,bitIndex0)<->v2665(VarCurr)).
% 121.54/120.58  all VarCurr (v2732(VarCurr,bitIndex1)<->v2643(VarCurr)).
% 121.54/120.58  all VarCurr (v2730(VarCurr,bitIndex0)<->$T).
% 121.54/120.58  all VarCurr B (range_3_1(B)-> (v2730(VarCurr,B)<->v2714(VarCurr,B))).
% 121.54/120.58  all VarCurr (v2728(VarCurr)<-> (v2729(VarCurr,bitIndex1)<->$T)& (v2729(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2729(VarCurr,bitIndex0)<->v2665(VarCurr)).
% 121.54/120.58  all VarCurr (v2729(VarCurr,bitIndex1)<->v2643(VarCurr)).
% 121.54/120.58  all VarCurr ((v2726(VarCurr,bitIndex2)<->v2631(VarCurr,bitIndex3))& (v2726(VarCurr,bitIndex1)<->v2631(VarCurr,bitIndex2))& (v2726(VarCurr,bitIndex0)<->v2631(VarCurr,bitIndex1))).
% 121.54/120.58  all VarCurr (v2726(VarCurr,bitIndex3)<->$F).
% 121.54/120.58  all VarCurr (v2724(VarCurr)<-> (v2725(VarCurr,bitIndex1)<->$F)& (v2725(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2725(VarCurr,bitIndex0)<->v2665(VarCurr)).
% 121.54/120.58  all VarCurr (v2725(VarCurr,bitIndex1)<->v2643(VarCurr)).
% 121.54/120.58  all VarCurr (v2722(VarCurr)<-> (v2723(VarCurr,bitIndex1)<->$F)& (v2723(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2723(VarCurr,bitIndex0)<->v2665(VarCurr)).
% 121.54/120.58  all VarCurr (v2723(VarCurr,bitIndex1)<->v2643(VarCurr)).
% 121.54/120.58  all VarCurr (-v2720(VarCurr)<->v2633(VarCurr)).
% 121.54/120.58  all VarCurr (v2714(VarCurr,bitIndex2)<->v2715(VarCurr,bitIndex2)).
% 121.54/120.58  all VarCurr (v2715(VarCurr,bitIndex0)<->$F).
% 121.54/120.58  all VarCurr ((v2715(VarCurr,bitIndex3)<->v2631(VarCurr,bitIndex2))& (v2715(VarCurr,bitIndex2)<->v2631(VarCurr,bitIndex1))& (v2715(VarCurr,bitIndex1)<->v2631(VarCurr,bitIndex0))).
% 121.54/120.58  -v2631(constB0,bitIndex3).
% 121.54/120.58  -v2631(constB0,bitIndex2).
% 121.54/120.58  -v2631(constB0,bitIndex1).
% 121.54/120.58  -b000x(bitIndex3).
% 121.54/120.58  -b000x(bitIndex2).
% 121.54/120.58  -b000x(bitIndex1).
% 121.54/120.58  all VarCurr (v2665(VarCurr)<->v2667(VarCurr)).
% 121.54/120.58  all VarCurr (v2667(VarCurr)<->v2669(VarCurr)).
% 121.54/120.58  all VarCurr (-v2706(VarCurr)& -v2709(VarCurr)-> (v2669(VarCurr)<->$F)).
% 121.54/120.58  all VarCurr (v2709(VarCurr)-> (v2669(VarCurr)<->v2710(VarCurr))).
% 121.54/120.58  all VarCurr (v2706(VarCurr)-> (v2669(VarCurr)<->$F)).
% 121.54/120.58  all VarCurr (-v2711(VarCurr)-> (v2710(VarCurr)<->$T)).
% 121.54/120.58  all VarCurr (v2711(VarCurr)-> (v2710(VarCurr)<->$F)).
% 121.54/120.58  all VarCurr (v2712(VarCurr)<-> (v2673(VarCurr)<->$F)).
% 121.54/120.58  all VarCurr (v2711(VarCurr)<-> (v2673(VarCurr)<->$T)).
% 121.54/120.58  all VarCurr (v2709(VarCurr)<-> (v2671(VarCurr,bitIndex1)<->$T)& (v2671(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2706(VarCurr)<->v2707(VarCurr)|v2708(VarCurr)).
% 121.54/120.58  all VarCurr (v2708(VarCurr)<-> (v2671(VarCurr,bitIndex1)<->$F)& (v2671(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2707(VarCurr)<-> (v2671(VarCurr,bitIndex1)<->$F)& (v2671(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all B (range_1_0(B)-> (v2671(constB0,B)<->$F)).
% 121.54/120.58  all VarCurr (v2673(VarCurr)<->v2675(VarCurr)).
% 121.54/120.58  all VarCurr (v2675(VarCurr)<->v2701(VarCurr)&v2697(VarCurr)).
% 121.54/120.58  all VarCurr (v2701(VarCurr)<->v2702(VarCurr)&v2693(VarCurr)).
% 121.54/120.58  all VarCurr (v2702(VarCurr)<->v2703(VarCurr)&v2689(VarCurr)).
% 121.54/120.58  all VarCurr (v2703(VarCurr)<->v2704(VarCurr)&v2685(VarCurr)).
% 121.54/120.58  all VarCurr (v2704(VarCurr)<->v2677(VarCurr)&v2681(VarCurr)).
% 121.54/120.58  all VarCurr (v2697(VarCurr)<->v2699(VarCurr)).
% 121.54/120.58  v2699(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2693(VarCurr)<->v2695(VarCurr)).
% 121.54/120.58  v2695(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2689(VarCurr)<->v2691(VarCurr)).
% 121.54/120.58  v2691(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2685(VarCurr)<->v2687(VarCurr)).
% 121.54/120.58  v2687(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2681(VarCurr)<->v2683(VarCurr)).
% 121.54/120.58  v2683(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2677(VarCurr)<->v2679(VarCurr)).
% 121.54/120.58  v2679(constB0)<->$T.
% 121.54/120.58  all VarCurr (v2643(VarCurr)<->v2645(VarCurr)).
% 121.54/120.58  all VarCurr (v2645(VarCurr)<->v2647(VarCurr)).
% 121.54/120.58  all VarCurr (v2647(VarCurr)<->v2649(VarCurr)).
% 121.54/120.58  all VarCurr (v2649(VarCurr)<->v2651(VarCurr)).
% 121.54/120.58  all VarCurr (v2651(VarCurr)<->v2653(VarCurr)).
% 121.54/120.58  all VarCurr (v2653(VarCurr)<->v2655(VarCurr)).
% 121.54/120.58  all VarCurr (v2655(VarCurr)<->v2657(VarCurr)).
% 121.54/120.58  all VarCurr (v2657(VarCurr)<->v2659(VarCurr)).
% 121.54/120.58  all VarCurr (v2659(VarCurr)<->v2661(VarCurr)).
% 121.54/120.58  all VarCurr (v2661(VarCurr)<->v2663(VarCurr)).
% 121.54/120.58  all VarCurr (v2633(VarCurr)<->v2635(VarCurr)).
% 121.54/120.58  all VarCurr (v2635(VarCurr)<->v2637(VarCurr)).
% 121.54/120.58  all VarCurr (v2637(VarCurr)<->v2639(VarCurr)).
% 121.54/120.58  all VarCurr (v2639(VarCurr)<->v14(VarCurr)).
% 121.54/120.58  all VarCurr (v2591(VarCurr)<->v2593(VarCurr)).
% 121.54/120.58  all VarCurr (v2593(VarCurr)<->v713(VarCurr)).
% 121.54/120.58  all VarCurr (v703(VarCurr,bitIndex0)<->v705(VarCurr,bitIndex0)).
% 121.54/120.58  all VarCurr (v705(VarCurr,bitIndex0)<->v707(VarCurr,bitIndex0)).
% 121.54/120.58  all VarNext (v707(VarNext,bitIndex0)<->v2569(VarNext,bitIndex0)).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2570(VarNext)-> (all B (range_3_0(B)-> (v2569(VarNext,B)<->v707(VarCurr,B)))))).
% 121.54/120.58  all VarNext (v2570(VarNext)-> (all B (range_3_0(B)-> (v2569(VarNext,B)<->v2564(VarNext,B))))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2570(VarNext)<->v2571(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2571(VarNext)<->v2573(VarNext)&v2547(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2573(VarNext)<->v2558(VarNext))).
% 121.54/120.58  all VarCurr (v715(VarCurr,bitIndex0)<->v2535(VarCurr,bitIndex0)).
% 121.54/120.58  all VarNext (v707(VarNext,bitIndex1)<->v2553(VarNext,bitIndex1)).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2554(VarNext)-> (all B (range_3_0(B)-> (v2553(VarNext,B)<->v707(VarCurr,B)))))).
% 121.54/120.58  all VarNext (v2554(VarNext)-> (all B (range_3_0(B)-> (v2553(VarNext,B)<->v2564(VarNext,B))))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v2564(VarNext,B)<->v2562(VarCurr,B))))).
% 121.54/120.58  all VarCurr (-v2565(VarCurr)-> (all B (range_3_0(B)-> (v2562(VarCurr,B)<->v715(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2565(VarCurr)-> (all B (range_3_0(B)-> (v2562(VarCurr,B)<->$F)))).
% 121.54/120.58  all VarCurr (-v2565(VarCurr)<->v709(VarCurr)).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2554(VarNext)<->v2555(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2555(VarNext)<->v2556(VarNext)&v2547(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2556(VarNext)<->v2558(VarNext))).
% 121.54/120.58  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2558(VarNext)<->v2547(VarCurr))).
% 121.54/120.58  all VarCurr (v2547(VarCurr)<->v2549(VarCurr)).
% 121.54/120.58  all VarCurr (v2549(VarCurr)<->v2551(VarCurr)).
% 121.54/120.58  all VarCurr (v2551(VarCurr)<->v1(VarCurr)).
% 121.54/120.58  all VarCurr (v715(VarCurr,bitIndex1)<->v2535(VarCurr,bitIndex1)).
% 121.54/120.58  all VarCurr (-v2536(VarCurr)& -v2538(VarCurr)& -v2541(VarCurr)-> (all B (range_3_0(B)-> (v2535(VarCurr,B)<->v707(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2541(VarCurr)-> (all B (range_3_0(B)-> (v2535(VarCurr,B)<->v2543(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2538(VarCurr)-> (all B (range_3_0(B)-> (v2535(VarCurr,B)<->v2540(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2536(VarCurr)-> (all B (range_3_0(B)-> (v2535(VarCurr,B)<->v707(VarCurr,B))))).
% 121.54/120.58  all VarCurr (v2544(VarCurr)<-> (v2545(VarCurr,bitIndex1)<->$T)& (v2545(VarCurr,bitIndex0)<->$T)).
% 121.54/120.58  all VarCurr (v2545(VarCurr,bitIndex0)<->v2530(VarCurr)).
% 121.54/120.58  all VarCurr (v2545(VarCurr,bitIndex1)<->v717(VarCurr)).
% 121.54/120.58  all VarCurr (v2543(VarCurr,bitIndex0)<->$T).
% 121.54/120.58  all VarCurr ((v2543(VarCurr,bitIndex3)<->v707(VarCurr,bitIndex2))& (v2543(VarCurr,bitIndex2)<->v707(VarCurr,bitIndex1))& (v2543(VarCurr,bitIndex1)<->v707(VarCurr,bitIndex0))).
% 121.54/120.58  all VarCurr (v2541(VarCurr)<-> (v2542(VarCurr,bitIndex1)<->$T)& (v2542(VarCurr,bitIndex0)<->$F)).
% 121.54/120.58  all VarCurr (v2542(VarCurr,bitIndex0)<->v2530(VarCurr)).
% 121.54/120.58  all VarCurr (v2542(VarCurr,bitIndex1)<->v717(VarCurr)).
% 121.54/120.58  all VarCurr ((v2540(VarCurr,bitIndex2)<->v707(VarCurr,bitIndex3))& (v2540(VarCurr,bitIndex1)<->v707(VarCurr,bitIndex2))& (v2540(VarCurr,bitIndex0)<->v707(VarCurr,bitIndex1))).
% 121.54/120.58  all VarCurr (v2540(VarCurr,bitIndex3)<->$F).
% 121.54/120.58  all VarCurr (v2538(VarCurr)<-> (v2539(VarCurr,bitIndex1)<->$F)& (v2539(VarCurr,bitIndex0)<->$T)).
% 121.54/120.59  all VarCurr (v2539(VarCurr,bitIndex0)<->v2530(VarCurr)).
% 121.54/120.59  all VarCurr (v2539(VarCurr,bitIndex1)<->v717(VarCurr)).
% 121.54/120.59  all B (range_3_0(B)-> (v707(constB0,B)<->$F)).
% 121.54/120.59  all VarCurr (v2536(VarCurr)<-> (v2537(VarCurr,bitIndex1)<->$F)& (v2537(VarCurr,bitIndex0)<->$F)).
% 121.54/120.59  all VarCurr (v2537(VarCurr,bitIndex0)<->v2530(VarCurr)).
% 121.54/120.59  all VarCurr (v2537(VarCurr,bitIndex1)<->v717(VarCurr)).
% 121.54/120.59  all VarCurr (v717(VarCurr)<->v719(VarCurr)).
% 121.54/120.59  all VarCurr (v719(VarCurr)<->v721(VarCurr)).
% 121.54/120.59  all VarCurr (v721(VarCurr)<->v723(VarCurr)).
% 121.54/120.59  all VarCurr (v723(VarCurr)<->v725(VarCurr)).
% 121.54/120.59  all VarCurr (v725(VarCurr)<->v727(VarCurr)).
% 121.54/120.59  all VarCurr (v727(VarCurr)<->v729(VarCurr)).
% 121.54/120.59  all VarCurr (v729(VarCurr)<->v731(VarCurr)).
% 121.54/120.59  all VarCurr (v731(VarCurr)<->v733(VarCurr)).
% 121.54/120.59  all VarCurr (v733(VarCurr)<->v735(VarCurr)).
% 121.54/120.59  all VarCurr (v735(VarCurr)<->v737(VarCurr)).
% 121.54/120.59  all VarCurr (v737(VarCurr)<->v739(VarCurr)).
% 121.54/120.59  all VarCurr (v739(VarCurr)<->v741(VarCurr)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2517(VarNext)-> (v741(VarNext)<->v741(VarCurr)))).
% 121.54/120.59  all VarNext (v2517(VarNext)-> (v741(VarNext)<->v2525(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2525(VarNext)<->v2523(VarCurr))).
% 121.54/120.59  all VarCurr (-v2526(VarCurr)-> (v2523(VarCurr)<->v745(VarCurr))).
% 121.54/120.59  all VarCurr (v2526(VarCurr)-> (v2523(VarCurr)<->$F)).
% 121.54/120.59  all VarCurr (-v2526(VarCurr)<->v743(VarCurr)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2517(VarNext)<->v2518(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2518(VarNext)<->v2519(VarNext)&v801(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2519(VarNext)<->v812(VarNext))).
% 121.54/120.59  all VarCurr (-v2514(VarCurr)-> (v745(VarCurr)<->$F)).
% 121.54/120.59  all VarCurr (v2514(VarCurr)-> (v745(VarCurr)<->$T)).
% 121.54/120.59  all VarCurr (v2514(VarCurr)<->v1210(VarCurr)|v1214(VarCurr)).
% 121.54/120.59  all VarCurr (v1173(VarCurr)<->v2511(VarCurr)&v2512(VarCurr)).
% 121.54/120.59  all VarCurr (-v2512(VarCurr)<->v2446(VarCurr)).
% 121.54/120.59  all VarCurr (v2511(VarCurr)<-> (v749(VarCurr,bitIndex7)<->v1175(VarCurr,bitIndex7))& (v749(VarCurr,bitIndex6)<->v1175(VarCurr,bitIndex6))& (v749(VarCurr,bitIndex5)<->v1175(VarCurr,bitIndex5))& (v749(VarCurr,bitIndex4)<->v1175(VarCurr,bitIndex4))& (v749(VarCurr,bitIndex3)<->v1175(VarCurr,bitIndex3))& (v749(VarCurr,bitIndex2)<->v1175(VarCurr,bitIndex2))& (v749(VarCurr,bitIndex1)<->v1175(VarCurr,bitIndex1))& (v749(VarCurr,bitIndex0)<->v1175(VarCurr,bitIndex0))).
% 121.54/120.59  all VarCurr (v2446(VarCurr)<->v2448(VarCurr)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2488(VarNext)-> (v2448(VarNext)<->v2448(VarCurr)))).
% 121.54/120.59  all VarNext (v2488(VarNext)-> (v2448(VarNext)<->v2506(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2506(VarNext)<->v2504(VarCurr))).
% 121.54/120.59  all VarCurr (-v2503(VarCurr)-> (v2504(VarCurr)<->v2507(VarCurr))).
% 121.54/120.59  all VarCurr (v2503(VarCurr)-> (v2504(VarCurr)<->$T)).
% 121.54/120.59  all VarCurr (-v1185(VarCurr)-> (v2507(VarCurr)<->$T)).
% 121.54/120.59  all VarCurr (v1185(VarCurr)-> (v2507(VarCurr)<->$F)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2488(VarNext)<->v2489(VarNext)&v2496(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2496(VarNext)<->v2494(VarCurr))).
% 121.54/120.59  all VarCurr (v2494(VarCurr)<->v2497(VarCurr)|v2503(VarCurr)).
% 121.54/120.59  all VarCurr (-v2503(VarCurr)<->v1183(VarCurr)).
% 121.54/120.59  all VarCurr (v2497(VarCurr)<->v2498(VarCurr)|v1185(VarCurr)).
% 121.54/120.59  all VarCurr (v2498(VarCurr)<->v2499(VarCurr)&v2502(VarCurr)).
% 121.54/120.59  all VarCurr (v2502(VarCurr)<-> (v2072(VarCurr,bitIndex0)<->$T)).
% 121.54/120.59  all VarCurr (v2499(VarCurr)<->v2500(VarCurr)&v2501(VarCurr)).
% 121.54/120.59  all VarCurr (v2501(VarCurr)<-> (v2450(VarCurr,bitIndex1)<->$F)).
% 121.54/120.59  all VarCurr (v2500(VarCurr)<-> (v2064(VarCurr)<->$T)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2489(VarNext)<->v2490(VarNext)&v2077(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2490(VarNext)<->v2086(VarNext))).
% 121.54/120.59  all VarCurr (v2072(VarCurr,bitIndex0)<->v2096(VarCurr,bitIndex0)).
% 121.54/120.59  all VarNext (v2450(VarNext,bitIndex1)<->v2474(VarNext,bitIndex1)).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2475(VarNext)-> (all B (range_3_0(B)-> (v2474(VarNext,B)<->v2450(VarCurr,B)))))).
% 121.54/120.59  all VarNext (v2475(VarNext)-> (all B (range_3_0(B)-> (v2474(VarNext,B)<->v2483(VarNext,B))))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v2483(VarNext,B)<->v2481(VarCurr,B))))).
% 121.54/120.59  all VarCurr (-v2093(VarCurr)-> (all B (range_3_0(B)-> (v2481(VarCurr,B)<->v2452(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2093(VarCurr)-> (all B (range_3_0(B)-> (v2481(VarCurr,B)<->$F)))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2475(VarNext)<->v2476(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2476(VarNext)<->v2478(VarNext)&v2077(VarNext))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2478(VarNext)<->v2086(VarNext))).
% 121.54/120.59  all VarCurr (v2452(VarCurr,bitIndex1)<->v2459(VarCurr,bitIndex1)).
% 121.54/120.59  all VarCurr (-v2460(VarCurr)-> (all B (range_3_0(B)-> (v2459(VarCurr,B)<->v2461(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2460(VarCurr)-> (all B (range_3_0(B)-> (v2459(VarCurr,B)<->$F)))).
% 121.54/120.59  all VarCurr (-v2462(VarCurr)& -v2464(VarCurr)& -v2468(VarCurr)-> (all B (range_3_0(B)-> (v2461(VarCurr,B)<->v2450(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2468(VarCurr)-> (all B (range_3_0(B)-> (v2461(VarCurr,B)<->v2470(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2464(VarCurr)-> (all B (range_3_0(B)-> (v2461(VarCurr,B)<->v2466(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2462(VarCurr)-> (all B (range_3_0(B)-> (v2461(VarCurr,B)<->v2450(VarCurr,B))))).
% 121.54/120.59  all VarCurr (v2471(VarCurr)<-> (v2472(VarCurr,bitIndex1)<->$T)& (v2472(VarCurr,bitIndex0)<->$T)).
% 121.54/120.59  all VarCurr (v2472(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.54/120.59  all VarCurr (v2472(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.54/120.59  all VarCurr (v2470(VarCurr,bitIndex0)<->$T).
% 121.54/120.59  all VarCurr B (range_3_1(B)-> (v2470(VarCurr,B)<->v2454(VarCurr,B))).
% 121.54/120.59  all B (range_3_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B).
% 121.54/120.59  all VarCurr (v2468(VarCurr)<-> (v2469(VarCurr,bitIndex1)<->$T)& (v2469(VarCurr,bitIndex0)<->$F)).
% 121.54/120.59  all VarCurr (v2469(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.54/120.59  all VarCurr (v2469(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.54/120.59  all VarCurr ((v2466(VarCurr,bitIndex2)<->v2450(VarCurr,bitIndex3))& (v2466(VarCurr,bitIndex1)<->v2450(VarCurr,bitIndex2))& (v2466(VarCurr,bitIndex0)<->v2450(VarCurr,bitIndex1))).
% 121.54/120.59  all VarCurr (v2466(VarCurr,bitIndex3)<->$F).
% 121.54/120.59  all VarCurr (v2464(VarCurr)<-> (v2465(VarCurr,bitIndex1)<->$F)& (v2465(VarCurr,bitIndex0)<->$T)).
% 121.54/120.59  all VarCurr (v2465(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.54/120.59  all VarCurr (v2465(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.54/120.59  all VarCurr (v2462(VarCurr)<-> (v2463(VarCurr,bitIndex1)<->$F)& (v2463(VarCurr,bitIndex0)<->$F)).
% 121.54/120.59  all VarCurr (v2463(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.54/120.59  all VarCurr (v2463(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.54/120.59  all VarCurr (-v2460(VarCurr)<->v1183(VarCurr)).
% 121.54/120.59  all VarCurr (v2454(VarCurr,bitIndex1)<->v2455(VarCurr,bitIndex1)).
% 121.54/120.59  all VarCurr (v2455(VarCurr,bitIndex0)<->$F).
% 121.54/120.59  all VarCurr ((v2455(VarCurr,bitIndex3)<->v2450(VarCurr,bitIndex2))& (v2455(VarCurr,bitIndex2)<->v2450(VarCurr,bitIndex1))& (v2455(VarCurr,bitIndex1)<->v2450(VarCurr,bitIndex0))).
% 121.54/120.59  -v2450(constB0,bitIndex2).
% 121.54/120.59  -v2450(constB0,bitIndex1).
% 121.54/120.59  -v2450(constB0,bitIndex0).
% 121.54/120.59  -bx000(bitIndex2).
% 121.54/120.59  -bx000(bitIndex1).
% 121.54/120.59  -bx000(bitIndex0).
% 121.54/120.59  all VarCurr B (range_7_0(B)-> (v1175(VarCurr,B)<->v1177(VarCurr,B))).
% 121.54/120.59  all VarCurr B (range_7_0(B)-> (v1177(VarCurr,B)<->v1179(VarCurr,B))).
% 121.54/120.59  all VarCurr ((v1179(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex400))& (v1179(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex399))& (v1179(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex398))& (v1179(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex397))& (v1179(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex396))& (v1179(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex395))& (v1179(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex394))& (v1179(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex393))).
% 121.54/120.59  all VarNext ((v1181(VarNext,bitIndex400)<->v2414(VarNext,bitIndex7))& (v1181(VarNext,bitIndex399)<->v2414(VarNext,bitIndex6))& (v1181(VarNext,bitIndex398)<->v2414(VarNext,bitIndex5))& (v1181(VarNext,bitIndex397)<->v2414(VarNext,bitIndex4))& (v1181(VarNext,bitIndex396)<->v2414(VarNext,bitIndex3))& (v1181(VarNext,bitIndex395)<->v2414(VarNext,bitIndex2))& (v1181(VarNext,bitIndex394)<->v2414(VarNext,bitIndex1))& (v1181(VarNext,bitIndex393)<->v2414(VarNext,bitIndex0))).
% 121.54/120.59  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2415(VarNext)-> (v2414(VarNext,bitIndex130)<->v1181(VarCurr,bitIndex523))& (v2414(VarNext,bitIndex129)<->v1181(VarCurr,bitIndex522))& (v2414(VarNext,bitIndex128)<->v1181(VarCurr,bitIndex521))& (v2414(VarNext,bitIndex127)<->v1181(VarCurr,bitIndex520))& (v2414(VarNext,bitIndex126)<->v1181(VarCurr,bitIndex519))& (v2414(VarNext,bitIndex125)<->v1181(VarCurr,bitIndex518))& (v2414(VarNext,bitIndex124)<->v1181(VarCurr,bitIndex517))& (v2414(VarNext,bitIndex123)<->v1181(VarCurr,bitIndex516))& (v2414(VarNext,bitIndex122)<->v1181(VarCurr,bitIndex515))& (v2414(VarNext,bitIndex121)<->v1181(VarCurr,bitIndex514))& (v2414(VarNext,bitIndex120)<->v1181(VarCurr,bitIndex513))& (v2414(VarNext,bitIndex119)<->v1181(VarCurr,bitIndex512))& (v2414(VarNext,bitIndex118)<->v1181(VarCurr,bitIndex511))& (v2414(VarNext,bitIndex117)<->v1181(VarCurr,bitIndex510))& (v2414(VarNext,bitIndex116)<->v1181(VarCurr,bitIndex509))& (v2414(VarNext,bitIndex115)<->v1181(VarCurr,bitIndex508))& (v2414(VarNext,bitIndex114)<->v1181(VarCurr,bitIndex507))& (v2414(VarNext,bitIndex113)<->v1181(VarCurr,bitIndex506))& (v2414(VarNext,bitIndex112)<->v1181(VarCurr,bitIndex505))& (v2414(VarNext,bitIndex111)<->v1181(VarCurr,bitIndex504))& (v2414(VarNext,bitIndex110)<->v1181(VarCurr,bitIndex503))& (v2414(VarNext,bitIndex109)<->v1181(VarCurr,bitIndex502))& (v2414(VarNext,bitIndex108)<->v1181(VarCurr,bitIndex501))& (v2414(VarNext,bitIndex107)<->v1181(VarCurr,bitIndex500))& (v2414(VarNext,bitIndex106)<->v1181(VarCurr,bitIndex499))& (v2414(VarNext,bitIndex105)<->v1181(VarCurr,bitIndex498))& (v2414(VarNext,bitIndex104)<->v1181(VarCurr,bitIndex497))& (v2414(VarNext,bitIndex103)<->v1181(VarCurr,bitIndex496))& (v2414(VarNext,bitIndex102)<->v1181(VarCurr,bitIndex495))& (v2414(VarNext,bitIndex101)<->v1181(VarCurr,bitIndex494))& (v2414(VarNext,bitIndex100)<->v1181(VarCurr,bitIndex493))& (v2414(VarNext,bitIndex99)<->v1181(VarCurr,bitIndex492))& 
% 121.54/120.59  (v2414(VarNext,bitIndex98)<->v1181(VarCurr,bitIndex491))& (v2414(VarNext,bitIndex97)<->v1181(VarCurr,bitIndex490))& (v2414(VarNext,bitIndex96)<->v1181(VarCurr,bitIndex489))& (v2414(VarNext,bitIndex95)<->v1181(VarCurr,bitIndex488))& (v2414(VarNext,bitIndex94)<->v1181(VarCurr,bitIndex487))& (v2414(VarNext,bitIndex93)<->v1181(VarCurr,bitIndex486))& (v2414(VarNext,bitIndex92)<->v1181(VarCurr,bitIndex485))& (v2414(VarNext,bitIndex91)<->v1181(VarCurr,bitIndex484))& (v2414(VarNext,bitIndex90)<->v1181(VarCurr,bitIndex483))& (v2414(VarNext,bitIndex89)<->v1181(VarCurr,bitIndex482))& (v2414(VarNext,bitIndex88)<->v1181(VarCurr,bitIndex481))& (v2414(VarNext,bitIndex87)<->v1181(VarCurr,bitIndex480))& (v2414(VarNext,bitIndex86)<->v1181(VarCurr,bitIndex479))& (v2414(VarNext,bitIndex85)<->v1181(VarCurr,bitIndex478))& (v2414(VarNext,bitIndex84)<->v1181(VarCurr,bitIndex477))& (v2414(VarNext,bitIndex83)<->v1181(VarCurr,bitIndex476))& (v2414(VarNext,bitIndex82)<->v1181(VarCurr,bitIndex475))& (v2414(VarNext,bitIndex81)<->v1181(VarCurr,bitIndex474))& (v2414(VarNext,bitIndex80)<->v1181(VarCurr,bitIndex473))& (v2414(VarNext,bitIndex79)<->v1181(VarCurr,bitIndex472))& (v2414(VarNext,bitIndex78)<->v1181(VarCurr,bitIndex471))& (v2414(VarNext,bitIndex77)<->v1181(VarCurr,bitIndex470))& (v2414(VarNext,bitIndex76)<->v1181(VarCurr,bitIndex469))& (v2414(VarNext,bitIndex75)<->v1181(VarCurr,bitIndex468))& (v2414(VarNext,bitIndex74)<->v1181(VarCurr,bitIndex467))& (v2414(VarNext,bitIndex73)<->v1181(VarCurr,bitIndex466))& (v2414(VarNext,bitIndex72)<->v1181(VarCurr,bitIndex465))& (v2414(VarNext,bitIndex71)<->v1181(VarCurr,bitIndex464))& (v2414(VarNext,bitIndex70)<->v1181(VarCurr,bitIndex463))& (v2414(VarNext,bitIndex69)<->v1181(VarCurr,bitIndex462))& (v2414(VarNext,bitIndex68)<->v1181(VarCurr,bitIndex461))& (v2414(VarNext,bitIndex67)<->v1181(VarCurr,bitIndex460))& (v2414(VarNext,bitIndex66)<->v1181(VarCurr,bitIndex459))& (v2414(VarNext,bitIndex65)<->v1181(VarCurr,bitIndex458))& 
% 121.54/120.59  (v2414(VarNext,bitIndex64)<->v1181(VarCurr,bitIndex457))& (v2414(VarNext,bitIndex63)<->v1181(VarCurr,bitIndex456))& (v2414(VarNext,bitIndex62)<->v1181(VarCurr,bitIndex455))& (v2414(VarNext,bitIndex61)<->v1181(VarCurr,bitIndex454))& (v2414(VarNext,bitIndex60)<->v1181(VarCurr,bitIndex453))& (v2414(VarNext,bitIndex59)<->v1181(VarCurr,bitIndex452))& (v2414(VarNext,bitIndex58)<->v1181(VarCurr,bitIndex451))& (v2414(VarNext,bitIndex57)<->v1181(VarCurr,bitIndex450))& (v2414(VarNext,bitIndex56)<->v1181(VarCurr,bitIndex449))& (v2414(VarNext,bitIndex55)<->v1181(VarCurr,bitIndex448))& (v2414(VarNext,bitIndex54)<->v1181(VarCurr,bitIndex447))& (v2414(VarNext,bitIndex53)<->v1181(VarCurr,bitIndex446))& (v2414(VarNext,bitIndex52)<->v1181(VarCurr,bitIndex445))& (v2414(VarNext,bitIndex51)<->v1181(VarCurr,bitIndex444))& (v2414(VarNext,bitIndex50)<->v1181(VarCurr,bitIndex443))& (v2414(VarNext,bitIndex49)<->v1181(VarCurr,bitIndex442))& (v2414(VarNext,bitIndex48)<->v1181(VarCurr,bitIndex441))& (v2414(VarNext,bitIndex47)<->v1181(VarCurr,bitIndex440))& (v2414(VarNext,bitIndex46)<->v1181(VarCurr,bitIndex439))& (v2414(VarNext,bitIndex45)<->v1181(VarCurr,bitIndex438))& (v2414(VarNext,bitIndex44)<->v1181(VarCurr,bitIndex437))& (v2414(VarNext,bitIndex43)<->v1181(VarCurr,bitIndex436))& (v2414(VarNext,bitIndex42)<->v1181(VarCurr,bitIndex435))& (v2414(VarNext,bitIndex41)<->v1181(VarCurr,bitIndex434))& (v2414(VarNext,bitIndex40)<->v1181(VarCurr,bitIndex433))& (v2414(VarNext,bitIndex39)<->v1181(VarCurr,bitIndex432))& (v2414(VarNext,bitIndex38)<->v1181(VarCurr,bitIndex431))& (v2414(VarNext,bitIndex37)<->v1181(VarCurr,bitIndex430))& (v2414(VarNext,bitIndex36)<->v1181(VarCurr,bitIndex429))& (v2414(VarNext,bitIndex35)<->v1181(VarCurr,bitIndex428))& (v2414(VarNext,bitIndex34)<->v1181(VarCurr,bitIndex427))& (v2414(VarNext,bitIndex33)<->v1181(VarCurr,bitIndex426))& (v2414(VarNext,bitIndex32)<->v1181(VarCurr,bitIndex425))& (v2414(VarNext,bitIndex31)<->v1181(VarCurr,bitIndex424))& 
% 121.54/120.59  (v2414(VarNext,bitIndex30)<->v1181(VarCurr,bitIndex423))& (v2414(VarNext,bitIndex29)<->v1181(VarCurr,bitIndex422))& (v2414(VarNext,bitIndex28)<->v1181(VarCurr,bitIndex421))& (v2414(VarNext,bitIndex27)<->v1181(VarCurr,bitIndex420))& (v2414(VarNext,bitIndex26)<->v1181(VarCurr,bitIndex419))& (v2414(VarNext,bitIndex25)<->v1181(VarCurr,bitIndex418))& (v2414(VarNext,bitIndex24)<->v1181(VarCurr,bitIndex417))& (v2414(VarNext,bitIndex23)<->v1181(VarCurr,bitIndex416))& (v2414(VarNext,bitIndex22)<->v1181(VarCurr,bitIndex415))& (v2414(VarNext,bitIndex21)<->v1181(VarCurr,bitIndex414))& (v2414(VarNext,bitIndex20)<->v1181(VarCurr,bitIndex413))& (v2414(VarNext,bitIndex19)<->v1181(VarCurr,bitIndex412))& (v2414(VarNext,bitIndex18)<->v1181(VarCurr,bitIndex411))& (v2414(VarNext,bitIndex17)<->v1181(VarCurr,bitIndex410))& (v2414(VarNext,bitIndex16)<->v1181(VarCurr,bitIndex409))& (v2414(VarNext,bitIndex15)<->v1181(VarCurr,bitIndex408))& (v2414(VarNext,bitIndex14)<->v1181(VarCurr,bitIndex407))& (v2414(VarNext,bitIndex13)<->v1181(VarCurr,bitIndex406))& (v2414(VarNext,bitIndex12)<->v1181(VarCurr,bitIndex405))& (v2414(VarNext,bitIndex11)<->v1181(VarCurr,bitIndex404))& (v2414(VarNext,bitIndex10)<->v1181(VarCurr,bitIndex403))& (v2414(VarNext,bitIndex9)<->v1181(VarCurr,bitIndex402))& (v2414(VarNext,bitIndex8)<->v1181(VarCurr,bitIndex401))& (v2414(VarNext,bitIndex7)<->v1181(VarCurr,bitIndex400))& (v2414(VarNext,bitIndex6)<->v1181(VarCurr,bitIndex399))& (v2414(VarNext,bitIndex5)<->v1181(VarCurr,bitIndex398))& (v2414(VarNext,bitIndex4)<->v1181(VarCurr,bitIndex397))& (v2414(VarNext,bitIndex3)<->v1181(VarCurr,bitIndex396))& (v2414(VarNext,bitIndex2)<->v1181(VarCurr,bitIndex395))& (v2414(VarNext,bitIndex1)<->v1181(VarCurr,bitIndex394))& (v2414(VarNext,bitIndex0)<->v1181(VarCurr,bitIndex393)))).
% 121.63/120.60  all VarNext (v2415(VarNext)-> (all B (range_130_0(B)-> (v2414(VarNext,B)<->v2441(VarNext,B))))).
% 121.63/120.60  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v2441(VarNext,B)<->v2439(VarCurr,B))))).
% 121.63/120.60  all VarCurr (-v2378(VarCurr)-> (all B (range_130_0(B)-> (v2439(VarCurr,B)<->v2442(VarCurr,B))))).
% 121.63/120.60  all VarCurr (v2378(VarCurr)-> (all B (range_130_0(B)-> (v2439(VarCurr,B)<->$F)))).
% 121.63/120.60  all VarCurr (-v2428(VarCurr)& -v2430(VarCurr)-> (all B (range_130_0(B)-> (v2442(VarCurr,B)<->v2407(VarCurr,B))))).
% 121.63/120.60  all VarCurr (v2430(VarCurr)-> (all B (range_130_0(B)-> (v2442(VarCurr,B)<->v2400(VarCurr,B))))).
% 121.63/120.60  all VarCurr (v2428(VarCurr)-> (v2442(VarCurr,bitIndex130)<->v1181(VarCurr,bitIndex392))& (v2442(VarCurr,bitIndex129)<->v1181(VarCurr,bitIndex391))& (v2442(VarCurr,bitIndex128)<->v1181(VarCurr,bitIndex390))& (v2442(VarCurr,bitIndex127)<->v1181(VarCurr,bitIndex389))& (v2442(VarCurr,bitIndex126)<->v1181(VarCurr,bitIndex388))& (v2442(VarCurr,bitIndex125)<->v1181(VarCurr,bitIndex387))& (v2442(VarCurr,bitIndex124)<->v1181(VarCurr,bitIndex386))& (v2442(VarCurr,bitIndex123)<->v1181(VarCurr,bitIndex385))& (v2442(VarCurr,bitIndex122)<->v1181(VarCurr,bitIndex384))& (v2442(VarCurr,bitIndex121)<->v1181(VarCurr,bitIndex383))& (v2442(VarCurr,bitIndex120)<->v1181(VarCurr,bitIndex382))& (v2442(VarCurr,bitIndex119)<->v1181(VarCurr,bitIndex381))& (v2442(VarCurr,bitIndex118)<->v1181(VarCurr,bitIndex380))& (v2442(VarCurr,bitIndex117)<->v1181(VarCurr,bitIndex379))& (v2442(VarCurr,bitIndex116)<->v1181(VarCurr,bitIndex378))& (v2442(VarCurr,bitIndex115)<->v1181(VarCurr,bitIndex377))& (v2442(VarCurr,bitIndex114)<->v1181(VarCurr,bitIndex376))& (v2442(VarCurr,bitIndex113)<->v1181(VarCurr,bitIndex375))& (v2442(VarCurr,bitIndex112)<->v1181(VarCurr,bitIndex374))& (v2442(VarCurr,bitIndex111)<->v1181(VarCurr,bitIndex373))& (v2442(VarCurr,bitIndex110)<->v1181(VarCurr,bitIndex372))& (v2442(VarCurr,bitIndex109)<->v1181(VarCurr,bitIndex371))& (v2442(VarCurr,bitIndex108)<->v1181(VarCurr,bitIndex370))& (v2442(VarCurr,bitIndex107)<->v1181(VarCurr,bitIndex369))& (v2442(VarCurr,bitIndex106)<->v1181(VarCurr,bitIndex368))& (v2442(VarCurr,bitIndex105)<->v1181(VarCurr,bitIndex367))& (v2442(VarCurr,bitIndex104)<->v1181(VarCurr,bitIndex366))& (v2442(VarCurr,bitIndex103)<->v1181(VarCurr,bitIndex365))& (v2442(VarCurr,bitIndex102)<->v1181(VarCurr,bitIndex364))& (v2442(VarCurr,bitIndex101)<->v1181(VarCurr,bitIndex363))& (v2442(VarCurr,bitIndex100)<->v1181(VarCurr,bitIndex362))& (v2442(VarCurr,bitIndex99)<->v1181(VarCurr,bitIndex361))& (v2442(VarCurr,bitIndex98)<->v1181(VarCurr,bitIndex360))& 
(v2442(VarCurr,bitIndex97)<->v1181(VarCurr,bitIndex359))& (v2442(VarCurr,bitIndex96)<->v1181(VarCurr,bitIndex358))& (v2442(VarCurr,bitIndex95)<->v1181(VarCurr,bitIndex357))& (v2442(VarCurr,bitIndex94)<->v1181(VarCurr,bitIndex356))& (v2442(VarCurr,bitIndex93)<->v1181(VarCurr,bitIndex355))& (v2442(VarCurr,bitIndex92)<->v1181(VarCurr,bitIndex354))& (v2442(VarCurr,bitIndex91)<->v1181(VarCurr,bitIndex353))& (v2442(VarCurr,bitIndex90)<->v1181(VarCurr,bitIndex352))& (v2442(VarCurr,bitIndex89)<->v1181(VarCurr,bitIndex351))& (v2442(VarCurr,bitIndex88)<->v1181(VarCurr,bitIndex350))& (v2442(VarCurr,bitIndex87)<->v1181(VarCurr,bitIndex349))& (v2442(VarCurr,bitIndex86)<->v1181(VarCurr,bitIndex348))& (v2442(VarCurr,bitIndex85)<->v1181(VarCurr,bitIndex347))& (v2442(VarCurr,bitIndex84)<->v1181(VarCurr,bitIndex346))& (v2442(VarCurr,bitIndex83)<->v1181(VarCurr,bitIndex345))& (v2442(VarCurr,bitIndex82)<->v1181(VarCurr,bitIndex344))& (v2442(VarCurr,bitIndex81)<->v1181(VarCurr,bitIndex343))& (v2442(VarCurr,bitIndex80)<->v1181(VarCurr,bitIndex342))& (v2442(VarCurr,bitIndex79)<->v1181(VarCurr,bitIndex341))& (v2442(VarCurr,bitIndex78)<->v1181(VarCurr,bitIndex340))& (v2442(VarCurr,bitIndex77)<->v1181(VarCurr,bitIndex339))& (v2442(VarCurr,bitIndex76)<->v1181(VarCurr,bitIndex338))& (v2442(VarCurr,bitIndex75)<->v1181(VarCurr,bitIndex337))& (v2442(VarCurr,bitIndex74)<->v1181(VarCurr,bitIndex336))& (v2442(VarCurr,bitIndex73)<->v1181(VarCurr,bitIndex335))& (v2442(VarCurr,bitIndex72)<->v1181(VarCurr,bitIndex334))& (v2442(VarCurr,bitIndex71)<->v1181(VarCurr,bitIndex333))& (v2442(VarCurr,bitIndex70)<->v1181(VarCurr,bitIndex332))& (v2442(VarCurr,bitIndex69)<->v1181(VarCurr,bitIndex331))& (v2442(VarCurr,bitIndex68)<->v1181(VarCurr,bitIndex330))& (v2442(VarCurr,bitIndex67)<->v1181(VarCurr,bitIndex329))& (v2442(VarCurr,bitIndex66)<->v1181(VarCurr,bitIndex328))& (v2442(VarCurr,bitIndex65)<->v1181(VarCurr,bitIndex327))& (v2442(VarCurr,bitIndex64)<->v1181(VarCurr,bitIndex326))& 
(v2442(VarCurr,bitIndex63)<->v1181(VarCurr,bitIndex325))& (v2442(VarCurr,bitIndex62)<->v1181(VarCurr,bitIndex324))& (v2442(VarCurr,bitIndex61)<->v1181(VarCurr,bitIndex323))& (v2442(VarCurr,bitIndex60)<->v1181(VarCurr,bitIndex322))& (v2442(VarCurr,bitIndex59)<->v1181(VarCurr,bitIndex321))& (v2442(VarCurr,bitIndex58)<->v1181(VarCurr,bitIndex320))& (v2442(VarCurr,bitIndex57)<->v1181(VarCurr,bitIndex319))& (v2442(VarCurr,bitIndex56)<->v1181(VarCurr,bitIndex318))& (v2442(VarCurr,bitIndex55)<->v1181(VarCurr,bitIndex317))& (v2442(VarCurr,bitIndex54)<->v1181(VarCurr,bitIndex316))& (v2442(VarCurr,bitIndex53)<->v1181(VarCurr,bitIndex315))& (v2442(VarCurr,bitIndex52)<->v1181(VarCurr,bitIndex314))& (v2442(VarCurr,bitIndex51)<->v1181(VarCurr,bitIndex313))& (v2442(VarCurr,bitIndex50)<->v1181(VarCurr,bitIndex312))& (v2442(VarCurr,bitIndex49)<->v1181(VarCurr,bitIndex311))& (v2442(VarCurr,bitIndex48)<->v1181(VarCurr,bitIndex310))& (v2442(VarCurr,bitIndex47)<->v1181(VarCurr,bitIndex309))& (v2442(VarCurr,bitIndex46)<->v1181(VarCurr,bitIndex308))& (v2442(VarCurr,bitIndex45)<->v1181(VarCurr,bitIndex307))& (v2442(VarCurr,bitIndex44)<->v1181(VarCurr,bitIndex306))& (v2442(VarCurr,bitIndex43)<->v1181(VarCurr,bitIndex305))& (v2442(VarCurr,bitIndex42)<->v1181(VarCurr,bitIndex304))& (v2442(VarCurr,bitIndex41)<->v1181(VarCurr,bitIndex303))& (v2442(VarCurr,bitIndex40)<->v1181(VarCurr,bitIndex302))& (v2442(VarCurr,bitIndex39)<->v1181(VarCurr,bitIndex301))& (v2442(VarCurr,bitIndex38)<->v1181(VarCurr,bitIndex300))& (v2442(VarCurr,bitIndex37)<->v1181(VarCurr,bitIndex299))& (v2442(VarCurr,bitIndex36)<->v1181(VarCurr,bitIndex298))& (v2442(VarCurr,bitIndex35)<->v1181(VarCurr,bitIndex297))& (v2442(VarCurr,bitIndex34)<->v1181(VarCurr,bitIndex296))& (v2442(VarCurr,bitIndex33)<->v1181(VarCurr,bitIndex295))& (v2442(VarCurr,bitIndex32)<->v1181(VarCurr,bitIndex294))& (v2442(VarCurr,bitIndex31)<->v1181(VarCurr,bitIndex293))& (v2442(VarCurr,bitIndex30)<->v1181(VarCurr,bitIndex292))& 
(v2442(VarCurr,bitIndex29)<->v1181(VarCurr,bitIndex291))& (v2442(VarCurr,bitIndex28)<->v1181(VarCurr,bitIndex290))& (v2442(VarCurr,bitIndex27)<->v1181(VarCurr,bitIndex289))& (v2442(VarCurr,bitIndex26)<->v1181(VarCurr,bitIndex288))& (v2442(VarCurr,bitIndex25)<->v1181(VarCurr,bitIndex287))& (v2442(VarCurr,bitIndex24)<->v1181(VarCurr,bitIndex286))& (v2442(VarCurr,bitIndex23)<->v1181(VarCurr,bitIndex285))& (v2442(VarCurr,bitIndex22)<->v1181(VarCurr,bitIndex284))& (v2442(VarCurr,bitIndex21)<->v1181(VarCurr,bitIndex283))& (v2442(VarCurr,bitIndex20)<->v1181(VarCurr,bitIndex282))& (v2442(VarCurr,bitIndex19)<->v1181(VarCurr,bitIndex281))& (v2442(VarCurr,bitIndex18)<->v1181(VarCurr,bitIndex280))& (v2442(VarCurr,bitIndex17)<->v1181(VarCurr,bitIndex279))& (v2442(VarCurr,bitIndex16)<->v1181(VarCurr,bitIndex278))& (v2442(VarCurr,bitIndex15)<->v1181(VarCurr,bitIndex277))& (v2442(VarCurr,bitIndex14)<->v1181(VarCurr,bitIndex276))& (v2442(VarCurr,bitIndex13)<->v1181(VarCurr,bitIndex275))& (v2442(VarCurr,bitIndex12)<->v1181(VarCurr,bitIndex274))& (v2442(VarCurr,bitIndex11)<->v1181(VarCurr,bitIndex273))& (v2442(VarCurr,bitIndex10)<->v1181(VarCurr,bitIndex272))& (v2442(VarCurr,bitIndex9)<->v1181(VarCurr,bitIndex271))& (v2442(VarCurr,bitIndex8)<->v1181(VarCurr,bitIndex270))& (v2442(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex269))& (v2442(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex268))& (v2442(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex267))& (v2442(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex266))& (v2442(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex265))& (v2442(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex264))& (v2442(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex263))& (v2442(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex262))).
% 121.63/120.61  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2415(VarNext)<->v2416(VarNext)&v2423(VarNext))).
% 121.63/120.61  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2423(VarNext)<->v2421(VarCurr))).
% 121.63/120.61  all VarCurr (v2421(VarCurr)<->v2424(VarCurr)&v2435(VarCurr)).
% 121.63/120.61  all VarCurr (v2435(VarCurr)<->v2436(VarCurr)|v2378(VarCurr)).
% 121.63/120.61  all VarCurr (-v2436(VarCurr)<->v2437(VarCurr)).
% 121.63/120.61  all VarCurr (v2437(VarCurr)<-> (v2438(VarCurr,bitIndex1)<->$F)& (v2438(VarCurr,bitIndex0)<->$F)).
% 121.63/120.61  all VarCurr (v2438(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.61  all VarCurr (v2438(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.61  all VarCurr (v2424(VarCurr)<->v2378(VarCurr)|v2425(VarCurr)).
% 121.63/120.61  all VarCurr (v2425(VarCurr)<->v2426(VarCurr)&v2434(VarCurr)).
% 121.63/120.61  all VarCurr (-v2434(VarCurr)<->v2378(VarCurr)).
% 121.63/120.61  all VarCurr (v2426(VarCurr)<->v2427(VarCurr)|v2432(VarCurr)).
% 121.63/120.61  all VarCurr (v2432(VarCurr)<-> (v2433(VarCurr,bitIndex1)<->$T)& (v2433(VarCurr,bitIndex0)<->$T)).
% 121.63/120.61  all VarCurr (v2433(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.61  all VarCurr (v2433(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.61  all VarCurr (v2427(VarCurr)<->v2428(VarCurr)|v2430(VarCurr)).
% 121.63/120.61  all VarCurr (v2430(VarCurr)<-> (v2431(VarCurr,bitIndex1)<->$T)& (v2431(VarCurr,bitIndex0)<->$F)).
% 121.63/120.61  all VarCurr (v2431(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.61  all VarCurr (v2431(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.61  all VarCurr (v2428(VarCurr)<-> (v2429(VarCurr,bitIndex1)<->$F)& (v2429(VarCurr,bitIndex0)<->$T)).
% 121.63/120.61  all VarCurr (v2429(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.61  all VarCurr (v2429(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.61  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2416(VarNext)<->v2418(VarNext)&v2077(VarNext))).
% 121.63/120.61  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2418(VarNext)<->v2086(VarNext))).
% 121.63/120.61  all VarCurr B (range_7_0(B)-> (v2407(VarCurr,B)<->v2412(VarCurr,B))).
% 121.63/120.61  all VarCurr (-v2409(VarCurr)-> (all B (range_130_0(B)-> (v2412(VarCurr,B)<->v2411(VarCurr,B))))).
% 121.63/120.61  all VarCurr (v2409(VarCurr)-> (all B (range_130_0(B)-> (v2412(VarCurr,B)<->v2131(VarCurr,B))))).
% 121.63/120.61  all VarCurr ((v2411(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex269))& (v2411(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex268))& (v2411(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex267))& (v2411(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex266))& (v2411(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex265))& (v2411(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex264))& (v2411(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex263))& (v2411(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex262))).
% 121.63/120.61  all VarCurr (v2409(VarCurr)<->v2070(VarCurr,bitIndex1)).
% 121.63/120.61  all VarCurr B (range_7_0(B)-> (v2400(VarCurr,B)<->v2405(VarCurr,B))).
% 121.63/120.61  all VarCurr (-v2402(VarCurr)-> (all B (range_130_0(B)-> (v2405(VarCurr,B)<->v2404(VarCurr,B))))).
% 121.63/120.61  all VarCurr (v2402(VarCurr)-> (all B (range_130_0(B)-> (v2405(VarCurr,B)<->v2131(VarCurr,B))))).
% 121.63/120.61  all VarCurr ((v2404(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex400))& (v2404(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex399))& (v2404(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex398))& (v2404(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex397))& (v2404(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex396))& (v2404(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex395))& (v2404(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex394))& (v2404(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex393))).
% 121.63/120.61  all VarCurr (v2402(VarCurr)<->v2070(VarCurr,bitIndex1)).
% 121.63/120.61  all VarNext ((v1181(VarNext,bitIndex269)<->v2367(VarNext,bitIndex7))& (v1181(VarNext,bitIndex268)<->v2367(VarNext,bitIndex6))& (v1181(VarNext,bitIndex267)<->v2367(VarNext,bitIndex5))& (v1181(VarNext,bitIndex266)<->v2367(VarNext,bitIndex4))& (v1181(VarNext,bitIndex265)<->v2367(VarNext,bitIndex3))& (v1181(VarNext,bitIndex264)<->v2367(VarNext,bitIndex2))& (v1181(VarNext,bitIndex263)<->v2367(VarNext,bitIndex1))& (v1181(VarNext,bitIndex262)<->v2367(VarNext,bitIndex0))).
% 121.63/120.61  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2368(VarNext)-> (v2367(VarNext,bitIndex130)<->v1181(VarCurr,bitIndex392))& (v2367(VarNext,bitIndex129)<->v1181(VarCurr,bitIndex391))& (v2367(VarNext,bitIndex128)<->v1181(VarCurr,bitIndex390))& (v2367(VarNext,bitIndex127)<->v1181(VarCurr,bitIndex389))& (v2367(VarNext,bitIndex126)<->v1181(VarCurr,bitIndex388))& (v2367(VarNext,bitIndex125)<->v1181(VarCurr,bitIndex387))& (v2367(VarNext,bitIndex124)<->v1181(VarCurr,bitIndex386))& (v2367(VarNext,bitIndex123)<->v1181(VarCurr,bitIndex385))& (v2367(VarNext,bitIndex122)<->v1181(VarCurr,bitIndex384))& (v2367(VarNext,bitIndex121)<->v1181(VarCurr,bitIndex383))& (v2367(VarNext,bitIndex120)<->v1181(VarCurr,bitIndex382))& (v2367(VarNext,bitIndex119)<->v1181(VarCurr,bitIndex381))& (v2367(VarNext,bitIndex118)<->v1181(VarCurr,bitIndex380))& (v2367(VarNext,bitIndex117)<->v1181(VarCurr,bitIndex379))& (v2367(VarNext,bitIndex116)<->v1181(VarCurr,bitIndex378))& (v2367(VarNext,bitIndex115)<->v1181(VarCurr,bitIndex377))& (v2367(VarNext,bitIndex114)<->v1181(VarCurr,bitIndex376))& (v2367(VarNext,bitIndex113)<->v1181(VarCurr,bitIndex375))& (v2367(VarNext,bitIndex112)<->v1181(VarCurr,bitIndex374))& (v2367(VarNext,bitIndex111)<->v1181(VarCurr,bitIndex373))& (v2367(VarNext,bitIndex110)<->v1181(VarCurr,bitIndex372))& (v2367(VarNext,bitIndex109)<->v1181(VarCurr,bitIndex371))& (v2367(VarNext,bitIndex108)<->v1181(VarCurr,bitIndex370))& (v2367(VarNext,bitIndex107)<->v1181(VarCurr,bitIndex369))& (v2367(VarNext,bitIndex106)<->v1181(VarCurr,bitIndex368))& (v2367(VarNext,bitIndex105)<->v1181(VarCurr,bitIndex367))& (v2367(VarNext,bitIndex104)<->v1181(VarCurr,bitIndex366))& (v2367(VarNext,bitIndex103)<->v1181(VarCurr,bitIndex365))& (v2367(VarNext,bitIndex102)<->v1181(VarCurr,bitIndex364))& (v2367(VarNext,bitIndex101)<->v1181(VarCurr,bitIndex363))& (v2367(VarNext,bitIndex100)<->v1181(VarCurr,bitIndex362))& (v2367(VarNext,bitIndex99)<->v1181(VarCurr,bitIndex361))& 
(v2367(VarNext,bitIndex98)<->v1181(VarCurr,bitIndex360))& (v2367(VarNext,bitIndex97)<->v1181(VarCurr,bitIndex359))& (v2367(VarNext,bitIndex96)<->v1181(VarCurr,bitIndex358))& (v2367(VarNext,bitIndex95)<->v1181(VarCurr,bitIndex357))& (v2367(VarNext,bitIndex94)<->v1181(VarCurr,bitIndex356))& (v2367(VarNext,bitIndex93)<->v1181(VarCurr,bitIndex355))& (v2367(VarNext,bitIndex92)<->v1181(VarCurr,bitIndex354))& (v2367(VarNext,bitIndex91)<->v1181(VarCurr,bitIndex353))& (v2367(VarNext,bitIndex90)<->v1181(VarCurr,bitIndex352))& (v2367(VarNext,bitIndex89)<->v1181(VarCurr,bitIndex351))& (v2367(VarNext,bitIndex88)<->v1181(VarCurr,bitIndex350))& (v2367(VarNext,bitIndex87)<->v1181(VarCurr,bitIndex349))& (v2367(VarNext,bitIndex86)<->v1181(VarCurr,bitIndex348))& (v2367(VarNext,bitIndex85)<->v1181(VarCurr,bitIndex347))& (v2367(VarNext,bitIndex84)<->v1181(VarCurr,bitIndex346))& (v2367(VarNext,bitIndex83)<->v1181(VarCurr,bitIndex345))& (v2367(VarNext,bitIndex82)<->v1181(VarCurr,bitIndex344))& (v2367(VarNext,bitIndex81)<->v1181(VarCurr,bitIndex343))& (v2367(VarNext,bitIndex80)<->v1181(VarCurr,bitIndex342))& (v2367(VarNext,bitIndex79)<->v1181(VarCurr,bitIndex341))& (v2367(VarNext,bitIndex78)<->v1181(VarCurr,bitIndex340))& (v2367(VarNext,bitIndex77)<->v1181(VarCurr,bitIndex339))& (v2367(VarNext,bitIndex76)<->v1181(VarCurr,bitIndex338))& (v2367(VarNext,bitIndex75)<->v1181(VarCurr,bitIndex337))& (v2367(VarNext,bitIndex74)<->v1181(VarCurr,bitIndex336))& (v2367(VarNext,bitIndex73)<->v1181(VarCurr,bitIndex335))& (v2367(VarNext,bitIndex72)<->v1181(VarCurr,bitIndex334))& (v2367(VarNext,bitIndex71)<->v1181(VarCurr,bitIndex333))& (v2367(VarNext,bitIndex70)<->v1181(VarCurr,bitIndex332))& (v2367(VarNext,bitIndex69)<->v1181(VarCurr,bitIndex331))& (v2367(VarNext,bitIndex68)<->v1181(VarCurr,bitIndex330))& (v2367(VarNext,bitIndex67)<->v1181(VarCurr,bitIndex329))& (v2367(VarNext,bitIndex66)<->v1181(VarCurr,bitIndex328))& (v2367(VarNext,bitIndex65)<->v1181(VarCurr,bitIndex327))& 
(v2367(VarNext,bitIndex64)<->v1181(VarCurr,bitIndex326))& (v2367(VarNext,bitIndex63)<->v1181(VarCurr,bitIndex325))& (v2367(VarNext,bitIndex62)<->v1181(VarCurr,bitIndex324))& (v2367(VarNext,bitIndex61)<->v1181(VarCurr,bitIndex323))& (v2367(VarNext,bitIndex60)<->v1181(VarCurr,bitIndex322))& (v2367(VarNext,bitIndex59)<->v1181(VarCurr,bitIndex321))& (v2367(VarNext,bitIndex58)<->v1181(VarCurr,bitIndex320))& (v2367(VarNext,bitIndex57)<->v1181(VarCurr,bitIndex319))& (v2367(VarNext,bitIndex56)<->v1181(VarCurr,bitIndex318))& (v2367(VarNext,bitIndex55)<->v1181(VarCurr,bitIndex317))& (v2367(VarNext,bitIndex54)<->v1181(VarCurr,bitIndex316))& (v2367(VarNext,bitIndex53)<->v1181(VarCurr,bitIndex315))& (v2367(VarNext,bitIndex52)<->v1181(VarCurr,bitIndex314))& (v2367(VarNext,bitIndex51)<->v1181(VarCurr,bitIndex313))& (v2367(VarNext,bitIndex50)<->v1181(VarCurr,bitIndex312))& (v2367(VarNext,bitIndex49)<->v1181(VarCurr,bitIndex311))& (v2367(VarNext,bitIndex48)<->v1181(VarCurr,bitIndex310))& (v2367(VarNext,bitIndex47)<->v1181(VarCurr,bitIndex309))& (v2367(VarNext,bitIndex46)<->v1181(VarCurr,bitIndex308))& (v2367(VarNext,bitIndex45)<->v1181(VarCurr,bitIndex307))& (v2367(VarNext,bitIndex44)<->v1181(VarCurr,bitIndex306))& (v2367(VarNext,bitIndex43)<->v1181(VarCurr,bitIndex305))& (v2367(VarNext,bitIndex42)<->v1181(VarCurr,bitIndex304))& (v2367(VarNext,bitIndex41)<->v1181(VarCurr,bitIndex303))& (v2367(VarNext,bitIndex40)<->v1181(VarCurr,bitIndex302))& (v2367(VarNext,bitIndex39)<->v1181(VarCurr,bitIndex301))& (v2367(VarNext,bitIndex38)<->v1181(VarCurr,bitIndex300))& (v2367(VarNext,bitIndex37)<->v1181(VarCurr,bitIndex299))& (v2367(VarNext,bitIndex36)<->v1181(VarCurr,bitIndex298))& (v2367(VarNext,bitIndex35)<->v1181(VarCurr,bitIndex297))& (v2367(VarNext,bitIndex34)<->v1181(VarCurr,bitIndex296))& (v2367(VarNext,bitIndex33)<->v1181(VarCurr,bitIndex295))& (v2367(VarNext,bitIndex32)<->v1181(VarCurr,bitIndex294))& (v2367(VarNext,bitIndex31)<->v1181(VarCurr,bitIndex293))& 
(v2367(VarNext,bitIndex30)<->v1181(VarCurr,bitIndex292))& (v2367(VarNext,bitIndex29)<->v1181(VarCurr,bitIndex291))& (v2367(VarNext,bitIndex28)<->v1181(VarCurr,bitIndex290))& (v2367(VarNext,bitIndex27)<->v1181(VarCurr,bitIndex289))& (v2367(VarNext,bitIndex26)<->v1181(VarCurr,bitIndex288))& (v2367(VarNext,bitIndex25)<->v1181(VarCurr,bitIndex287))& (v2367(VarNext,bitIndex24)<->v1181(VarCurr,bitIndex286))& (v2367(VarNext,bitIndex23)<->v1181(VarCurr,bitIndex285))& (v2367(VarNext,bitIndex22)<->v1181(VarCurr,bitIndex284))& (v2367(VarNext,bitIndex21)<->v1181(VarCurr,bitIndex283))& (v2367(VarNext,bitIndex20)<->v1181(VarCurr,bitIndex282))& (v2367(VarNext,bitIndex19)<->v1181(VarCurr,bitIndex281))& (v2367(VarNext,bitIndex18)<->v1181(VarCurr,bitIndex280))& (v2367(VarNext,bitIndex17)<->v1181(VarCurr,bitIndex279))& (v2367(VarNext,bitIndex16)<->v1181(VarCurr,bitIndex278))& (v2367(VarNext,bitIndex15)<->v1181(VarCurr,bitIndex277))& (v2367(VarNext,bitIndex14)<->v1181(VarCurr,bitIndex276))& (v2367(VarNext,bitIndex13)<->v1181(VarCurr,bitIndex275))& (v2367(VarNext,bitIndex12)<->v1181(VarCurr,bitIndex274))& (v2367(VarNext,bitIndex11)<->v1181(VarCurr,bitIndex273))& (v2367(VarNext,bitIndex10)<->v1181(VarCurr,bitIndex272))& (v2367(VarNext,bitIndex9)<->v1181(VarCurr,bitIndex271))& (v2367(VarNext,bitIndex8)<->v1181(VarCurr,bitIndex270))& (v2367(VarNext,bitIndex7)<->v1181(VarCurr,bitIndex269))& (v2367(VarNext,bitIndex6)<->v1181(VarCurr,bitIndex268))& (v2367(VarNext,bitIndex5)<->v1181(VarCurr,bitIndex267))& (v2367(VarNext,bitIndex4)<->v1181(VarCurr,bitIndex266))& (v2367(VarNext,bitIndex3)<->v1181(VarCurr,bitIndex265))& (v2367(VarNext,bitIndex2)<->v1181(VarCurr,bitIndex264))& (v2367(VarNext,bitIndex1)<->v1181(VarCurr,bitIndex263))& (v2367(VarNext,bitIndex0)<->v1181(VarCurr,bitIndex262)))).
% 121.63/120.62  all VarNext (v2368(VarNext)-> (all B (range_130_0(B)-> (v2367(VarNext,B)<->v2395(VarNext,B))))).
% 121.63/120.62  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v2395(VarNext,B)<->v2393(VarCurr,B))))).
% 121.63/120.62  all VarCurr (-v2378(VarCurr)-> (all B (range_130_0(B)-> (v2393(VarCurr,B)<->v2396(VarCurr,B))))).
% 121.63/120.62  all VarCurr (v2378(VarCurr)-> (all B (range_130_0(B)-> (v2393(VarCurr,B)<->$F)))).
% 121.63/120.62  all VarCurr (-v2382(VarCurr)& -v2384(VarCurr)-> (all B (range_130_0(B)-> (v2396(VarCurr,B)<->v2360(VarCurr,B))))).
% 121.63/120.62  all VarCurr (v2384(VarCurr)-> (all B (range_130_0(B)-> (v2396(VarCurr,B)<->v2066(VarCurr,B))))).
% 121.63/120.62  all VarCurr (v2382(VarCurr)-> (v2396(VarCurr,bitIndex130)<->v1181(VarCurr,bitIndex261))& (v2396(VarCurr,bitIndex129)<->v1181(VarCurr,bitIndex260))& (v2396(VarCurr,bitIndex128)<->v1181(VarCurr,bitIndex259))& (v2396(VarCurr,bitIndex127)<->v1181(VarCurr,bitIndex258))& (v2396(VarCurr,bitIndex126)<->v1181(VarCurr,bitIndex257))& (v2396(VarCurr,bitIndex125)<->v1181(VarCurr,bitIndex256))& (v2396(VarCurr,bitIndex124)<->v1181(VarCurr,bitIndex255))& (v2396(VarCurr,bitIndex123)<->v1181(VarCurr,bitIndex254))& (v2396(VarCurr,bitIndex122)<->v1181(VarCurr,bitIndex253))& (v2396(VarCurr,bitIndex121)<->v1181(VarCurr,bitIndex252))& (v2396(VarCurr,bitIndex120)<->v1181(VarCurr,bitIndex251))& (v2396(VarCurr,bitIndex119)<->v1181(VarCurr,bitIndex250))& (v2396(VarCurr,bitIndex118)<->v1181(VarCurr,bitIndex249))& (v2396(VarCurr,bitIndex117)<->v1181(VarCurr,bitIndex248))& (v2396(VarCurr,bitIndex116)<->v1181(VarCurr,bitIndex247))& (v2396(VarCurr,bitIndex115)<->v1181(VarCurr,bitIndex246))& (v2396(VarCurr,bitIndex114)<->v1181(VarCurr,bitIndex245))& (v2396(VarCurr,bitIndex113)<->v1181(VarCurr,bitIndex244))& (v2396(VarCurr,bitIndex112)<->v1181(VarCurr,bitIndex243))& (v2396(VarCurr,bitIndex111)<->v1181(VarCurr,bitIndex242))& (v2396(VarCurr,bitIndex110)<->v1181(VarCurr,bitIndex241))& (v2396(VarCurr,bitIndex109)<->v1181(VarCurr,bitIndex240))& (v2396(VarCurr,bitIndex108)<->v1181(VarCurr,bitIndex239))& (v2396(VarCurr,bitIndex107)<->v1181(VarCurr,bitIndex238))& (v2396(VarCurr,bitIndex106)<->v1181(VarCurr,bitIndex237))& (v2396(VarCurr,bitIndex105)<->v1181(VarCurr,bitIndex236))& (v2396(VarCurr,bitIndex104)<->v1181(VarCurr,bitIndex235))& (v2396(VarCurr,bitIndex103)<->v1181(VarCurr,bitIndex234))& (v2396(VarCurr,bitIndex102)<->v1181(VarCurr,bitIndex233))& (v2396(VarCurr,bitIndex101)<->v1181(VarCurr,bitIndex232))& (v2396(VarCurr,bitIndex100)<->v1181(VarCurr,bitIndex231))& (v2396(VarCurr,bitIndex99)<->v1181(VarCurr,bitIndex230))& (v2396(VarCurr,bitIndex98)<->v1181(VarCurr,bitIndex229))& 
(v2396(VarCurr,bitIndex97)<->v1181(VarCurr,bitIndex228))& (v2396(VarCurr,bitIndex96)<->v1181(VarCurr,bitIndex227))& (v2396(VarCurr,bitIndex95)<->v1181(VarCurr,bitIndex226))& (v2396(VarCurr,bitIndex94)<->v1181(VarCurr,bitIndex225))& (v2396(VarCurr,bitIndex93)<->v1181(VarCurr,bitIndex224))& (v2396(VarCurr,bitIndex92)<->v1181(VarCurr,bitIndex223))& (v2396(VarCurr,bitIndex91)<->v1181(VarCurr,bitIndex222))& (v2396(VarCurr,bitIndex90)<->v1181(VarCurr,bitIndex221))& (v2396(VarCurr,bitIndex89)<->v1181(VarCurr,bitIndex220))& (v2396(VarCurr,bitIndex88)<->v1181(VarCurr,bitIndex219))& (v2396(VarCurr,bitIndex87)<->v1181(VarCurr,bitIndex218))& (v2396(VarCurr,bitIndex86)<->v1181(VarCurr,bitIndex217))& (v2396(VarCurr,bitIndex85)<->v1181(VarCurr,bitIndex216))& (v2396(VarCurr,bitIndex84)<->v1181(VarCurr,bitIndex215))& (v2396(VarCurr,bitIndex83)<->v1181(VarCurr,bitIndex214))& (v2396(VarCurr,bitIndex82)<->v1181(VarCurr,bitIndex213))& (v2396(VarCurr,bitIndex81)<->v1181(VarCurr,bitIndex212))& (v2396(VarCurr,bitIndex80)<->v1181(VarCurr,bitIndex211))& (v2396(VarCurr,bitIndex79)<->v1181(VarCurr,bitIndex210))& (v2396(VarCurr,bitIndex78)<->v1181(VarCurr,bitIndex209))& (v2396(VarCurr,bitIndex77)<->v1181(VarCurr,bitIndex208))& (v2396(VarCurr,bitIndex76)<->v1181(VarCurr,bitIndex207))& (v2396(VarCurr,bitIndex75)<->v1181(VarCurr,bitIndex206))& (v2396(VarCurr,bitIndex74)<->v1181(VarCurr,bitIndex205))& (v2396(VarCurr,bitIndex73)<->v1181(VarCurr,bitIndex204))& (v2396(VarCurr,bitIndex72)<->v1181(VarCurr,bitIndex203))& (v2396(VarCurr,bitIndex71)<->v1181(VarCurr,bitIndex202))& (v2396(VarCurr,bitIndex70)<->v1181(VarCurr,bitIndex201))& (v2396(VarCurr,bitIndex69)<->v1181(VarCurr,bitIndex200))& (v2396(VarCurr,bitIndex68)<->v1181(VarCurr,bitIndex199))& (v2396(VarCurr,bitIndex67)<->v1181(VarCurr,bitIndex198))& (v2396(VarCurr,bitIndex66)<->v1181(VarCurr,bitIndex197))& (v2396(VarCurr,bitIndex65)<->v1181(VarCurr,bitIndex196))& (v2396(VarCurr,bitIndex64)<->v1181(VarCurr,bitIndex195))& 
(v2396(VarCurr,bitIndex63)<->v1181(VarCurr,bitIndex194))& (v2396(VarCurr,bitIndex62)<->v1181(VarCurr,bitIndex193))& (v2396(VarCurr,bitIndex61)<->v1181(VarCurr,bitIndex192))& (v2396(VarCurr,bitIndex60)<->v1181(VarCurr,bitIndex191))& (v2396(VarCurr,bitIndex59)<->v1181(VarCurr,bitIndex190))& (v2396(VarCurr,bitIndex58)<->v1181(VarCurr,bitIndex189))& (v2396(VarCurr,bitIndex57)<->v1181(VarCurr,bitIndex188))& (v2396(VarCurr,bitIndex56)<->v1181(VarCurr,bitIndex187))& (v2396(VarCurr,bitIndex55)<->v1181(VarCurr,bitIndex186))& (v2396(VarCurr,bitIndex54)<->v1181(VarCurr,bitIndex185))& (v2396(VarCurr,bitIndex53)<->v1181(VarCurr,bitIndex184))& (v2396(VarCurr,bitIndex52)<->v1181(VarCurr,bitIndex183))& (v2396(VarCurr,bitIndex51)<->v1181(VarCurr,bitIndex182))& (v2396(VarCurr,bitIndex50)<->v1181(VarCurr,bitIndex181))& (v2396(VarCurr,bitIndex49)<->v1181(VarCurr,bitIndex180))& (v2396(VarCurr,bitIndex48)<->v1181(VarCurr,bitIndex179))& (v2396(VarCurr,bitIndex47)<->v1181(VarCurr,bitIndex178))& (v2396(VarCurr,bitIndex46)<->v1181(VarCurr,bitIndex177))& (v2396(VarCurr,bitIndex45)<->v1181(VarCurr,bitIndex176))& (v2396(VarCurr,bitIndex44)<->v1181(VarCurr,bitIndex175))& (v2396(VarCurr,bitIndex43)<->v1181(VarCurr,bitIndex174))& (v2396(VarCurr,bitIndex42)<->v1181(VarCurr,bitIndex173))& (v2396(VarCurr,bitIndex41)<->v1181(VarCurr,bitIndex172))& (v2396(VarCurr,bitIndex40)<->v1181(VarCurr,bitIndex171))& (v2396(VarCurr,bitIndex39)<->v1181(VarCurr,bitIndex170))& (v2396(VarCurr,bitIndex38)<->v1181(VarCurr,bitIndex169))& (v2396(VarCurr,bitIndex37)<->v1181(VarCurr,bitIndex168))& (v2396(VarCurr,bitIndex36)<->v1181(VarCurr,bitIndex167))& (v2396(VarCurr,bitIndex35)<->v1181(VarCurr,bitIndex166))& (v2396(VarCurr,bitIndex34)<->v1181(VarCurr,bitIndex165))& (v2396(VarCurr,bitIndex33)<->v1181(VarCurr,bitIndex164))& (v2396(VarCurr,bitIndex32)<->v1181(VarCurr,bitIndex163))& (v2396(VarCurr,bitIndex31)<->v1181(VarCurr,bitIndex162))& (v2396(VarCurr,bitIndex30)<->v1181(VarCurr,bitIndex161))& 
(v2396(VarCurr,bitIndex29)<->v1181(VarCurr,bitIndex160))& (v2396(VarCurr,bitIndex28)<->v1181(VarCurr,bitIndex159))& (v2396(VarCurr,bitIndex27)<->v1181(VarCurr,bitIndex158))& (v2396(VarCurr,bitIndex26)<->v1181(VarCurr,bitIndex157))& (v2396(VarCurr,bitIndex25)<->v1181(VarCurr,bitIndex156))& (v2396(VarCurr,bitIndex24)<->v1181(VarCurr,bitIndex155))& (v2396(VarCurr,bitIndex23)<->v1181(VarCurr,bitIndex154))& (v2396(VarCurr,bitIndex22)<->v1181(VarCurr,bitIndex153))& (v2396(VarCurr,bitIndex21)<->v1181(VarCurr,bitIndex152))& (v2396(VarCurr,bitIndex20)<->v1181(VarCurr,bitIndex151))& (v2396(VarCurr,bitIndex19)<->v1181(VarCurr,bitIndex150))& (v2396(VarCurr,bitIndex18)<->v1181(VarCurr,bitIndex149))& (v2396(VarCurr,bitIndex17)<->v1181(VarCurr,bitIndex148))& (v2396(VarCurr,bitIndex16)<->v1181(VarCurr,bitIndex147))& (v2396(VarCurr,bitIndex15)<->v1181(VarCurr,bitIndex146))& (v2396(VarCurr,bitIndex14)<->v1181(VarCurr,bitIndex145))& (v2396(VarCurr,bitIndex13)<->v1181(VarCurr,bitIndex144))& (v2396(VarCurr,bitIndex12)<->v1181(VarCurr,bitIndex143))& (v2396(VarCurr,bitIndex11)<->v1181(VarCurr,bitIndex142))& (v2396(VarCurr,bitIndex10)<->v1181(VarCurr,bitIndex141))& (v2396(VarCurr,bitIndex9)<->v1181(VarCurr,bitIndex140))& (v2396(VarCurr,bitIndex8)<->v1181(VarCurr,bitIndex139))& (v2396(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex138))& (v2396(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex137))& (v2396(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex136))& (v2396(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex135))& (v2396(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex134))& (v2396(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex133))& (v2396(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex132))& (v2396(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex131))).
% 121.63/120.62  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2368(VarNext)<->v2369(VarNext)&v2376(VarNext))).
% 121.63/120.62  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2376(VarNext)<->v2374(VarCurr))).
% 121.63/120.62  all VarCurr (v2374(VarCurr)<->v2377(VarCurr)&v2389(VarCurr)).
% 121.63/120.62  all VarCurr (v2389(VarCurr)<->v2390(VarCurr)|v2378(VarCurr)).
% 121.63/120.62  all VarCurr (-v2390(VarCurr)<->v2391(VarCurr)).
% 121.63/120.62  all VarCurr (v2391(VarCurr)<-> (v2392(VarCurr,bitIndex1)<->$F)& (v2392(VarCurr,bitIndex0)<->$F)).
% 121.63/120.62  all VarCurr (v2392(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.62  all VarCurr (v2392(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.62  all VarCurr (v2377(VarCurr)<->v2378(VarCurr)|v2379(VarCurr)).
% 121.63/120.62  all VarCurr (v2379(VarCurr)<->v2380(VarCurr)&v2388(VarCurr)).
% 121.63/120.62  all VarCurr (-v2388(VarCurr)<->v2378(VarCurr)).
% 121.63/120.62  all VarCurr (v2380(VarCurr)<->v2381(VarCurr)|v2386(VarCurr)).
% 121.63/120.62  all VarCurr (v2386(VarCurr)<-> (v2387(VarCurr,bitIndex1)<->$T)& (v2387(VarCurr,bitIndex0)<->$T)).
% 121.63/120.62  all VarCurr (v2387(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.62  all VarCurr (v2387(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.62  all VarCurr (v2381(VarCurr)<->v2382(VarCurr)|v2384(VarCurr)).
% 121.63/120.62  all VarCurr (v2384(VarCurr)<-> (v2385(VarCurr,bitIndex1)<->$T)& (v2385(VarCurr,bitIndex0)<->$F)).
% 121.63/120.62  all VarCurr (v2385(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.62  all VarCurr (v2385(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.62  all VarCurr (v2382(VarCurr)<-> (v2383(VarCurr,bitIndex1)<->$F)& (v2383(VarCurr,bitIndex0)<->$T)).
% 121.63/120.62  all VarCurr (v2383(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.62  all VarCurr (v2383(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.62  all VarCurr (-v2378(VarCurr)<->v1183(VarCurr)).
% 121.63/120.62  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2369(VarNext)<->v2370(VarNext)&v2077(VarNext))).
% 121.63/120.62  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2370(VarNext)<->v2086(VarNext))).
% 121.63/120.62  all VarCurr B (range_7_0(B)-> (v2360(VarCurr,B)<->v2365(VarCurr,B))).
% 121.63/120.62  all VarCurr (-v2362(VarCurr)-> (all B (range_130_0(B)-> (v2365(VarCurr,B)<->v2364(VarCurr,B))))).
% 121.63/120.62  all VarCurr (v2362(VarCurr)-> (all B (range_130_0(B)-> (v2365(VarCurr,B)<->v2131(VarCurr,B))))).
% 121.63/120.62  all VarCurr ((v2364(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex138))& (v2364(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex137))& (v2364(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex136))& (v2364(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex135))& (v2364(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex134))& (v2364(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex133))& (v2364(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex132))& (v2364(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex131))).
% 121.63/120.62  all VarCurr (v2362(VarCurr)<->v2070(VarCurr,bitIndex2)).
% 121.63/120.62  all VarCurr B (range_7_0(B)-> (v2066(VarCurr,B)<->v2358(VarCurr,B))).
% 121.63/120.62  all VarCurr (-v2068(VarCurr)-> (all B (range_130_0(B)-> (v2358(VarCurr,B)<->v2351(VarCurr,B))))).
% 121.63/120.62  all VarCurr (v2068(VarCurr)-> (all B (range_130_0(B)-> (v2358(VarCurr,B)<->v2131(VarCurr,B))))).
% 121.63/120.62  all VarCurr ((v2351(VarCurr,bitIndex7)<->v1181(VarCurr,bitIndex269))& (v2351(VarCurr,bitIndex6)<->v1181(VarCurr,bitIndex268))& (v2351(VarCurr,bitIndex5)<->v1181(VarCurr,bitIndex267))& (v2351(VarCurr,bitIndex4)<->v1181(VarCurr,bitIndex266))& (v2351(VarCurr,bitIndex3)<->v1181(VarCurr,bitIndex265))& (v2351(VarCurr,bitIndex2)<->v1181(VarCurr,bitIndex264))& (v2351(VarCurr,bitIndex1)<->v1181(VarCurr,bitIndex263))& (v2351(VarCurr,bitIndex0)<->v1181(VarCurr,bitIndex262))).
% 121.63/120.62  -v1181(constB0,bitIndex400).
% 121.63/120.62  -v1181(constB0,bitIndex399).
% 121.63/120.62  -v1181(constB0,bitIndex398).
% 121.63/120.62  -v1181(constB0,bitIndex397).
% 121.63/120.62  -v1181(constB0,bitIndex396).
% 121.63/120.62  -v1181(constB0,bitIndex395).
% 121.63/120.62  -v1181(constB0,bitIndex394).
% 121.63/120.62  -v1181(constB0,bitIndex393).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.63/120.62  -v1181(constB0,bitIndex269).
% 121.63/120.62  -v1181(constB0,bitIndex268).
% 121.63/120.62  -v1181(constB0,bitIndex267).
% 121.63/120.62  -v1181(constB0,bitIndex266).
% 121.63/120.62  -v1181(constB0,bitIndex265).
% 121.63/120.62  -v1181(constB0,bitIndex264).
% 121.63/120.62  -v1181(constB0,bitIndex263).
% 121.63/120.62  -v1181(constB0,bitIndex262).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.63/120.62  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.63/120.63  -v1181(constB0,bitIndex138).
% 121.63/120.63  -v1181(constB0,bitIndex137).
% 121.63/120.63  -v1181(constB0,bitIndex136).
% 121.63/120.63  -v1181(constB0,bitIndex135).
% 121.63/120.63  -v1181(constB0,bitIndex134).
% 121.63/120.63  -v1181(constB0,bitIndex133).
% 121.63/120.63  -v1181(constB0,bitIndex132).
% 121.63/120.63  -v1181(constB0,bitIndex131).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.63/120.63  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2131(VarCurr,B)<->v2133(VarCurr,B))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2133(VarCurr,B)<->v2135(VarCurr,B))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2135(VarCurr,B)<->v2137(VarCurr,B))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2137(VarCurr,B)<->v2349(VarCurr,B))).
% 121.63/120.63  all VarCurr (-v2139(VarCurr)-> (all B (range_130_0(B)-> (v2349(VarCurr,B)<->v2337(VarCurr,B))))).
% 121.63/120.63  all VarCurr (v2139(VarCurr)-> (all B (range_130_0(B)-> (v2349(VarCurr,B)<->v2143(VarCurr,B))))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2337(VarCurr,B)<->v2338(VarCurr,B))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2338(VarCurr,B)<->v2147(VarCurr,B))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex14)<->v2348(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex13)<->v2348(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex12)<->v2348(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex11)<->v2348(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex10)<->v2348(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex9)<->v2348(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex8)<->v2348(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex76)<->v2347(VarCurr,bitIndex61))& (v2338(VarCurr,bitIndex75)<->v2347(VarCurr,bitIndex60))& (v2338(VarCurr,bitIndex74)<->v2347(VarCurr,bitIndex59))& (v2338(VarCurr,bitIndex73)<->v2347(VarCurr,bitIndex58))& (v2338(VarCurr,bitIndex72)<->v2347(VarCurr,bitIndex57))& (v2338(VarCurr,bitIndex71)<->v2347(VarCurr,bitIndex56))& (v2338(VarCurr,bitIndex70)<->v2347(VarCurr,bitIndex55))& (v2338(VarCurr,bitIndex69)<->v2347(VarCurr,bitIndex54))& (v2338(VarCurr,bitIndex68)<->v2347(VarCurr,bitIndex53))& (v2338(VarCurr,bitIndex67)<->v2347(VarCurr,bitIndex52))& (v2338(VarCurr,bitIndex66)<->v2347(VarCurr,bitIndex51))& (v2338(VarCurr,bitIndex65)<->v2347(VarCurr,bitIndex50))& (v2338(VarCurr,bitIndex64)<->v2347(VarCurr,bitIndex49))& (v2338(VarCurr,bitIndex63)<->v2347(VarCurr,bitIndex48))& (v2338(VarCurr,bitIndex62)<->v2347(VarCurr,bitIndex47))& (v2338(VarCurr,bitIndex61)<->v2347(VarCurr,bitIndex46))& (v2338(VarCurr,bitIndex60)<->v2347(VarCurr,bitIndex45))& (v2338(VarCurr,bitIndex59)<->v2347(VarCurr,bitIndex44))& (v2338(VarCurr,bitIndex58)<->v2347(VarCurr,bitIndex43))& (v2338(VarCurr,bitIndex57)<->v2347(VarCurr,bitIndex42))& (v2338(VarCurr,bitIndex56)<->v2347(VarCurr,bitIndex41))& (v2338(VarCurr,bitIndex55)<->v2347(VarCurr,bitIndex40))& (v2338(VarCurr,bitIndex54)<->v2347(VarCurr,bitIndex39))& (v2338(VarCurr,bitIndex53)<->v2347(VarCurr,bitIndex38))& (v2338(VarCurr,bitIndex52)<->v2347(VarCurr,bitIndex37))& (v2338(VarCurr,bitIndex51)<->v2347(VarCurr,bitIndex36))& (v2338(VarCurr,bitIndex50)<->v2347(VarCurr,bitIndex35))& (v2338(VarCurr,bitIndex49)<->v2347(VarCurr,bitIndex34))& (v2338(VarCurr,bitIndex48)<->v2347(VarCurr,bitIndex33))& (v2338(VarCurr,bitIndex47)<->v2347(VarCurr,bitIndex32))& (v2338(VarCurr,bitIndex46)<->v2347(VarCurr,bitIndex31))& (v2338(VarCurr,bitIndex45)<->v2347(VarCurr,bitIndex30))& (v2338(VarCurr,bitIndex44)<->v2347(VarCurr,bitIndex29))& (v2338(VarCurr,bitIndex43)<->v2347(VarCurr,bitIndex28))& 
(v2338(VarCurr,bitIndex42)<->v2347(VarCurr,bitIndex27))& (v2338(VarCurr,bitIndex41)<->v2347(VarCurr,bitIndex26))& (v2338(VarCurr,bitIndex40)<->v2347(VarCurr,bitIndex25))& (v2338(VarCurr,bitIndex39)<->v2347(VarCurr,bitIndex24))& (v2338(VarCurr,bitIndex38)<->v2347(VarCurr,bitIndex23))& (v2338(VarCurr,bitIndex37)<->v2347(VarCurr,bitIndex22))& (v2338(VarCurr,bitIndex36)<->v2347(VarCurr,bitIndex21))& (v2338(VarCurr,bitIndex35)<->v2347(VarCurr,bitIndex20))& (v2338(VarCurr,bitIndex34)<->v2347(VarCurr,bitIndex19))& (v2338(VarCurr,bitIndex33)<->v2347(VarCurr,bitIndex18))& (v2338(VarCurr,bitIndex32)<->v2347(VarCurr,bitIndex17))& (v2338(VarCurr,bitIndex31)<->v2347(VarCurr,bitIndex16))& (v2338(VarCurr,bitIndex30)<->v2347(VarCurr,bitIndex15))& (v2338(VarCurr,bitIndex29)<->v2347(VarCurr,bitIndex14))& (v2338(VarCurr,bitIndex28)<->v2347(VarCurr,bitIndex13))& (v2338(VarCurr,bitIndex27)<->v2347(VarCurr,bitIndex12))& (v2338(VarCurr,bitIndex26)<->v2347(VarCurr,bitIndex11))& (v2338(VarCurr,bitIndex25)<->v2347(VarCurr,bitIndex10))& (v2338(VarCurr,bitIndex24)<->v2347(VarCurr,bitIndex9))& (v2338(VarCurr,bitIndex23)<->v2347(VarCurr,bitIndex8))& (v2338(VarCurr,bitIndex22)<->v2347(VarCurr,bitIndex7))& (v2338(VarCurr,bitIndex21)<->v2347(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex20)<->v2347(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex19)<->v2347(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex18)<->v2347(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex17)<->v2347(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex16)<->v2347(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex15)<->v2347(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex80)<->v2346(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex79)<->v2346(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex78)<->v2346(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex77)<->v2346(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex84)<->v2345(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex83)<->v2345(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex82)<->v2345(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex81)<->v2345(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex92)<->v2344(VarCurr,bitIndex7))& (v2338(VarCurr,bitIndex91)<->v2344(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex90)<->v2344(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex89)<->v2344(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex88)<->v2344(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex87)<->v2344(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex86)<->v2344(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex85)<->v2344(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex108)<->v2343(VarCurr,bitIndex15))& (v2338(VarCurr,bitIndex107)<->v2343(VarCurr,bitIndex14))& (v2338(VarCurr,bitIndex106)<->v2343(VarCurr,bitIndex13))& (v2338(VarCurr,bitIndex105)<->v2343(VarCurr,bitIndex12))& (v2338(VarCurr,bitIndex104)<->v2343(VarCurr,bitIndex11))& (v2338(VarCurr,bitIndex103)<->v2343(VarCurr,bitIndex10))& (v2338(VarCurr,bitIndex102)<->v2343(VarCurr,bitIndex9))& (v2338(VarCurr,bitIndex101)<->v2343(VarCurr,bitIndex8))& (v2338(VarCurr,bitIndex100)<->v2343(VarCurr,bitIndex7))& (v2338(VarCurr,bitIndex99)<->v2343(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex98)<->v2343(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex97)<->v2343(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex96)<->v2343(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex95)<->v2343(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex94)<->v2343(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex93)<->v2343(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex118)<->v2342(VarCurr,bitIndex9))& (v2338(VarCurr,bitIndex117)<->v2342(VarCurr,bitIndex8))& (v2338(VarCurr,bitIndex116)<->v2342(VarCurr,bitIndex7))& (v2338(VarCurr,bitIndex115)<->v2342(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex114)<->v2342(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex113)<->v2342(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex112)<->v2342(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex111)<->v2342(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex110)<->v2342(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex109)<->v2342(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex120)<->v2341(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex119)<->v2341(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex123)<->v2340(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex122)<->v2340(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex121)<->v2340(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2338(VarCurr,bitIndex130)<->v2339(VarCurr,bitIndex6))& (v2338(VarCurr,bitIndex129)<->v2339(VarCurr,bitIndex5))& (v2338(VarCurr,bitIndex128)<->v2339(VarCurr,bitIndex4))& (v2338(VarCurr,bitIndex127)<->v2339(VarCurr,bitIndex3))& (v2338(VarCurr,bitIndex126)<->v2339(VarCurr,bitIndex2))& (v2338(VarCurr,bitIndex125)<->v2339(VarCurr,bitIndex1))& (v2338(VarCurr,bitIndex124)<->v2339(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2143(VarCurr,B)<->v2315(VarCurr,B))).
% 121.63/120.63  all VarCurr B (range_7_0(B)-> (v2315(VarCurr,B)<->v2145(VarCurr,B))).
% 121.63/120.63  all VarCurr ((v2315(VarCurr,bitIndex14)<->v2334(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex13)<->v2334(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex12)<->v2334(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex11)<->v2334(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex10)<->v2334(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex9)<->v2334(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex8)<->v2334(VarCurr,bitIndex0))).
% 121.63/120.63  all VarCurr ((v2315(VarCurr,bitIndex76)<->v2332(VarCurr,bitIndex61))& (v2315(VarCurr,bitIndex75)<->v2332(VarCurr,bitIndex60))& (v2315(VarCurr,bitIndex74)<->v2332(VarCurr,bitIndex59))& (v2315(VarCurr,bitIndex73)<->v2332(VarCurr,bitIndex58))& (v2315(VarCurr,bitIndex72)<->v2332(VarCurr,bitIndex57))& (v2315(VarCurr,bitIndex71)<->v2332(VarCurr,bitIndex56))& (v2315(VarCurr,bitIndex70)<->v2332(VarCurr,bitIndex55))& (v2315(VarCurr,bitIndex69)<->v2332(VarCurr,bitIndex54))& (v2315(VarCurr,bitIndex68)<->v2332(VarCurr,bitIndex53))& (v2315(VarCurr,bitIndex67)<->v2332(VarCurr,bitIndex52))& (v2315(VarCurr,bitIndex66)<->v2332(VarCurr,bitIndex51))& (v2315(VarCurr,bitIndex65)<->v2332(VarCurr,bitIndex50))& (v2315(VarCurr,bitIndex64)<->v2332(VarCurr,bitIndex49))& (v2315(VarCurr,bitIndex63)<->v2332(VarCurr,bitIndex48))& (v2315(VarCurr,bitIndex62)<->v2332(VarCurr,bitIndex47))& (v2315(VarCurr,bitIndex61)<->v2332(VarCurr,bitIndex46))& (v2315(VarCurr,bitIndex60)<->v2332(VarCurr,bitIndex45))& (v2315(VarCurr,bitIndex59)<->v2332(VarCurr,bitIndex44))& (v2315(VarCurr,bitIndex58)<->v2332(VarCurr,bitIndex43))& (v2315(VarCurr,bitIndex57)<->v2332(VarCurr,bitIndex42))& (v2315(VarCurr,bitIndex56)<->v2332(VarCurr,bitIndex41))& (v2315(VarCurr,bitIndex55)<->v2332(VarCurr,bitIndex40))& (v2315(VarCurr,bitIndex54)<->v2332(VarCurr,bitIndex39))& (v2315(VarCurr,bitIndex53)<->v2332(VarCurr,bitIndex38))& (v2315(VarCurr,bitIndex52)<->v2332(VarCurr,bitIndex37))& (v2315(VarCurr,bitIndex51)<->v2332(VarCurr,bitIndex36))& (v2315(VarCurr,bitIndex50)<->v2332(VarCurr,bitIndex35))& (v2315(VarCurr,bitIndex49)<->v2332(VarCurr,bitIndex34))& (v2315(VarCurr,bitIndex48)<->v2332(VarCurr,bitIndex33))& (v2315(VarCurr,bitIndex47)<->v2332(VarCurr,bitIndex32))& (v2315(VarCurr,bitIndex46)<->v2332(VarCurr,bitIndex31))& (v2315(VarCurr,bitIndex45)<->v2332(VarCurr,bitIndex30))& (v2315(VarCurr,bitIndex44)<->v2332(VarCurr,bitIndex29))& (v2315(VarCurr,bitIndex43)<->v2332(VarCurr,bitIndex28))& 
(v2315(VarCurr,bitIndex42)<->v2332(VarCurr,bitIndex27))& (v2315(VarCurr,bitIndex41)<->v2332(VarCurr,bitIndex26))& (v2315(VarCurr,bitIndex40)<->v2332(VarCurr,bitIndex25))& (v2315(VarCurr,bitIndex39)<->v2332(VarCurr,bitIndex24))& (v2315(VarCurr,bitIndex38)<->v2332(VarCurr,bitIndex23))& (v2315(VarCurr,bitIndex37)<->v2332(VarCurr,bitIndex22))& (v2315(VarCurr,bitIndex36)<->v2332(VarCurr,bitIndex21))& (v2315(VarCurr,bitIndex35)<->v2332(VarCurr,bitIndex20))& (v2315(VarCurr,bitIndex34)<->v2332(VarCurr,bitIndex19))& (v2315(VarCurr,bitIndex33)<->v2332(VarCurr,bitIndex18))& (v2315(VarCurr,bitIndex32)<->v2332(VarCurr,bitIndex17))& (v2315(VarCurr,bitIndex31)<->v2332(VarCurr,bitIndex16))& (v2315(VarCurr,bitIndex30)<->v2332(VarCurr,bitIndex15))& (v2315(VarCurr,bitIndex29)<->v2332(VarCurr,bitIndex14))& (v2315(VarCurr,bitIndex28)<->v2332(VarCurr,bitIndex13))& (v2315(VarCurr,bitIndex27)<->v2332(VarCurr,bitIndex12))& (v2315(VarCurr,bitIndex26)<->v2332(VarCurr,bitIndex11))& (v2315(VarCurr,bitIndex25)<->v2332(VarCurr,bitIndex10))& (v2315(VarCurr,bitIndex24)<->v2332(VarCurr,bitIndex9))& (v2315(VarCurr,bitIndex23)<->v2332(VarCurr,bitIndex8))& (v2315(VarCurr,bitIndex22)<->v2332(VarCurr,bitIndex7))& (v2315(VarCurr,bitIndex21)<->v2332(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex20)<->v2332(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex19)<->v2332(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex18)<->v2332(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex17)<->v2332(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex16)<->v2332(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex15)<->v2332(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex80)<->v2330(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex79)<->v2330(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex78)<->v2330(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex77)<->v2330(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex84)<->v2328(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex83)<->v2328(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex82)<->v2328(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex81)<->v2328(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex92)<->v2326(VarCurr,bitIndex7))& (v2315(VarCurr,bitIndex91)<->v2326(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex90)<->v2326(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex89)<->v2326(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex88)<->v2326(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex87)<->v2326(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex86)<->v2326(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex85)<->v2326(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex108)<->v2324(VarCurr,bitIndex15))& (v2315(VarCurr,bitIndex107)<->v2324(VarCurr,bitIndex14))& (v2315(VarCurr,bitIndex106)<->v2324(VarCurr,bitIndex13))& (v2315(VarCurr,bitIndex105)<->v2324(VarCurr,bitIndex12))& (v2315(VarCurr,bitIndex104)<->v2324(VarCurr,bitIndex11))& (v2315(VarCurr,bitIndex103)<->v2324(VarCurr,bitIndex10))& (v2315(VarCurr,bitIndex102)<->v2324(VarCurr,bitIndex9))& (v2315(VarCurr,bitIndex101)<->v2324(VarCurr,bitIndex8))& (v2315(VarCurr,bitIndex100)<->v2324(VarCurr,bitIndex7))& (v2315(VarCurr,bitIndex99)<->v2324(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex98)<->v2324(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex97)<->v2324(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex96)<->v2324(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex95)<->v2324(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex94)<->v2324(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex93)<->v2324(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex118)<->v2322(VarCurr,bitIndex9))& (v2315(VarCurr,bitIndex117)<->v2322(VarCurr,bitIndex8))& (v2315(VarCurr,bitIndex116)<->v2322(VarCurr,bitIndex7))& (v2315(VarCurr,bitIndex115)<->v2322(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex114)<->v2322(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex113)<->v2322(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex112)<->v2322(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex111)<->v2322(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex110)<->v2322(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex109)<->v2322(VarCurr,bitIndex0))).
% 121.63/120.64  all VarCurr ((v2315(VarCurr,bitIndex120)<->v2320(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex119)<->v2320(VarCurr,bitIndex0))).
% 121.63/120.65  all VarCurr ((v2315(VarCurr,bitIndex123)<->v2318(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex122)<->v2318(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex121)<->v2318(VarCurr,bitIndex0))).
% 121.63/120.65  all VarCurr ((v2315(VarCurr,bitIndex130)<->v2316(VarCurr,bitIndex6))& (v2315(VarCurr,bitIndex129)<->v2316(VarCurr,bitIndex5))& (v2315(VarCurr,bitIndex128)<->v2316(VarCurr,bitIndex4))& (v2315(VarCurr,bitIndex127)<->v2316(VarCurr,bitIndex3))& (v2315(VarCurr,bitIndex126)<->v2316(VarCurr,bitIndex2))& (v2315(VarCurr,bitIndex125)<->v2316(VarCurr,bitIndex1))& (v2315(VarCurr,bitIndex124)<->v2316(VarCurr,bitIndex0))).
% 121.63/120.65  all VarCurr B (range_4_0(B)-> (v2326(VarCurr,B)<->v2327(VarCurr,B))).
% 121.63/120.65  all VarCurr ((v2326(VarCurr,bitIndex7)<->$F)& (v2326(VarCurr,bitIndex6)<->$F)& (v2326(VarCurr,bitIndex5)<->$F)).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2300(VarNext)-> (all B (range_7_0(B)-> (v2145(VarNext,B)<->v2145(VarCurr,B)))))).
% 121.63/120.65  all VarNext (v2300(VarNext)-> (all B (range_7_0(B)-> (v2145(VarNext,B)<->v2312(VarNext,B))))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_7_0(B)-> (v2312(VarNext,B)<->v2310(VarCurr,B))))).
% 121.63/120.65  all VarCurr (-v2309(VarCurr)-> (all B (range_7_0(B)-> (v2310(VarCurr,B)<->v2147(VarCurr,B))))).
% 121.63/120.65  all VarCurr (v2309(VarCurr)-> (all B (range_7_0(B)-> (v2310(VarCurr,B)<->$F)))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2300(VarNext)<->v2301(VarNext)&v2308(VarNext))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2308(VarNext)<->v2306(VarCurr))).
% 121.63/120.65  all VarCurr (v2306(VarCurr)<->v2297(VarCurr)|v2309(VarCurr)).
% 121.63/120.65  all VarCurr (-v2309(VarCurr)<->v8(VarCurr)).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2301(VarNext)<->v2302(VarNext)&v1252(VarNext))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2302(VarNext)<->v1259(VarNext))).
% 121.63/120.65  all B (range_7_0(B)-> (v2145(constB0,B)<->$F)).
% 121.63/120.65  all VarCurr (v2297(VarCurr)<->v6(VarCurr,bitIndex1)).
% 121.63/120.65  all VarCurr B (range_7_0(B)-> (v2147(VarCurr,B)<->v2149(VarCurr,B))).
% 121.63/120.65  all VarCurr B (range_7_0(B)-> (v2149(VarCurr,B)<->v2151(VarCurr,B))).
% 121.63/120.65  all VarCurr ((v2151(VarCurr,bitIndex7)<->v2153(VarCurr,bitIndex400))& (v2151(VarCurr,bitIndex6)<->v2153(VarCurr,bitIndex399))& (v2151(VarCurr,bitIndex5)<->v2153(VarCurr,bitIndex398))& (v2151(VarCurr,bitIndex4)<->v2153(VarCurr,bitIndex397))& (v2151(VarCurr,bitIndex3)<->v2153(VarCurr,bitIndex396))& (v2151(VarCurr,bitIndex2)<->v2153(VarCurr,bitIndex395))& (v2151(VarCurr,bitIndex1)<->v2153(VarCurr,bitIndex394))& (v2151(VarCurr,bitIndex0)<->v2153(VarCurr,bitIndex393))).
% 121.63/120.65  all VarNext ((v2153(VarNext,bitIndex400)<->v2264(VarNext,bitIndex7))& (v2153(VarNext,bitIndex399)<->v2264(VarNext,bitIndex6))& (v2153(VarNext,bitIndex398)<->v2264(VarNext,bitIndex5))& (v2153(VarNext,bitIndex397)<->v2264(VarNext,bitIndex4))& (v2153(VarNext,bitIndex396)<->v2264(VarNext,bitIndex3))& (v2153(VarNext,bitIndex395)<->v2264(VarNext,bitIndex2))& (v2153(VarNext,bitIndex394)<->v2264(VarNext,bitIndex1))& (v2153(VarNext,bitIndex393)<->v2264(VarNext,bitIndex0))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2265(VarNext)-> (v2264(VarNext,bitIndex130)<->v2153(VarCurr,bitIndex523))& (v2264(VarNext,bitIndex129)<->v2153(VarCurr,bitIndex522))& (v2264(VarNext,bitIndex128)<->v2153(VarCurr,bitIndex521))& (v2264(VarNext,bitIndex127)<->v2153(VarCurr,bitIndex520))& (v2264(VarNext,bitIndex126)<->v2153(VarCurr,bitIndex519))& (v2264(VarNext,bitIndex125)<->v2153(VarCurr,bitIndex518))& (v2264(VarNext,bitIndex124)<->v2153(VarCurr,bitIndex517))& (v2264(VarNext,bitIndex123)<->v2153(VarCurr,bitIndex516))& (v2264(VarNext,bitIndex122)<->v2153(VarCurr,bitIndex515))& (v2264(VarNext,bitIndex121)<->v2153(VarCurr,bitIndex514))& (v2264(VarNext,bitIndex120)<->v2153(VarCurr,bitIndex513))& (v2264(VarNext,bitIndex119)<->v2153(VarCurr,bitIndex512))& (v2264(VarNext,bitIndex118)<->v2153(VarCurr,bitIndex511))& (v2264(VarNext,bitIndex117)<->v2153(VarCurr,bitIndex510))& (v2264(VarNext,bitIndex116)<->v2153(VarCurr,bitIndex509))& (v2264(VarNext,bitIndex115)<->v2153(VarCurr,bitIndex508))& (v2264(VarNext,bitIndex114)<->v2153(VarCurr,bitIndex507))& (v2264(VarNext,bitIndex113)<->v2153(VarCurr,bitIndex506))& (v2264(VarNext,bitIndex112)<->v2153(VarCurr,bitIndex505))& (v2264(VarNext,bitIndex111)<->v2153(VarCurr,bitIndex504))& (v2264(VarNext,bitIndex110)<->v2153(VarCurr,bitIndex503))& (v2264(VarNext,bitIndex109)<->v2153(VarCurr,bitIndex502))& (v2264(VarNext,bitIndex108)<->v2153(VarCurr,bitIndex501))& (v2264(VarNext,bitIndex107)<->v2153(VarCurr,bitIndex500))& (v2264(VarNext,bitIndex106)<->v2153(VarCurr,bitIndex499))& (v2264(VarNext,bitIndex105)<->v2153(VarCurr,bitIndex498))& (v2264(VarNext,bitIndex104)<->v2153(VarCurr,bitIndex497))& (v2264(VarNext,bitIndex103)<->v2153(VarCurr,bitIndex496))& (v2264(VarNext,bitIndex102)<->v2153(VarCurr,bitIndex495))& (v2264(VarNext,bitIndex101)<->v2153(VarCurr,bitIndex494))& (v2264(VarNext,bitIndex100)<->v2153(VarCurr,bitIndex493))& (v2264(VarNext,bitIndex99)<->v2153(VarCurr,bitIndex492))& 
(v2264(VarNext,bitIndex98)<->v2153(VarCurr,bitIndex491))& (v2264(VarNext,bitIndex97)<->v2153(VarCurr,bitIndex490))& (v2264(VarNext,bitIndex96)<->v2153(VarCurr,bitIndex489))& (v2264(VarNext,bitIndex95)<->v2153(VarCurr,bitIndex488))& (v2264(VarNext,bitIndex94)<->v2153(VarCurr,bitIndex487))& (v2264(VarNext,bitIndex93)<->v2153(VarCurr,bitIndex486))& (v2264(VarNext,bitIndex92)<->v2153(VarCurr,bitIndex485))& (v2264(VarNext,bitIndex91)<->v2153(VarCurr,bitIndex484))& (v2264(VarNext,bitIndex90)<->v2153(VarCurr,bitIndex483))& (v2264(VarNext,bitIndex89)<->v2153(VarCurr,bitIndex482))& (v2264(VarNext,bitIndex88)<->v2153(VarCurr,bitIndex481))& (v2264(VarNext,bitIndex87)<->v2153(VarCurr,bitIndex480))& (v2264(VarNext,bitIndex86)<->v2153(VarCurr,bitIndex479))& (v2264(VarNext,bitIndex85)<->v2153(VarCurr,bitIndex478))& (v2264(VarNext,bitIndex84)<->v2153(VarCurr,bitIndex477))& (v2264(VarNext,bitIndex83)<->v2153(VarCurr,bitIndex476))& (v2264(VarNext,bitIndex82)<->v2153(VarCurr,bitIndex475))& (v2264(VarNext,bitIndex81)<->v2153(VarCurr,bitIndex474))& (v2264(VarNext,bitIndex80)<->v2153(VarCurr,bitIndex473))& (v2264(VarNext,bitIndex79)<->v2153(VarCurr,bitIndex472))& (v2264(VarNext,bitIndex78)<->v2153(VarCurr,bitIndex471))& (v2264(VarNext,bitIndex77)<->v2153(VarCurr,bitIndex470))& (v2264(VarNext,bitIndex76)<->v2153(VarCurr,bitIndex469))& (v2264(VarNext,bitIndex75)<->v2153(VarCurr,bitIndex468))& (v2264(VarNext,bitIndex74)<->v2153(VarCurr,bitIndex467))& (v2264(VarNext,bitIndex73)<->v2153(VarCurr,bitIndex466))& (v2264(VarNext,bitIndex72)<->v2153(VarCurr,bitIndex465))& (v2264(VarNext,bitIndex71)<->v2153(VarCurr,bitIndex464))& (v2264(VarNext,bitIndex70)<->v2153(VarCurr,bitIndex463))& (v2264(VarNext,bitIndex69)<->v2153(VarCurr,bitIndex462))& (v2264(VarNext,bitIndex68)<->v2153(VarCurr,bitIndex461))& (v2264(VarNext,bitIndex67)<->v2153(VarCurr,bitIndex460))& (v2264(VarNext,bitIndex66)<->v2153(VarCurr,bitIndex459))& (v2264(VarNext,bitIndex65)<->v2153(VarCurr,bitIndex458))& 
(v2264(VarNext,bitIndex64)<->v2153(VarCurr,bitIndex457))& (v2264(VarNext,bitIndex63)<->v2153(VarCurr,bitIndex456))& (v2264(VarNext,bitIndex62)<->v2153(VarCurr,bitIndex455))& (v2264(VarNext,bitIndex61)<->v2153(VarCurr,bitIndex454))& (v2264(VarNext,bitIndex60)<->v2153(VarCurr,bitIndex453))& (v2264(VarNext,bitIndex59)<->v2153(VarCurr,bitIndex452))& (v2264(VarNext,bitIndex58)<->v2153(VarCurr,bitIndex451))& (v2264(VarNext,bitIndex57)<->v2153(VarCurr,bitIndex450))& (v2264(VarNext,bitIndex56)<->v2153(VarCurr,bitIndex449))& (v2264(VarNext,bitIndex55)<->v2153(VarCurr,bitIndex448))& (v2264(VarNext,bitIndex54)<->v2153(VarCurr,bitIndex447))& (v2264(VarNext,bitIndex53)<->v2153(VarCurr,bitIndex446))& (v2264(VarNext,bitIndex52)<->v2153(VarCurr,bitIndex445))& (v2264(VarNext,bitIndex51)<->v2153(VarCurr,bitIndex444))& (v2264(VarNext,bitIndex50)<->v2153(VarCurr,bitIndex443))& (v2264(VarNext,bitIndex49)<->v2153(VarCurr,bitIndex442))& (v2264(VarNext,bitIndex48)<->v2153(VarCurr,bitIndex441))& (v2264(VarNext,bitIndex47)<->v2153(VarCurr,bitIndex440))& (v2264(VarNext,bitIndex46)<->v2153(VarCurr,bitIndex439))& (v2264(VarNext,bitIndex45)<->v2153(VarCurr,bitIndex438))& (v2264(VarNext,bitIndex44)<->v2153(VarCurr,bitIndex437))& (v2264(VarNext,bitIndex43)<->v2153(VarCurr,bitIndex436))& (v2264(VarNext,bitIndex42)<->v2153(VarCurr,bitIndex435))& (v2264(VarNext,bitIndex41)<->v2153(VarCurr,bitIndex434))& (v2264(VarNext,bitIndex40)<->v2153(VarCurr,bitIndex433))& (v2264(VarNext,bitIndex39)<->v2153(VarCurr,bitIndex432))& (v2264(VarNext,bitIndex38)<->v2153(VarCurr,bitIndex431))& (v2264(VarNext,bitIndex37)<->v2153(VarCurr,bitIndex430))& (v2264(VarNext,bitIndex36)<->v2153(VarCurr,bitIndex429))& (v2264(VarNext,bitIndex35)<->v2153(VarCurr,bitIndex428))& (v2264(VarNext,bitIndex34)<->v2153(VarCurr,bitIndex427))& (v2264(VarNext,bitIndex33)<->v2153(VarCurr,bitIndex426))& (v2264(VarNext,bitIndex32)<->v2153(VarCurr,bitIndex425))& (v2264(VarNext,bitIndex31)<->v2153(VarCurr,bitIndex424))& 
(v2264(VarNext,bitIndex30)<->v2153(VarCurr,bitIndex423))& (v2264(VarNext,bitIndex29)<->v2153(VarCurr,bitIndex422))& (v2264(VarNext,bitIndex28)<->v2153(VarCurr,bitIndex421))& (v2264(VarNext,bitIndex27)<->v2153(VarCurr,bitIndex420))& (v2264(VarNext,bitIndex26)<->v2153(VarCurr,bitIndex419))& (v2264(VarNext,bitIndex25)<->v2153(VarCurr,bitIndex418))& (v2264(VarNext,bitIndex24)<->v2153(VarCurr,bitIndex417))& (v2264(VarNext,bitIndex23)<->v2153(VarCurr,bitIndex416))& (v2264(VarNext,bitIndex22)<->v2153(VarCurr,bitIndex415))& (v2264(VarNext,bitIndex21)<->v2153(VarCurr,bitIndex414))& (v2264(VarNext,bitIndex20)<->v2153(VarCurr,bitIndex413))& (v2264(VarNext,bitIndex19)<->v2153(VarCurr,bitIndex412))& (v2264(VarNext,bitIndex18)<->v2153(VarCurr,bitIndex411))& (v2264(VarNext,bitIndex17)<->v2153(VarCurr,bitIndex410))& (v2264(VarNext,bitIndex16)<->v2153(VarCurr,bitIndex409))& (v2264(VarNext,bitIndex15)<->v2153(VarCurr,bitIndex408))& (v2264(VarNext,bitIndex14)<->v2153(VarCurr,bitIndex407))& (v2264(VarNext,bitIndex13)<->v2153(VarCurr,bitIndex406))& (v2264(VarNext,bitIndex12)<->v2153(VarCurr,bitIndex405))& (v2264(VarNext,bitIndex11)<->v2153(VarCurr,bitIndex404))& (v2264(VarNext,bitIndex10)<->v2153(VarCurr,bitIndex403))& (v2264(VarNext,bitIndex9)<->v2153(VarCurr,bitIndex402))& (v2264(VarNext,bitIndex8)<->v2153(VarCurr,bitIndex401))& (v2264(VarNext,bitIndex7)<->v2153(VarCurr,bitIndex400))& (v2264(VarNext,bitIndex6)<->v2153(VarCurr,bitIndex399))& (v2264(VarNext,bitIndex5)<->v2153(VarCurr,bitIndex398))& (v2264(VarNext,bitIndex4)<->v2153(VarCurr,bitIndex397))& (v2264(VarNext,bitIndex3)<->v2153(VarCurr,bitIndex396))& (v2264(VarNext,bitIndex2)<->v2153(VarCurr,bitIndex395))& (v2264(VarNext,bitIndex1)<->v2153(VarCurr,bitIndex394))& (v2264(VarNext,bitIndex0)<->v2153(VarCurr,bitIndex393)))).
% 121.63/120.65  all VarNext (v2265(VarNext)-> (all B (range_130_0(B)-> (v2264(VarNext,B)<->v2292(VarNext,B))))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v2292(VarNext,B)<->v2290(VarCurr,B))))).
% 121.63/120.65  all VarCurr (-v2275(VarCurr)-> (all B (range_130_0(B)-> (v2290(VarCurr,B)<->v2293(VarCurr,B))))).
% 121.63/120.65  all VarCurr (v2275(VarCurr)-> (all B (range_130_0(B)-> (v2290(VarCurr,B)<->$F)))).
% 121.63/120.65  all VarCurr (-v2279(VarCurr)& -v2281(VarCurr)-> (all B (range_130_0(B)-> (v2293(VarCurr,B)<->v2257(VarCurr,B))))).
% 121.63/120.65  all VarCurr (v2281(VarCurr)-> (all B (range_130_0(B)-> (v2293(VarCurr,B)<->v2163(VarCurr,B))))).
% 121.63/120.65  all VarCurr (v2279(VarCurr)-> (v2293(VarCurr,bitIndex130)<->v2153(VarCurr,bitIndex392))& (v2293(VarCurr,bitIndex129)<->v2153(VarCurr,bitIndex391))& (v2293(VarCurr,bitIndex128)<->v2153(VarCurr,bitIndex390))& (v2293(VarCurr,bitIndex127)<->v2153(VarCurr,bitIndex389))& (v2293(VarCurr,bitIndex126)<->v2153(VarCurr,bitIndex388))& (v2293(VarCurr,bitIndex125)<->v2153(VarCurr,bitIndex387))& (v2293(VarCurr,bitIndex124)<->v2153(VarCurr,bitIndex386))& (v2293(VarCurr,bitIndex123)<->v2153(VarCurr,bitIndex385))& (v2293(VarCurr,bitIndex122)<->v2153(VarCurr,bitIndex384))& (v2293(VarCurr,bitIndex121)<->v2153(VarCurr,bitIndex383))& (v2293(VarCurr,bitIndex120)<->v2153(VarCurr,bitIndex382))& (v2293(VarCurr,bitIndex119)<->v2153(VarCurr,bitIndex381))& (v2293(VarCurr,bitIndex118)<->v2153(VarCurr,bitIndex380))& (v2293(VarCurr,bitIndex117)<->v2153(VarCurr,bitIndex379))& (v2293(VarCurr,bitIndex116)<->v2153(VarCurr,bitIndex378))& (v2293(VarCurr,bitIndex115)<->v2153(VarCurr,bitIndex377))& (v2293(VarCurr,bitIndex114)<->v2153(VarCurr,bitIndex376))& (v2293(VarCurr,bitIndex113)<->v2153(VarCurr,bitIndex375))& (v2293(VarCurr,bitIndex112)<->v2153(VarCurr,bitIndex374))& (v2293(VarCurr,bitIndex111)<->v2153(VarCurr,bitIndex373))& (v2293(VarCurr,bitIndex110)<->v2153(VarCurr,bitIndex372))& (v2293(VarCurr,bitIndex109)<->v2153(VarCurr,bitIndex371))& (v2293(VarCurr,bitIndex108)<->v2153(VarCurr,bitIndex370))& (v2293(VarCurr,bitIndex107)<->v2153(VarCurr,bitIndex369))& (v2293(VarCurr,bitIndex106)<->v2153(VarCurr,bitIndex368))& (v2293(VarCurr,bitIndex105)<->v2153(VarCurr,bitIndex367))& (v2293(VarCurr,bitIndex104)<->v2153(VarCurr,bitIndex366))& (v2293(VarCurr,bitIndex103)<->v2153(VarCurr,bitIndex365))& (v2293(VarCurr,bitIndex102)<->v2153(VarCurr,bitIndex364))& (v2293(VarCurr,bitIndex101)<->v2153(VarCurr,bitIndex363))& (v2293(VarCurr,bitIndex100)<->v2153(VarCurr,bitIndex362))& (v2293(VarCurr,bitIndex99)<->v2153(VarCurr,bitIndex361))& (v2293(VarCurr,bitIndex98)<->v2153(VarCurr,bitIndex360))& 
(v2293(VarCurr,bitIndex97)<->v2153(VarCurr,bitIndex359))& (v2293(VarCurr,bitIndex96)<->v2153(VarCurr,bitIndex358))& (v2293(VarCurr,bitIndex95)<->v2153(VarCurr,bitIndex357))& (v2293(VarCurr,bitIndex94)<->v2153(VarCurr,bitIndex356))& (v2293(VarCurr,bitIndex93)<->v2153(VarCurr,bitIndex355))& (v2293(VarCurr,bitIndex92)<->v2153(VarCurr,bitIndex354))& (v2293(VarCurr,bitIndex91)<->v2153(VarCurr,bitIndex353))& (v2293(VarCurr,bitIndex90)<->v2153(VarCurr,bitIndex352))& (v2293(VarCurr,bitIndex89)<->v2153(VarCurr,bitIndex351))& (v2293(VarCurr,bitIndex88)<->v2153(VarCurr,bitIndex350))& (v2293(VarCurr,bitIndex87)<->v2153(VarCurr,bitIndex349))& (v2293(VarCurr,bitIndex86)<->v2153(VarCurr,bitIndex348))& (v2293(VarCurr,bitIndex85)<->v2153(VarCurr,bitIndex347))& (v2293(VarCurr,bitIndex84)<->v2153(VarCurr,bitIndex346))& (v2293(VarCurr,bitIndex83)<->v2153(VarCurr,bitIndex345))& (v2293(VarCurr,bitIndex82)<->v2153(VarCurr,bitIndex344))& (v2293(VarCurr,bitIndex81)<->v2153(VarCurr,bitIndex343))& (v2293(VarCurr,bitIndex80)<->v2153(VarCurr,bitIndex342))& (v2293(VarCurr,bitIndex79)<->v2153(VarCurr,bitIndex341))& (v2293(VarCurr,bitIndex78)<->v2153(VarCurr,bitIndex340))& (v2293(VarCurr,bitIndex77)<->v2153(VarCurr,bitIndex339))& (v2293(VarCurr,bitIndex76)<->v2153(VarCurr,bitIndex338))& (v2293(VarCurr,bitIndex75)<->v2153(VarCurr,bitIndex337))& (v2293(VarCurr,bitIndex74)<->v2153(VarCurr,bitIndex336))& (v2293(VarCurr,bitIndex73)<->v2153(VarCurr,bitIndex335))& (v2293(VarCurr,bitIndex72)<->v2153(VarCurr,bitIndex334))& (v2293(VarCurr,bitIndex71)<->v2153(VarCurr,bitIndex333))& (v2293(VarCurr,bitIndex70)<->v2153(VarCurr,bitIndex332))& (v2293(VarCurr,bitIndex69)<->v2153(VarCurr,bitIndex331))& (v2293(VarCurr,bitIndex68)<->v2153(VarCurr,bitIndex330))& (v2293(VarCurr,bitIndex67)<->v2153(VarCurr,bitIndex329))& (v2293(VarCurr,bitIndex66)<->v2153(VarCurr,bitIndex328))& (v2293(VarCurr,bitIndex65)<->v2153(VarCurr,bitIndex327))& (v2293(VarCurr,bitIndex64)<->v2153(VarCurr,bitIndex326))& 
(v2293(VarCurr,bitIndex63)<->v2153(VarCurr,bitIndex325))& (v2293(VarCurr,bitIndex62)<->v2153(VarCurr,bitIndex324))& (v2293(VarCurr,bitIndex61)<->v2153(VarCurr,bitIndex323))& (v2293(VarCurr,bitIndex60)<->v2153(VarCurr,bitIndex322))& (v2293(VarCurr,bitIndex59)<->v2153(VarCurr,bitIndex321))& (v2293(VarCurr,bitIndex58)<->v2153(VarCurr,bitIndex320))& (v2293(VarCurr,bitIndex57)<->v2153(VarCurr,bitIndex319))& (v2293(VarCurr,bitIndex56)<->v2153(VarCurr,bitIndex318))& (v2293(VarCurr,bitIndex55)<->v2153(VarCurr,bitIndex317))& (v2293(VarCurr,bitIndex54)<->v2153(VarCurr,bitIndex316))& (v2293(VarCurr,bitIndex53)<->v2153(VarCurr,bitIndex315))& (v2293(VarCurr,bitIndex52)<->v2153(VarCurr,bitIndex314))& (v2293(VarCurr,bitIndex51)<->v2153(VarCurr,bitIndex313))& (v2293(VarCurr,bitIndex50)<->v2153(VarCurr,bitIndex312))& (v2293(VarCurr,bitIndex49)<->v2153(VarCurr,bitIndex311))& (v2293(VarCurr,bitIndex48)<->v2153(VarCurr,bitIndex310))& (v2293(VarCurr,bitIndex47)<->v2153(VarCurr,bitIndex309))& (v2293(VarCurr,bitIndex46)<->v2153(VarCurr,bitIndex308))& (v2293(VarCurr,bitIndex45)<->v2153(VarCurr,bitIndex307))& (v2293(VarCurr,bitIndex44)<->v2153(VarCurr,bitIndex306))& (v2293(VarCurr,bitIndex43)<->v2153(VarCurr,bitIndex305))& (v2293(VarCurr,bitIndex42)<->v2153(VarCurr,bitIndex304))& (v2293(VarCurr,bitIndex41)<->v2153(VarCurr,bitIndex303))& (v2293(VarCurr,bitIndex40)<->v2153(VarCurr,bitIndex302))& (v2293(VarCurr,bitIndex39)<->v2153(VarCurr,bitIndex301))& (v2293(VarCurr,bitIndex38)<->v2153(VarCurr,bitIndex300))& (v2293(VarCurr,bitIndex37)<->v2153(VarCurr,bitIndex299))& (v2293(VarCurr,bitIndex36)<->v2153(VarCurr,bitIndex298))& (v2293(VarCurr,bitIndex35)<->v2153(VarCurr,bitIndex297))& (v2293(VarCurr,bitIndex34)<->v2153(VarCurr,bitIndex296))& (v2293(VarCurr,bitIndex33)<->v2153(VarCurr,bitIndex295))& (v2293(VarCurr,bitIndex32)<->v2153(VarCurr,bitIndex294))& (v2293(VarCurr,bitIndex31)<->v2153(VarCurr,bitIndex293))& (v2293(VarCurr,bitIndex30)<->v2153(VarCurr,bitIndex292))& 
(v2293(VarCurr,bitIndex29)<->v2153(VarCurr,bitIndex291))& (v2293(VarCurr,bitIndex28)<->v2153(VarCurr,bitIndex290))& (v2293(VarCurr,bitIndex27)<->v2153(VarCurr,bitIndex289))& (v2293(VarCurr,bitIndex26)<->v2153(VarCurr,bitIndex288))& (v2293(VarCurr,bitIndex25)<->v2153(VarCurr,bitIndex287))& (v2293(VarCurr,bitIndex24)<->v2153(VarCurr,bitIndex286))& (v2293(VarCurr,bitIndex23)<->v2153(VarCurr,bitIndex285))& (v2293(VarCurr,bitIndex22)<->v2153(VarCurr,bitIndex284))& (v2293(VarCurr,bitIndex21)<->v2153(VarCurr,bitIndex283))& (v2293(VarCurr,bitIndex20)<->v2153(VarCurr,bitIndex282))& (v2293(VarCurr,bitIndex19)<->v2153(VarCurr,bitIndex281))& (v2293(VarCurr,bitIndex18)<->v2153(VarCurr,bitIndex280))& (v2293(VarCurr,bitIndex17)<->v2153(VarCurr,bitIndex279))& (v2293(VarCurr,bitIndex16)<->v2153(VarCurr,bitIndex278))& (v2293(VarCurr,bitIndex15)<->v2153(VarCurr,bitIndex277))& (v2293(VarCurr,bitIndex14)<->v2153(VarCurr,bitIndex276))& (v2293(VarCurr,bitIndex13)<->v2153(VarCurr,bitIndex275))& (v2293(VarCurr,bitIndex12)<->v2153(VarCurr,bitIndex274))& (v2293(VarCurr,bitIndex11)<->v2153(VarCurr,bitIndex273))& (v2293(VarCurr,bitIndex10)<->v2153(VarCurr,bitIndex272))& (v2293(VarCurr,bitIndex9)<->v2153(VarCurr,bitIndex271))& (v2293(VarCurr,bitIndex8)<->v2153(VarCurr,bitIndex270))& (v2293(VarCurr,bitIndex7)<->v2153(VarCurr,bitIndex269))& (v2293(VarCurr,bitIndex6)<->v2153(VarCurr,bitIndex268))& (v2293(VarCurr,bitIndex5)<->v2153(VarCurr,bitIndex267))& (v2293(VarCurr,bitIndex4)<->v2153(VarCurr,bitIndex266))& (v2293(VarCurr,bitIndex3)<->v2153(VarCurr,bitIndex265))& (v2293(VarCurr,bitIndex2)<->v2153(VarCurr,bitIndex264))& (v2293(VarCurr,bitIndex1)<->v2153(VarCurr,bitIndex263))& (v2293(VarCurr,bitIndex0)<->v2153(VarCurr,bitIndex262))).
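(Editorial note, not part of the prover output: stripped of clause syntax, the four-line block above equates bits 0..130 of v2293 with bits 262..392 of v2153 — a contiguous slice at a fixed offset of 262. A minimal Python sketch of that correspondence, treating the untyped bit predicates as plain boolean lists, which is an interpretive assumption:)

```python
# Hypothetical model: bit vectors as Python lists of bools.
# The block above asserts, for k = 0 .. 130:
#   v2293(VarCurr, bitIndex k) <-> v2153(VarCurr, bitIndex k + 262)
# i.e. v2293 is a contiguous 131-bit slice of v2153.

def select_v2293(v2153_bits):
    """Return bits 262..392 of v2153 as the 131-bit value v2293."""
    assert len(v2153_bits) >= 393
    return v2153_bits[262:393]  # slice of length 131

# Example: set bit 262 of v2153; it surfaces as bit 0 of v2293.
v2153_bits = [False] * 524
v2153_bits[262] = True
v2293_bits = select_v2293(v2153_bits)
assert v2293_bits[0] is True
assert len(v2293_bits) == 131
```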
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2265(VarNext)<->v2266(VarNext)&v2273(VarNext))).
% 121.63/120.65  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2273(VarNext)<->v2271(VarCurr))).
% 121.63/120.65  all VarCurr (v2271(VarCurr)<->v2274(VarCurr)&v2286(VarCurr)).
% 121.63/120.65  all VarCurr (v2286(VarCurr)<->v2287(VarCurr)|v2275(VarCurr)).
% 121.63/120.65  all VarCurr (-v2287(VarCurr)<->v2288(VarCurr)).
% 121.63/120.65  all VarCurr (v2288(VarCurr)<-> (v2289(VarCurr,bitIndex1)<->$F)& (v2289(VarCurr,bitIndex0)<->$F)).
% 121.63/120.65  all VarCurr (v2289(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.65  all VarCurr (v2289(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.65  all VarCurr (v2274(VarCurr)<->v2275(VarCurr)|v2276(VarCurr)).
% 121.63/120.65  all VarCurr (v2276(VarCurr)<->v2277(VarCurr)&v2285(VarCurr)).
% 121.63/120.65  all VarCurr (-v2285(VarCurr)<->v2275(VarCurr)).
% 121.63/120.65  all VarCurr (v2277(VarCurr)<->v2278(VarCurr)|v2283(VarCurr)).
% 121.63/120.65  all VarCurr (v2283(VarCurr)<-> (v2284(VarCurr,bitIndex1)<->$T)& (v2284(VarCurr,bitIndex0)<->$T)).
% 121.63/120.65  all VarCurr (v2284(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.65  all VarCurr (v2284(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.65  all VarCurr (v2278(VarCurr)<->v2279(VarCurr)|v2281(VarCurr)).
% 121.63/120.65  all VarCurr (v2281(VarCurr)<-> (v2282(VarCurr,bitIndex1)<->$T)& (v2282(VarCurr,bitIndex0)<->$F)).
% 121.63/120.65  all VarCurr (v2282(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.65  all VarCurr (v2282(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.65  all VarCurr (v2279(VarCurr)<-> (v2280(VarCurr,bitIndex1)<->$F)& (v2280(VarCurr,bitIndex0)<->$T)).
% 121.63/120.65  all VarCurr (v2280(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.65  all VarCurr (v2280(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.66  all VarCurr (-v2275(VarCurr)<->v27(VarCurr)).
% 121.63/120.66  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2266(VarNext)<->v2267(VarNext)&v2173(VarNext))).
% 121.63/120.66  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2267(VarNext)<->v2182(VarNext))).
% 121.63/120.66  all VarCurr B (range_7_0(B)-> (v2257(VarCurr,B)<->v2262(VarCurr,B))).
% 121.63/120.66  all VarCurr (-v2259(VarCurr)-> (all B (range_130_0(B)-> (v2262(VarCurr,B)<->v2261(VarCurr,B))))).
% 121.63/120.66  all VarCurr (v2259(VarCurr)-> (all B (range_130_0(B)-> (v2262(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.63/120.66  all VarCurr ((v2261(VarCurr,bitIndex7)<->v2153(VarCurr,bitIndex269))& (v2261(VarCurr,bitIndex6)<->v2153(VarCurr,bitIndex268))& (v2261(VarCurr,bitIndex5)<->v2153(VarCurr,bitIndex267))& (v2261(VarCurr,bitIndex4)<->v2153(VarCurr,bitIndex266))& (v2261(VarCurr,bitIndex3)<->v2153(VarCurr,bitIndex265))& (v2261(VarCurr,bitIndex2)<->v2153(VarCurr,bitIndex264))& (v2261(VarCurr,bitIndex1)<->v2153(VarCurr,bitIndex263))& (v2261(VarCurr,bitIndex0)<->v2153(VarCurr,bitIndex262))).
% 121.63/120.66  all VarCurr (v2259(VarCurr)<->v2167(VarCurr,bitIndex1)).
% 121.63/120.66  all VarCurr B (range_7_0(B)-> (v2163(VarCurr,B)<->v2255(VarCurr,B))).
% 121.63/120.66  all VarCurr (-v2165(VarCurr)-> (all B (range_130_0(B)-> (v2255(VarCurr,B)<->v2246(VarCurr,B))))).
% 121.63/120.66  all VarCurr (v2165(VarCurr)-> (all B (range_130_0(B)-> (v2255(VarCurr,B)<->v2234(VarCurr,B))))).
% 121.63/120.66  all VarCurr ((v2246(VarCurr,bitIndex7)<->v2153(VarCurr,bitIndex400))& (v2246(VarCurr,bitIndex6)<->v2153(VarCurr,bitIndex399))& (v2246(VarCurr,bitIndex5)<->v2153(VarCurr,bitIndex398))& (v2246(VarCurr,bitIndex4)<->v2153(VarCurr,bitIndex397))& (v2246(VarCurr,bitIndex3)<->v2153(VarCurr,bitIndex396))& (v2246(VarCurr,bitIndex2)<->v2153(VarCurr,bitIndex395))& (v2246(VarCurr,bitIndex1)<->v2153(VarCurr,bitIndex394))& (v2246(VarCurr,bitIndex0)<->v2153(VarCurr,bitIndex393))).
% 121.63/120.66  -v2153(constB0,bitIndex523).
% 121.63/120.66  -v2153(constB0,bitIndex522).
% 121.63/120.66  -v2153(constB0,bitIndex521).
% 121.63/120.66  -v2153(constB0,bitIndex520).
% 121.63/120.66  -v2153(constB0,bitIndex519).
% 121.63/120.66  -v2153(constB0,bitIndex518).
% 121.63/120.66  -v2153(constB0,bitIndex517).
% 121.63/120.66  -v2153(constB0,bitIndex400).
% 121.63/120.66  -v2153(constB0,bitIndex399).
% 121.63/120.66  -v2153(constB0,bitIndex398).
% 121.63/120.66  -v2153(constB0,bitIndex397).
% 121.63/120.66  -v2153(constB0,bitIndex396).
% 121.63/120.66  -v2153(constB0,bitIndex395).
% 121.63/120.66  -v2153(constB0,bitIndex394).
% 121.63/120.66  -v2153(constB0,bitIndex393).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex130).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex129).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex128).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex127).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex126).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex125).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex124).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.63/120.66  -v2153(constB0,bitIndex392).
% 121.63/120.66  -v2153(constB0,bitIndex391).
% 121.63/120.66  -v2153(constB0,bitIndex390).
% 121.63/120.66  -v2153(constB0,bitIndex389).
% 121.63/120.66  -v2153(constB0,bitIndex388).
% 121.63/120.66  -v2153(constB0,bitIndex387).
% 121.63/120.66  -v2153(constB0,bitIndex386).
% 121.63/120.66  -v2153(constB0,bitIndex269).
% 121.63/120.66  -v2153(constB0,bitIndex268).
% 121.63/120.66  -v2153(constB0,bitIndex267).
% 121.63/120.66  -v2153(constB0,bitIndex266).
% 121.63/120.66  -v2153(constB0,bitIndex265).
% 121.63/120.66  -v2153(constB0,bitIndex264).
% 121.63/120.66  -v2153(constB0,bitIndex263).
% 121.63/120.66  -v2153(constB0,bitIndex262).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex130).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex129).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex128).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex127).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex126).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex125).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex124).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.63/120.66  -v2153(constB0,bitIndex261).
% 121.63/120.66  -v2153(constB0,bitIndex260).
% 121.63/120.66  -v2153(constB0,bitIndex259).
% 121.63/120.66  -v2153(constB0,bitIndex258).
% 121.63/120.66  -v2153(constB0,bitIndex257).
% 121.63/120.66  -v2153(constB0,bitIndex256).
% 121.63/120.66  -v2153(constB0,bitIndex255).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex130).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex129).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex128).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex127).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex126).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex125).
% 121.63/120.66  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex124).
% 121.63/120.67  -v2153(constB0,bitIndex130).
% 121.63/120.67  -v2153(constB0,bitIndex129).
% 121.63/120.67  -v2153(constB0,bitIndex128).
% 121.63/120.67  -v2153(constB0,bitIndex127).
% 121.63/120.67  -v2153(constB0,bitIndex126).
% 121.63/120.67  -v2153(constB0,bitIndex125).
% 121.63/120.67  -v2153(constB0,bitIndex124).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex130).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex129).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex128).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex127).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex126).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex125).
% 121.63/120.67  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex124).
% 121.63/120.67  all VarCurr B (range_7_0(B)-> (v2234(VarCurr,B)<->v2236(VarCurr,B))).
% 121.63/120.67  all VarCurr B (range_7_0(B)-> (v2236(VarCurr,B)<->v2238(VarCurr,B))).
% 121.63/120.67  all VarCurr B (range_7_0(B)-> (v2238(VarCurr,B)<->v2240(VarCurr,B))).
% 121.63/120.67  all VarCurr B (range_7_0(B)-> (v2240(VarCurr,B)<->v2243(VarCurr,B))).
% 121.63/120.67  all VarCurr B (range_7_0(B)-> (v2243(VarCurr,B)<->v2242(VarCurr,B))).
% 121.63/120.67  all VarCurr ((v2243(VarCurr,bitIndex130)<->v2244(VarCurr,bitIndex122))& (v2243(VarCurr,bitIndex129)<->v2244(VarCurr,bitIndex121))& (v2243(VarCurr,bitIndex128)<->v2244(VarCurr,bitIndex120))& (v2243(VarCurr,bitIndex127)<->v2244(VarCurr,bitIndex119))& (v2243(VarCurr,bitIndex126)<->v2244(VarCurr,bitIndex118))& (v2243(VarCurr,bitIndex125)<->v2244(VarCurr,bitIndex117))& (v2243(VarCurr,bitIndex124)<->v2244(VarCurr,bitIndex116))& (v2243(VarCurr,bitIndex123)<->v2244(VarCurr,bitIndex115))& (v2243(VarCurr,bitIndex122)<->v2244(VarCurr,bitIndex114))& (v2243(VarCurr,bitIndex121)<->v2244(VarCurr,bitIndex113))& (v2243(VarCurr,bitIndex120)<->v2244(VarCurr,bitIndex112))& (v2243(VarCurr,bitIndex119)<->v2244(VarCurr,bitIndex111))& (v2243(VarCurr,bitIndex118)<->v2244(VarCurr,bitIndex110))& (v2243(VarCurr,bitIndex117)<->v2244(VarCurr,bitIndex109))& (v2243(VarCurr,bitIndex116)<->v2244(VarCurr,bitIndex108))& (v2243(VarCurr,bitIndex115)<->v2244(VarCurr,bitIndex107))& (v2243(VarCurr,bitIndex114)<->v2244(VarCurr,bitIndex106))& (v2243(VarCurr,bitIndex113)<->v2244(VarCurr,bitIndex105))& (v2243(VarCurr,bitIndex112)<->v2244(VarCurr,bitIndex104))& (v2243(VarCurr,bitIndex111)<->v2244(VarCurr,bitIndex103))& (v2243(VarCurr,bitIndex110)<->v2244(VarCurr,bitIndex102))& (v2243(VarCurr,bitIndex109)<->v2244(VarCurr,bitIndex101))& (v2243(VarCurr,bitIndex108)<->v2244(VarCurr,bitIndex100))& (v2243(VarCurr,bitIndex107)<->v2244(VarCurr,bitIndex99))& (v2243(VarCurr,bitIndex106)<->v2244(VarCurr,bitIndex98))& (v2243(VarCurr,bitIndex105)<->v2244(VarCurr,bitIndex97))& (v2243(VarCurr,bitIndex104)<->v2244(VarCurr,bitIndex96))& (v2243(VarCurr,bitIndex103)<->v2244(VarCurr,bitIndex95))& (v2243(VarCurr,bitIndex102)<->v2244(VarCurr,bitIndex94))& (v2243(VarCurr,bitIndex101)<->v2244(VarCurr,bitIndex93))& (v2243(VarCurr,bitIndex100)<->v2244(VarCurr,bitIndex92))& (v2243(VarCurr,bitIndex99)<->v2244(VarCurr,bitIndex91))& (v2243(VarCurr,bitIndex98)<->v2244(VarCurr,bitIndex90))& 
(v2243(VarCurr,bitIndex97)<->v2244(VarCurr,bitIndex89))& (v2243(VarCurr,bitIndex96)<->v2244(VarCurr,bitIndex88))& (v2243(VarCurr,bitIndex95)<->v2244(VarCurr,bitIndex87))& (v2243(VarCurr,bitIndex94)<->v2244(VarCurr,bitIndex86))& (v2243(VarCurr,bitIndex93)<->v2244(VarCurr,bitIndex85))& (v2243(VarCurr,bitIndex92)<->v2244(VarCurr,bitIndex84))& (v2243(VarCurr,bitIndex91)<->v2244(VarCurr,bitIndex83))& (v2243(VarCurr,bitIndex90)<->v2244(VarCurr,bitIndex82))& (v2243(VarCurr,bitIndex89)<->v2244(VarCurr,bitIndex81))& (v2243(VarCurr,bitIndex88)<->v2244(VarCurr,bitIndex80))& (v2243(VarCurr,bitIndex87)<->v2244(VarCurr,bitIndex79))& (v2243(VarCurr,bitIndex86)<->v2244(VarCurr,bitIndex78))& (v2243(VarCurr,bitIndex85)<->v2244(VarCurr,bitIndex77))& (v2243(VarCurr,bitIndex84)<->v2244(VarCurr,bitIndex76))& (v2243(VarCurr,bitIndex83)<->v2244(VarCurr,bitIndex75))& (v2243(VarCurr,bitIndex82)<->v2244(VarCurr,bitIndex74))& (v2243(VarCurr,bitIndex81)<->v2244(VarCurr,bitIndex73))& (v2243(VarCurr,bitIndex80)<->v2244(VarCurr,bitIndex72))& (v2243(VarCurr,bitIndex79)<->v2244(VarCurr,bitIndex71))& (v2243(VarCurr,bitIndex78)<->v2244(VarCurr,bitIndex70))& (v2243(VarCurr,bitIndex77)<->v2244(VarCurr,bitIndex69))& (v2243(VarCurr,bitIndex76)<->v2244(VarCurr,bitIndex68))& (v2243(VarCurr,bitIndex75)<->v2244(VarCurr,bitIndex67))& (v2243(VarCurr,bitIndex74)<->v2244(VarCurr,bitIndex66))& (v2243(VarCurr,bitIndex73)<->v2244(VarCurr,bitIndex65))& (v2243(VarCurr,bitIndex72)<->v2244(VarCurr,bitIndex64))& (v2243(VarCurr,bitIndex71)<->v2244(VarCurr,bitIndex63))& (v2243(VarCurr,bitIndex70)<->v2244(VarCurr,bitIndex62))& (v2243(VarCurr,bitIndex69)<->v2244(VarCurr,bitIndex61))& (v2243(VarCurr,bitIndex68)<->v2244(VarCurr,bitIndex60))& (v2243(VarCurr,bitIndex67)<->v2244(VarCurr,bitIndex59))& (v2243(VarCurr,bitIndex66)<->v2244(VarCurr,bitIndex58))& (v2243(VarCurr,bitIndex65)<->v2244(VarCurr,bitIndex57))& (v2243(VarCurr,bitIndex64)<->v2244(VarCurr,bitIndex56))& (v2243(VarCurr,bitIndex63)<->v2244(VarCurr,bitIndex55))& 
(v2243(VarCurr,bitIndex62)<->v2244(VarCurr,bitIndex54))& (v2243(VarCurr,bitIndex61)<->v2244(VarCurr,bitIndex53))& (v2243(VarCurr,bitIndex60)<->v2244(VarCurr,bitIndex52))& (v2243(VarCurr,bitIndex59)<->v2244(VarCurr,bitIndex51))& (v2243(VarCurr,bitIndex58)<->v2244(VarCurr,bitIndex50))& (v2243(VarCurr,bitIndex57)<->v2244(VarCurr,bitIndex49))& (v2243(VarCurr,bitIndex56)<->v2244(VarCurr,bitIndex48))& (v2243(VarCurr,bitIndex55)<->v2244(VarCurr,bitIndex47))& (v2243(VarCurr,bitIndex54)<->v2244(VarCurr,bitIndex46))& (v2243(VarCurr,bitIndex53)<->v2244(VarCurr,bitIndex45))& (v2243(VarCurr,bitIndex52)<->v2244(VarCurr,bitIndex44))& (v2243(VarCurr,bitIndex51)<->v2244(VarCurr,bitIndex43))& (v2243(VarCurr,bitIndex50)<->v2244(VarCurr,bitIndex42))& (v2243(VarCurr,bitIndex49)<->v2244(VarCurr,bitIndex41))& (v2243(VarCurr,bitIndex48)<->v2244(VarCurr,bitIndex40))& (v2243(VarCurr,bitIndex47)<->v2244(VarCurr,bitIndex39))& (v2243(VarCurr,bitIndex46)<->v2244(VarCurr,bitIndex38))& (v2243(VarCurr,bitIndex45)<->v2244(VarCurr,bitIndex37))& (v2243(VarCurr,bitIndex44)<->v2244(VarCurr,bitIndex36))& (v2243(VarCurr,bitIndex43)<->v2244(VarCurr,bitIndex35))& (v2243(VarCurr,bitIndex42)<->v2244(VarCurr,bitIndex34))& (v2243(VarCurr,bitIndex41)<->v2244(VarCurr,bitIndex33))& (v2243(VarCurr,bitIndex40)<->v2244(VarCurr,bitIndex32))& (v2243(VarCurr,bitIndex39)<->v2244(VarCurr,bitIndex31))& (v2243(VarCurr,bitIndex38)<->v2244(VarCurr,bitIndex30))& (v2243(VarCurr,bitIndex37)<->v2244(VarCurr,bitIndex29))& (v2243(VarCurr,bitIndex36)<->v2244(VarCurr,bitIndex28))& (v2243(VarCurr,bitIndex35)<->v2244(VarCurr,bitIndex27))& (v2243(VarCurr,bitIndex34)<->v2244(VarCurr,bitIndex26))& (v2243(VarCurr,bitIndex33)<->v2244(VarCurr,bitIndex25))& (v2243(VarCurr,bitIndex32)<->v2244(VarCurr,bitIndex24))& (v2243(VarCurr,bitIndex31)<->v2244(VarCurr,bitIndex23))& (v2243(VarCurr,bitIndex30)<->v2244(VarCurr,bitIndex22))& (v2243(VarCurr,bitIndex29)<->v2244(VarCurr,bitIndex21))& (v2243(VarCurr,bitIndex28)<->v2244(VarCurr,bitIndex20))& 
(v2243(VarCurr,bitIndex27)<->v2244(VarCurr,bitIndex19))& (v2243(VarCurr,bitIndex26)<->v2244(VarCurr,bitIndex18))& (v2243(VarCurr,bitIndex25)<->v2244(VarCurr,bitIndex17))& (v2243(VarCurr,bitIndex24)<->v2244(VarCurr,bitIndex16))& (v2243(VarCurr,bitIndex23)<->v2244(VarCurr,bitIndex15))& (v2243(VarCurr,bitIndex22)<->v2244(VarCurr,bitIndex14))& (v2243(VarCurr,bitIndex21)<->v2244(VarCurr,bitIndex13))& (v2243(VarCurr,bitIndex20)<->v2244(VarCurr,bitIndex12))& (v2243(VarCurr,bitIndex19)<->v2244(VarCurr,bitIndex11))& (v2243(VarCurr,bitIndex18)<->v2244(VarCurr,bitIndex10))& (v2243(VarCurr,bitIndex17)<->v2244(VarCurr,bitIndex9))& (v2243(VarCurr,bitIndex16)<->v2244(VarCurr,bitIndex8))& (v2243(VarCurr,bitIndex15)<->v2244(VarCurr,bitIndex7))& (v2243(VarCurr,bitIndex14)<->v2244(VarCurr,bitIndex6))& (v2243(VarCurr,bitIndex13)<->v2244(VarCurr,bitIndex5))& (v2243(VarCurr,bitIndex12)<->v2244(VarCurr,bitIndex4))& (v2243(VarCurr,bitIndex11)<->v2244(VarCurr,bitIndex3))& (v2243(VarCurr,bitIndex10)<->v2244(VarCurr,bitIndex2))& (v2243(VarCurr,bitIndex9)<->v2244(VarCurr,bitIndex1))& (v2243(VarCurr,bitIndex8)<->v2244(VarCurr,bitIndex0))).
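(Editorial note, not part of the prover output: the v2243/v2244 block above relates bit k of v2243 to bit k-8 of v2244 for k = 8 .. 130, so v2244 occupies the upper 123 bits of the 131-bit v2243, with the low 8 bits supplied elsewhere — consistent with the zero-initialisation of v2242 below. A sketch under that reading, with the list representation again an assumption:)

```python
# Hypothetical model of the v2243/v2244 correspondence:
#   v2243(VarCurr, bitIndex k) <-> v2244(VarCurr, bitIndex k - 8)
# for k = 8 .. 130, i.e. an 8-position upward placement of v2244.

def place_v2244(v2244_bits, low_bits):
    """Build the 131-bit v2243 from the 123-bit v2244 plus 8 low bits."""
    assert len(v2244_bits) == 123 and len(low_bits) == 8
    # index k of the result equals index k-8 of v2244, for k >= 8
    return low_bits + v2244_bits

v2244_bits = [True] + [False] * 122      # bit 0 of v2244 set
v2243_bits = place_v2244(v2244_bits, [False] * 8)
assert v2243_bits[8] is True             # bitIndex8 <-> v2244 bitIndex0
assert len(v2243_bits) == 131
```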
% 121.63/120.67  all B (range_7_0(B)-> (v2242(constB0,B)<->$F)).
% 121.63/120.67  all VarCurr (v2165(VarCurr)<->v2167(VarCurr,bitIndex1)).
% 121.63/120.67  all VarCurr (v2167(VarCurr,bitIndex1)<->v2193(VarCurr,bitIndex1)).
% 121.63/120.67  all VarNext (v2171(VarNext,bitIndex0)<->v2226(VarNext,bitIndex0)).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2227(VarNext)-> (all B (range_6_0(B)-> (v2226(VarNext,B)<->v2171(VarCurr,B)))))).
% 121.63/120.67  all VarNext (v2227(VarNext)-> (all B (range_6_0(B)-> (v2226(VarNext,B)<->v2188(VarNext,B))))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2227(VarNext)<->v2228(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2228(VarNext)<->v2230(VarNext)&v2173(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2230(VarNext)<->v2182(VarNext))).
% 121.63/120.67  all VarCurr (v2167(VarCurr,bitIndex0)<->v2193(VarCurr,bitIndex0)).
% 121.63/120.67  all VarNext (v2171(VarNext,bitIndex2)<->v2218(VarNext,bitIndex2)).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2219(VarNext)-> (all B (range_6_0(B)-> (v2218(VarNext,B)<->v2171(VarCurr,B)))))).
% 121.63/120.67  all VarNext (v2219(VarNext)-> (all B (range_6_0(B)-> (v2218(VarNext,B)<->v2188(VarNext,B))))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2219(VarNext)<->v2220(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2220(VarNext)<->v2222(VarNext)&v2173(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2222(VarNext)<->v2182(VarNext))).
% 121.63/120.67  all VarCurr (v2167(VarCurr,bitIndex2)<->v2193(VarCurr,bitIndex2)).
% 121.63/120.67  all VarNext (v2171(VarNext,bitIndex3)<->v2210(VarNext,bitIndex3)).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2211(VarNext)-> (all B (range_6_0(B)-> (v2210(VarNext,B)<->v2171(VarCurr,B)))))).
% 121.63/120.67  all VarNext (v2211(VarNext)-> (all B (range_6_0(B)-> (v2210(VarNext,B)<->v2188(VarNext,B))))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2211(VarNext)<->v2212(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2212(VarNext)<->v2214(VarNext)&v2173(VarNext))).
% 121.63/120.67  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2214(VarNext)<->v2182(VarNext))).
% 121.63/120.67  all VarCurr (v2167(VarCurr,bitIndex3)<->v2193(VarCurr,bitIndex3)).
% 121.63/120.67  all VarCurr (-v2194(VarCurr)-> (all B (range_6_0(B)-> (v2193(VarCurr,B)<->v2196(VarCurr,B))))).
% 121.63/120.67  all VarCurr (v2194(VarCurr)-> (all B (range_6_0(B)-> (v2193(VarCurr,B)<->v2195(VarCurr,B))))).
% 121.63/120.67  all VarCurr (-v2197(VarCurr)& -v2199(VarCurr)& -v2203(VarCurr)-> (all B (range_6_0(B)-> (v2196(VarCurr,B)<->v2171(VarCurr,B))))).
% 121.63/120.67  all VarCurr (v2203(VarCurr)-> (all B (range_6_0(B)-> (v2196(VarCurr,B)<->v2205(VarCurr,B))))).
% 121.63/120.67  all VarCurr (v2199(VarCurr)-> (all B (range_6_0(B)-> (v2196(VarCurr,B)<->v2201(VarCurr,B))))).
% 121.63/120.67  all VarCurr (v2197(VarCurr)-> (all B (range_6_0(B)-> (v2196(VarCurr,B)<->v2171(VarCurr,B))))).
% 121.63/120.67  all VarCurr (v2207(VarCurr)<-> (v2208(VarCurr,bitIndex1)<->$T)& (v2208(VarCurr,bitIndex0)<->$T)).
% 121.63/120.67  all VarCurr (v2208(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.67  all VarCurr (v2208(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.67  all VarCurr (v2205(VarCurr,bitIndex0)<->$F).
% 121.63/120.67  all VarCurr ((v2205(VarCurr,bitIndex6)<->v2171(VarCurr,bitIndex5))& (v2205(VarCurr,bitIndex5)<->v2171(VarCurr,bitIndex4))& (v2205(VarCurr,bitIndex4)<->v2171(VarCurr,bitIndex3))& (v2205(VarCurr,bitIndex3)<->v2171(VarCurr,bitIndex2))& (v2205(VarCurr,bitIndex2)<->v2171(VarCurr,bitIndex1))& (v2205(VarCurr,bitIndex1)<->v2171(VarCurr,bitIndex0))).
% 121.63/120.67  all VarCurr (v2203(VarCurr)<-> (v2204(VarCurr,bitIndex1)<->$T)& (v2204(VarCurr,bitIndex0)<->$F)).
% 121.63/120.67  all VarCurr (v2204(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.67  all VarCurr (v2204(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.68  all VarCurr ((v2201(VarCurr,bitIndex5)<->v2171(VarCurr,bitIndex6))& (v2201(VarCurr,bitIndex4)<->v2171(VarCurr,bitIndex5))& (v2201(VarCurr,bitIndex3)<->v2171(VarCurr,bitIndex4))& (v2201(VarCurr,bitIndex2)<->v2171(VarCurr,bitIndex3))& (v2201(VarCurr,bitIndex1)<->v2171(VarCurr,bitIndex2))& (v2201(VarCurr,bitIndex0)<->v2171(VarCurr,bitIndex1))).
% 121.63/120.68  all VarCurr (v2201(VarCurr,bitIndex6)<->$F).
% 121.63/120.68  all VarCurr (v2199(VarCurr)<-> (v2200(VarCurr,bitIndex1)<->$F)& (v2200(VarCurr,bitIndex0)<->$T)).
% 121.63/120.68  all VarCurr (v2200(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.68  all VarCurr (v2200(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.68  all VarCurr (v2197(VarCurr)<-> (v2198(VarCurr,bitIndex1)<->$F)& (v2198(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v2198(VarCurr,bitIndex0)<->v2155(VarCurr)).
% 121.63/120.68  all VarCurr (v2198(VarCurr,bitIndex1)<->v29(VarCurr)).
% 121.63/120.68  all VarCurr (v2195(VarCurr,bitIndex0)<->$T).
% 121.63/120.68  all VarCurr B (range_6_1(B)-> (v2195(VarCurr,B)<->v2169(VarCurr,B))).
% 121.63/120.68  all VarCurr (-v2194(VarCurr)<->v27(VarCurr)).
% 121.63/120.68  all VarCurr (v2169(VarCurr,bitIndex3)<->v2192(VarCurr,bitIndex3)).
% 121.63/120.68  all VarCurr (v2169(VarCurr,bitIndex2)<->v2192(VarCurr,bitIndex2)).
% 121.63/120.68  all VarCurr (v2169(VarCurr,bitIndex1)<->v2192(VarCurr,bitIndex1)).
% 121.63/120.68  all VarCurr (v2192(VarCurr,bitIndex0)<->$T).
% 121.63/120.68  all VarCurr B (range_6_1(B)-> (v2192(VarCurr,B)<->v2171(VarCurr,B))).
% 121.63/120.68  all VarNext (v2171(VarNext,bitIndex1)<->v2177(VarNext,bitIndex1)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2178(VarNext)-> (all B (range_6_0(B)-> (v2177(VarNext,B)<->v2171(VarCurr,B)))))).
% 121.63/120.68  all VarNext (v2178(VarNext)-> (all B (range_6_0(B)-> (v2177(VarNext,B)<->v2188(VarNext,B))))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_6_0(B)-> (v2188(VarNext,B)<->v2186(VarCurr,B))))).
% 121.63/120.68  all VarCurr (-v2189(VarCurr)-> (all B (range_6_0(B)-> (v2186(VarCurr,B)<->v2167(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2189(VarCurr)-> (all B (range_6_0(B)-> (v2186(VarCurr,B)<->b0000001(B))))).
% 121.63/120.68  all VarCurr (-v2189(VarCurr)<->v27(VarCurr)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2178(VarNext)<->v2179(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2179(VarNext)<->v2180(VarNext)&v2173(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2180(VarNext)<->v2182(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2182(VarNext)<->v2173(VarCurr))).
% 121.63/120.68  -v2171(constB0,bitIndex4).
% 121.63/120.68  -v2171(constB0,bitIndex3).
% 121.63/120.68  -v2171(constB0,bitIndex2).
% 121.63/120.68  -v2171(constB0,bitIndex1).
% 121.63/120.68  v2171(constB0,bitIndex0).
% 121.63/120.68  -bxx00001(bitIndex4).
% 121.63/120.68  -bxx00001(bitIndex3).
% 121.63/120.68  -bxx00001(bitIndex2).
% 121.63/120.68  -bxx00001(bitIndex1).
% 121.63/120.68  bxx00001(bitIndex0).
% 121.63/120.68  all VarCurr (v2173(VarCurr)<->v1252(VarCurr)).
% 121.63/120.68  all VarCurr (v2155(VarCurr)<->v2157(VarCurr)).
% 121.63/120.68  all VarCurr (-v2159(VarCurr)-> (v2157(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v2159(VarCurr)-> (v2157(VarCurr)<->$T)).
% 121.63/120.68  all VarCurr (v2159(VarCurr)<->v2160(VarCurr)|v1323(VarCurr)).
% 121.63/120.68  all VarCurr (v2160(VarCurr)<->v2161(VarCurr)&v1322(VarCurr)).
% 121.63/120.68  all VarCurr (v2161(VarCurr)<->v2053(VarCurr)&v1320(VarCurr)).
% 121.63/120.68  all VarCurr (v2139(VarCurr)<->v2141(VarCurr)|v6(VarCurr,bitIndex4)).
% 121.63/120.68  all VarCurr (v2141(VarCurr)<->v6(VarCurr,bitIndex2)|v6(VarCurr,bitIndex3)).
% 121.63/120.68  all VarCurr (v2068(VarCurr)<->v2070(VarCurr,bitIndex2)).
% 121.63/120.68  all VarCurr (v2070(VarCurr,bitIndex2)<->v2098(VarCurr,bitIndex2)).
% 121.63/120.68  all VarNext (v2074(VarNext,bitIndex1)<->v2123(VarNext,bitIndex1)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2124(VarNext)-> (all B (range_6_0(B)-> (v2123(VarNext,B)<->v2074(VarCurr,B)))))).
% 121.63/120.68  all VarNext (v2124(VarNext)-> (all B (range_6_0(B)-> (v2123(VarNext,B)<->v2092(VarNext,B))))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2124(VarNext)<->v2125(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2125(VarNext)<->v2127(VarNext)&v2077(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2127(VarNext)<->v2086(VarNext))).
% 121.63/120.68  all VarCurr (v2070(VarCurr,bitIndex1)<->v2098(VarCurr,bitIndex1)).
% 121.63/120.68  all VarNext (v2074(VarNext,bitIndex0)<->v2115(VarNext,bitIndex0)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2116(VarNext)-> (all B (range_6_0(B)-> (v2115(VarNext,B)<->v2074(VarCurr,B)))))).
% 121.63/120.68  all VarNext (v2116(VarNext)-> (all B (range_6_0(B)-> (v2115(VarNext,B)<->v2092(VarNext,B))))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2116(VarNext)<->v2117(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2117(VarNext)<->v2119(VarNext)&v2077(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2119(VarNext)<->v2086(VarNext))).
% 121.63/120.68  all VarCurr (v2070(VarCurr,bitIndex0)<->v2098(VarCurr,bitIndex0)).
% 121.63/120.68  all VarCurr (-v2099(VarCurr)-> (all B (range_6_0(B)-> (v2098(VarCurr,B)<->v2101(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2099(VarCurr)-> (all B (range_6_0(B)-> (v2098(VarCurr,B)<->v2100(VarCurr,B))))).
% 121.63/120.68  all VarCurr (-v2102(VarCurr)& -v2104(VarCurr)& -v2108(VarCurr)-> (all B (range_6_0(B)-> (v2101(VarCurr,B)<->v2074(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2108(VarCurr)-> (all B (range_6_0(B)-> (v2101(VarCurr,B)<->v2110(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2104(VarCurr)-> (all B (range_6_0(B)-> (v2101(VarCurr,B)<->v2106(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2102(VarCurr)-> (all B (range_6_0(B)-> (v2101(VarCurr,B)<->v2074(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2112(VarCurr)<-> (v2113(VarCurr,bitIndex1)<->$T)& (v2113(VarCurr,bitIndex0)<->$T)).
% 121.63/120.68  all VarCurr (v2113(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.68  all VarCurr (v2113(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.68  all VarCurr (v2110(VarCurr,bitIndex0)<->$F).
% 121.63/120.68  all VarCurr ((v2110(VarCurr,bitIndex6)<->v2074(VarCurr,bitIndex5))& (v2110(VarCurr,bitIndex5)<->v2074(VarCurr,bitIndex4))& (v2110(VarCurr,bitIndex4)<->v2074(VarCurr,bitIndex3))& (v2110(VarCurr,bitIndex3)<->v2074(VarCurr,bitIndex2))& (v2110(VarCurr,bitIndex2)<->v2074(VarCurr,bitIndex1))& (v2110(VarCurr,bitIndex1)<->v2074(VarCurr,bitIndex0))).
% 121.63/120.68  all VarCurr (v2108(VarCurr)<-> (v2109(VarCurr,bitIndex1)<->$T)& (v2109(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v2109(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.68  all VarCurr (v2109(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.68  all VarCurr ((v2106(VarCurr,bitIndex5)<->v2074(VarCurr,bitIndex6))& (v2106(VarCurr,bitIndex4)<->v2074(VarCurr,bitIndex5))& (v2106(VarCurr,bitIndex3)<->v2074(VarCurr,bitIndex4))& (v2106(VarCurr,bitIndex2)<->v2074(VarCurr,bitIndex3))& (v2106(VarCurr,bitIndex1)<->v2074(VarCurr,bitIndex2))& (v2106(VarCurr,bitIndex0)<->v2074(VarCurr,bitIndex1))).
% 121.63/120.68  all VarCurr (v2106(VarCurr,bitIndex6)<->$F).
% 121.63/120.68  all VarCurr (v2104(VarCurr)<-> (v2105(VarCurr,bitIndex1)<->$F)& (v2105(VarCurr,bitIndex0)<->$T)).
% 121.63/120.68  all VarCurr (v2105(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.68  all VarCurr (v2105(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.68  all VarCurr (v2102(VarCurr)<-> (v2103(VarCurr,bitIndex1)<->$F)& (v2103(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v2103(VarCurr,bitIndex0)<->v2064(VarCurr)).
% 121.63/120.68  all VarCurr (v2103(VarCurr,bitIndex1)<->v1185(VarCurr)).
% 121.63/120.68  all VarCurr (v2100(VarCurr,bitIndex0)<->$T).
% 121.63/120.68  all VarCurr B (range_6_1(B)-> (v2100(VarCurr,B)<->v2072(VarCurr,B))).
% 121.63/120.68  all VarCurr (-v2099(VarCurr)<->v1183(VarCurr)).
% 121.63/120.68  all VarCurr (v2072(VarCurr,bitIndex1)<->v2096(VarCurr,bitIndex1)).
% 121.63/120.68  all VarCurr (v2072(VarCurr,bitIndex2)<->v2096(VarCurr,bitIndex2)).
% 121.63/120.68  all VarCurr (v2096(VarCurr,bitIndex0)<->$T).
% 121.63/120.68  all VarCurr B (range_6_1(B)-> (v2096(VarCurr,B)<->v2074(VarCurr,B))).
% 121.63/120.68  all B (range_6_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B).
% 121.63/120.68  all VarNext (v2074(VarNext,bitIndex2)<->v2081(VarNext,bitIndex2)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2082(VarNext)-> (all B (range_6_0(B)-> (v2081(VarNext,B)<->v2074(VarCurr,B)))))).
% 121.63/120.68  all VarNext (v2082(VarNext)-> (all B (range_6_0(B)-> (v2081(VarNext,B)<->v2092(VarNext,B))))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_6_0(B)-> (v2092(VarNext,B)<->v2090(VarCurr,B))))).
% 121.63/120.68  all VarCurr (-v2093(VarCurr)-> (all B (range_6_0(B)-> (v2090(VarCurr,B)<->v2070(VarCurr,B))))).
% 121.63/120.68  all VarCurr (v2093(VarCurr)-> (all B (range_6_0(B)-> (v2090(VarCurr,B)<->b0000001(B))))).
% 121.63/120.68  -b0000001(bitIndex6).
% 121.63/120.68  -b0000001(bitIndex5).
% 121.63/120.68  -b0000001(bitIndex4).
% 121.63/120.68  -b0000001(bitIndex3).
% 121.63/120.68  -b0000001(bitIndex2).
% 121.63/120.68  -b0000001(bitIndex1).
% 121.63/120.68  b0000001(bitIndex0).
% 121.63/120.68  all VarCurr (-v2093(VarCurr)<->v1183(VarCurr)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2082(VarNext)<->v2083(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2083(VarNext)<->v2084(VarNext)&v2077(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2084(VarNext)<->v2086(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2086(VarNext)<->v2077(VarCurr))).
% 121.63/120.68  -v2074(constB0,bitIndex3).
% 121.63/120.68  -v2074(constB0,bitIndex2).
% 121.63/120.68  -v2074(constB0,bitIndex1).
% 121.63/120.68  v2074(constB0,bitIndex0).
% 121.63/120.68  -bxxx0001(bitIndex3).
% 121.63/120.68  -bxxx0001(bitIndex2).
% 121.63/120.68  -bxxx0001(bitIndex1).
% 121.63/120.68  bxxx0001(bitIndex0).
% 121.63/120.68  all VarCurr (v2077(VarCurr)<->v801(VarCurr)).
% 121.63/120.68  all VarCurr (v2064(VarCurr)<->v1204(VarCurr)).
% 121.63/120.68  all VarCurr (v1185(VarCurr)<->v1187(VarCurr)).
% 121.63/120.68  all VarCurr (v1187(VarCurr)<->v1189(VarCurr)).
% 121.63/120.68  all VarCurr (v1189(VarCurr)<->v1191(VarCurr)).
% 121.63/120.68  all VarCurr (-v2048(VarCurr)-> (v1191(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v2048(VarCurr)-> (v1191(VarCurr)<->$T)).
% 121.63/120.68  all VarCurr (v2048(VarCurr)<->v2049(VarCurr)|v2059(VarCurr)).
% 121.63/120.68  all VarCurr (v2059(VarCurr)<->v2060(VarCurr)&v1348(VarCurr)).
% 121.63/120.68  all VarCurr (v2060(VarCurr)<->v2061(VarCurr)|v1346(VarCurr)).
% 121.63/120.68  all VarCurr (v2061(VarCurr)<-> (v2062(VarCurr,bitIndex1)<->$T)& (v2062(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v2062(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.63/120.68  all VarCurr (v2062(VarCurr,bitIndex1)<->v1193(VarCurr)).
% 121.63/120.68  all VarCurr (v2049(VarCurr)<->v2050(VarCurr)|v2058(VarCurr)).
% 121.63/120.68  all VarCurr (v2058(VarCurr)<->v1342(VarCurr)&v1344(VarCurr)).
% 121.63/120.68  all VarCurr (v2050(VarCurr)<->v2051(VarCurr)|v2054(VarCurr)).
% 121.63/120.68  all VarCurr (v2054(VarCurr)<->v2055(VarCurr)&v1333(VarCurr)).
% 121.63/120.68  all VarCurr (v2055(VarCurr)<->v2056(VarCurr)|v1331(VarCurr)).
% 121.63/120.68  all VarCurr (v2056(VarCurr)<-> (v2057(VarCurr,bitIndex1)<->$T)& (v2057(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v2057(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.63/120.68  all VarCurr (v2057(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.63/120.68  all VarCurr (v2051(VarCurr)<->v2052(VarCurr)&v1322(VarCurr)).
% 121.63/120.68  all VarCurr (v2052(VarCurr)<->v2053(VarCurr)&v1320(VarCurr)).
% 121.63/120.68  all VarCurr (v2053(VarCurr)<-> (v21(VarCurr,bitIndex1)<->$F)& (v21(VarCurr,bitIndex0)<->$F)).
% 121.63/120.68  all VarCurr (v1272(VarCurr)<->v1274(VarCurr)).
% 121.63/120.68  all VarCurr (v1274(VarCurr)<->v1276(VarCurr)).
% 121.63/120.68  all VarCurr (v1276(VarCurr)<->v1278(VarCurr)).
% 121.63/120.68  all VarCurr (v1278(VarCurr)<->v1280(VarCurr)).
% 121.63/120.68  all VarCurr (v1280(VarCurr)<->v1282(VarCurr)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2035(VarNext)-> (v1282(VarNext)<->v1282(VarCurr)))).
% 121.63/120.68  all VarNext (v2035(VarNext)-> (v1282(VarNext)<->v2043(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2043(VarNext)<->v2041(VarCurr))).
% 121.63/120.68  all VarCurr (-v2044(VarCurr)-> (v2041(VarCurr)<->v1288(VarCurr))).
% 121.63/120.68  all VarCurr (v2044(VarCurr)-> (v2041(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (-v2044(VarCurr)<->v1284(VarCurr)).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2035(VarNext)<->v2036(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2036(VarNext)<->v2037(VarNext)&v1942(VarNext))).
% 121.63/120.68  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2037(VarNext)<->v1949(VarNext))).
% 121.63/120.68  all VarCurr (-v1994(VarCurr)-> (v1288(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v1994(VarCurr)-> (v1288(VarCurr)<->v2018(VarCurr))).
% 121.63/120.68  all VarCurr (-v1400(VarCurr)-> (v2018(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v1400(VarCurr)-> (v2018(VarCurr)<->v2019(VarCurr))).
% 121.63/120.68  all VarCurr (v2025(VarCurr)<->v2027(VarCurr)|v2007(VarCurr)).
% 121.63/120.68  all VarCurr (v2027(VarCurr)<->v2028(VarCurr)|v2006(VarCurr)).
% 121.63/120.68  all VarCurr (v2028(VarCurr)<->v2029(VarCurr)|v2005(VarCurr)).
% 121.63/120.68  all VarCurr (v2029(VarCurr)<->v2030(VarCurr)|v1977(VarCurr)).
% 121.63/120.68  all VarCurr (v2030(VarCurr)<->v2031(VarCurr)|v1976(VarCurr)).
% 121.63/120.68  all VarCurr (v2031(VarCurr)<->v2032(VarCurr)|v1975(VarCurr)).
% 121.63/120.68  all VarCurr (v2032(VarCurr)<->v1962(VarCurr)|v1974(VarCurr)).
% 121.63/120.68  all VarCurr (v1962(VarCurr)<->v1963(VarCurr)|v1968(VarCurr)).
% 121.63/120.68  all VarCurr (-v1294(VarCurr)-> (v2019(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v1294(VarCurr)-> (v2019(VarCurr)<->v2020(VarCurr))).
% 121.63/120.68  all VarCurr (-v2021(VarCurr)-> (v2020(VarCurr)<->$T)).
% 121.63/120.68  all VarCurr (v2021(VarCurr)-> (v2020(VarCurr)<->$F)).
% 121.63/120.68  all VarCurr (v2021(VarCurr)<->v2022(VarCurr)&v1381(VarCurr)).
% 121.63/120.68  all VarCurr (v2022(VarCurr)<->v2023(VarCurr)|v2024(VarCurr)).
% 121.63/120.68  all VarCurr (v2024(VarCurr)<-> (v1364(VarCurr,bitIndex3)<->$T)& (v1364(VarCurr,bitIndex2)<->$T)& (v1364(VarCurr,bitIndex1)<->$F)& (v1364(VarCurr,bitIndex0)<->$T)).
% 121.63/120.68  all VarCurr (v2023(VarCurr)<-> (v1364(VarCurr,bitIndex3)<->$F)& (v1364(VarCurr,bitIndex2)<->$T)& (v1364(VarCurr,bitIndex1)<->$F)& (v1364(VarCurr,bitIndex0)<->$T)).
% 121.72/120.69  all VarCurr (v1994(VarCurr)<->v1995(VarCurr)|v2007(VarCurr)).
% 121.72/120.69  all VarCurr (-v2007(VarCurr)<->v2008(VarCurr)).
% 121.72/120.69  all VarCurr (v2008(VarCurr)<->v2009(VarCurr)|v1406(VarCurr)).
% 121.72/120.69  all VarCurr (v2009(VarCurr)<->v2010(VarCurr)|v1977(VarCurr)).
% 121.72/120.69  all VarCurr (v2010(VarCurr)<->v2011(VarCurr)|v1976(VarCurr)).
% 121.72/120.69  all VarCurr (v2011(VarCurr)<->v2012(VarCurr)|v1975(VarCurr)).
% 121.72/120.69  all VarCurr (v2012(VarCurr)<->v2013(VarCurr)|v1974(VarCurr)).
% 121.72/120.69  all VarCurr (v2013(VarCurr)<->v2014(VarCurr)|v1403(VarCurr)).
% 121.72/120.69  all VarCurr (v2014(VarCurr)<->v2015(VarCurr)|v1968(VarCurr)).
% 121.72/120.69  all VarCurr (v2015(VarCurr)<->v2016(VarCurr)|v1967(VarCurr)).
% 121.72/120.69  all VarCurr (v2016(VarCurr)<->v2017(VarCurr)|v1966(VarCurr)).
% 121.72/120.69  all VarCurr (v2017(VarCurr)<->v1400(VarCurr)|v1965(VarCurr)).
% 121.72/120.69  all VarCurr (v1995(VarCurr)<->v1996(VarCurr)|v2006(VarCurr)).
% 121.72/120.69  all VarCurr (v2006(VarCurr)<->v1405(VarCurr)&v1406(VarCurr)).
% 121.72/120.69  all VarCurr (v1996(VarCurr)<->v1997(VarCurr)|v1977(VarCurr)).
% 121.72/120.69  all VarCurr (v1997(VarCurr)<->v1998(VarCurr)|v1976(VarCurr)).
% 121.72/120.69  all VarCurr (v1998(VarCurr)<->v1999(VarCurr)|v1975(VarCurr)).
% 121.72/120.69  all VarCurr (v1999(VarCurr)<->v2000(VarCurr)|v1974(VarCurr)).
% 121.72/120.69  all VarCurr (v2000(VarCurr)<->v2001(VarCurr)|v2005(VarCurr)).
% 121.72/120.69  all VarCurr (v2005(VarCurr)<->v1402(VarCurr)&v1403(VarCurr)).
% 121.72/120.69  all VarCurr (v2001(VarCurr)<->v2002(VarCurr)|v1968(VarCurr)).
% 121.72/120.69  all VarCurr (v2002(VarCurr)<->v2003(VarCurr)|v1967(VarCurr)).
% 121.72/120.69  all VarCurr (v2003(VarCurr)<->v2004(VarCurr)|v1966(VarCurr)).
% 121.72/120.69  all VarCurr (v2004(VarCurr)<->v1400(VarCurr)|v1965(VarCurr)).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1981(VarNext)-> (all B (range_3_0(B)-> (v1290(VarNext,B)<->v1290(VarCurr,B)))))).
% 121.72/120.69  all VarNext (v1981(VarNext)-> (all B (range_3_0(B)-> (v1290(VarNext,B)<->v1989(VarNext,B))))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1989(VarNext,B)<->v1987(VarCurr,B))))).
% 121.72/120.69  all VarCurr (-v1990(VarCurr)-> (all B (range_3_0(B)-> (v1987(VarCurr,B)<->v1292(VarCurr,B))))).
% 121.72/120.69  all VarCurr (v1990(VarCurr)-> (all B (range_3_0(B)-> (v1987(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (-v1990(VarCurr)<->v1284(VarCurr)).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1981(VarNext)<->v1982(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1982(VarNext)<->v1983(VarNext)&v1942(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1983(VarNext)<->v1949(VarNext))).
% 121.72/120.69  all VarCurr (-v1400(VarCurr)& -v1961(VarCurr)& -v1403(VarCurr)& -v1970(VarCurr)& -v1406(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1406(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->v1978(VarCurr,B))))).
% 121.72/120.69  all VarCurr (v1970(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1403(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->v1969(VarCurr,B))))).
% 121.72/120.69  all VarCurr (v1961(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1400(VarCurr)-> (all B (range_3_0(B)-> (v1292(VarCurr,B)<->v1960(VarCurr,B))))).
% 121.72/120.69  all VarCurr (-v1405(VarCurr)-> (all B (range_3_0(B)-> (v1978(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1405(VarCurr)-> (all B (range_3_0(B)-> (v1978(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1970(VarCurr)<->v1972(VarCurr)|v1977(VarCurr)).
% 121.72/120.69  all VarCurr (v1977(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$T)& (v1290(VarCurr,bitIndex2)<->$T)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$F)).
% 121.72/120.69  all VarCurr (v1972(VarCurr)<->v1973(VarCurr)|v1976(VarCurr)).
% 121.72/120.69  all VarCurr (v1976(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$T)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$T)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.69  all VarCurr (v1973(VarCurr)<->v1974(VarCurr)|v1975(VarCurr)).
% 121.72/120.69  all VarCurr (v1975(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$T)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$T)& (v1290(VarCurr,bitIndex0)<->$F)).
% 121.72/120.69  all VarCurr (v1974(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$T)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.69  all VarCurr (-v1402(VarCurr)-> (all B (range_3_0(B)-> (v1969(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1402(VarCurr)-> (all B (range_3_0(B)-> (v1969(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1961(VarCurr)<->v1963(VarCurr)|v1968(VarCurr)).
% 121.72/120.69  all VarCurr (v1968(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$T)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$F)).
% 121.72/120.69  all VarCurr (v1963(VarCurr)<->v1964(VarCurr)|v1967(VarCurr)).
% 121.72/120.69  all VarCurr (v1967(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$T)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.69  all VarCurr (v1964(VarCurr)<->v1965(VarCurr)|v1966(VarCurr)).
% 121.72/120.69  all VarCurr (v1966(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$T)& (v1290(VarCurr,bitIndex0)<->$F)).
% 121.72/120.69  all VarCurr (v1965(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.69  all VarCurr (-v1294(VarCurr)-> (all B (range_3_0(B)-> (v1960(VarCurr,B)<->$F)))).
% 121.72/120.69  all VarCurr (v1294(VarCurr)-> (all B (range_3_0(B)-> (v1960(VarCurr,B)<->v1364(VarCurr,B))))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1945(VarNext)-> (v1379(VarNext)<->v1379(VarCurr)))).
% 121.72/120.69  all VarNext (v1945(VarNext)-> (v1379(VarNext)<->v1955(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1955(VarNext)<->v1953(VarCurr))).
% 121.72/120.69  all VarCurr (-v1956(VarCurr)-> (v1953(VarCurr)<->v1381(VarCurr))).
% 121.72/120.69  all VarCurr (v1956(VarCurr)-> (v1953(VarCurr)<->$F)).
% 121.72/120.69  all VarCurr (-v1956(VarCurr)<->v1284(VarCurr)).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1945(VarNext)<->v1946(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1946(VarNext)<->v1947(VarNext)&v1942(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1947(VarNext)<->v1949(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1949(VarNext)<->v1942(VarCurr))).
% 121.72/120.69  all VarCurr (v1942(VarCurr)<->v1624(VarCurr)).
% 121.72/120.69  all VarCurr (v1381(VarCurr)<->v1383(VarCurr)).
% 121.72/120.69  all VarCurr (v1383(VarCurr)<->v1385(VarCurr)).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1920(VarNext)-> (v1385(VarNext)<->v1385(VarCurr)))).
% 121.72/120.69  all VarNext (v1920(VarNext)-> (v1385(VarNext)<->v1937(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1937(VarNext)<->v1935(VarCurr))).
% 121.72/120.69  all VarCurr (-v1929(VarCurr)-> (v1935(VarCurr)<->v1938(VarCurr))).
% 121.72/120.69  all VarCurr (v1929(VarCurr)-> (v1935(VarCurr)<->$F)).
% 121.72/120.69  all VarCurr (-v1389(VarCurr)-> (v1938(VarCurr)<->$F)).
% 121.72/120.69  all VarCurr (v1389(VarCurr)-> (v1938(VarCurr)<->v1654(VarCurr))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1920(VarNext)<->v1921(VarNext)&v1928(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1928(VarNext)<->v1926(VarCurr))).
% 121.72/120.69  all VarCurr (v1926(VarCurr)<->v1929(VarCurr)|v1930(VarCurr)).
% 121.72/120.69  all VarCurr (v1930(VarCurr)<->v1931(VarCurr)&v1934(VarCurr)).
% 121.72/120.69  all VarCurr (-v1934(VarCurr)<->v1929(VarCurr)).
% 121.72/120.69  all VarCurr (v1931(VarCurr)<->v1389(VarCurr)|v1932(VarCurr)).
% 121.72/120.69  all VarCurr (v1932(VarCurr)<->v1410(VarCurr)&v1933(VarCurr)).
% 121.72/120.69  all VarCurr (-v1933(VarCurr)<->v1389(VarCurr)).
% 121.72/120.69  all VarCurr (-v1929(VarCurr)<->v1387(VarCurr)).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1921(VarNext)<->v1922(VarNext)&v1868(VarNext))).
% 121.72/120.69  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1922(VarNext)<->v1875(VarNext))).
% 121.72/120.69  all VarCurr (-v1654(VarCurr)<->v1886(VarCurr)).
% 121.72/120.69  all VarCurr (v1886(VarCurr)<->v1888(VarCurr)|v1903(VarCurr)).
% 121.72/120.69  all VarCurr (v1903(VarCurr)<->v1904(VarCurr)|v1911(VarCurr)).
% 121.72/120.69  all VarCurr (v1911(VarCurr)<->v1912(VarCurr)|v1915(VarCurr)).
% 121.72/120.69  all VarCurr (v1915(VarCurr)<->v1916(VarCurr)|v1917(VarCurr)).
% 121.72/120.69  all VarCurr (v1917(VarCurr)<->v1656(VarCurr,bitIndex30)|v1656(VarCurr,bitIndex31)).
% 121.72/120.69  all VarCurr (v1916(VarCurr)<->v1656(VarCurr,bitIndex28)|v1656(VarCurr,bitIndex29)).
% 121.72/120.69  all VarCurr (v1912(VarCurr)<->v1913(VarCurr)|v1914(VarCurr)).
% 121.72/120.69  all VarCurr (v1914(VarCurr)<->v1656(VarCurr,bitIndex26)|v1656(VarCurr,bitIndex27)).
% 121.72/120.69  all VarCurr (v1913(VarCurr)<->v1656(VarCurr,bitIndex24)|v1656(VarCurr,bitIndex25)).
% 121.72/120.69  all VarCurr (v1904(VarCurr)<->v1905(VarCurr)|v1908(VarCurr)).
% 121.72/120.69  all VarCurr (v1908(VarCurr)<->v1909(VarCurr)|v1910(VarCurr)).
% 121.72/120.70  all VarCurr (v1910(VarCurr)<->v1656(VarCurr,bitIndex22)|v1656(VarCurr,bitIndex23)).
% 121.72/120.70  all VarCurr (v1909(VarCurr)<->v1656(VarCurr,bitIndex20)|v1656(VarCurr,bitIndex21)).
% 121.72/120.70  all VarCurr (v1905(VarCurr)<->v1906(VarCurr)|v1907(VarCurr)).
% 121.72/120.70  all VarCurr (v1907(VarCurr)<->v1656(VarCurr,bitIndex18)|v1656(VarCurr,bitIndex19)).
% 121.72/120.70  all VarCurr (v1906(VarCurr)<->v1656(VarCurr,bitIndex16)|v1656(VarCurr,bitIndex17)).
% 121.72/120.70  all VarCurr (v1888(VarCurr)<->v1889(VarCurr)|v1896(VarCurr)).
% 121.72/120.70  all VarCurr (v1896(VarCurr)<->v1897(VarCurr)|v1900(VarCurr)).
% 121.72/120.70  all VarCurr (v1900(VarCurr)<->v1901(VarCurr)|v1902(VarCurr)).
% 121.72/120.70  all VarCurr (v1902(VarCurr)<->v1656(VarCurr,bitIndex14)|v1656(VarCurr,bitIndex15)).
% 121.72/120.70  all VarCurr (v1901(VarCurr)<->v1656(VarCurr,bitIndex12)|v1656(VarCurr,bitIndex13)).
% 121.72/120.70  all VarCurr (v1897(VarCurr)<->v1898(VarCurr)|v1899(VarCurr)).
% 121.72/120.70  all VarCurr (v1899(VarCurr)<->v1656(VarCurr,bitIndex10)|v1656(VarCurr,bitIndex11)).
% 121.72/120.70  all VarCurr (v1898(VarCurr)<->v1656(VarCurr,bitIndex8)|v1656(VarCurr,bitIndex9)).
% 121.72/120.70  all VarCurr (v1889(VarCurr)<->v1890(VarCurr)|v1893(VarCurr)).
% 121.72/120.70  all VarCurr (v1893(VarCurr)<->v1894(VarCurr)|v1895(VarCurr)).
% 121.72/120.70  all VarCurr (v1895(VarCurr)<->v1656(VarCurr,bitIndex6)|v1656(VarCurr,bitIndex7)).
% 121.72/120.70  all VarCurr (v1894(VarCurr)<->v1656(VarCurr,bitIndex4)|v1656(VarCurr,bitIndex5)).
% 121.72/120.70  all VarCurr (v1890(VarCurr)<->v1891(VarCurr)|v1892(VarCurr)).
% 121.72/120.70  all VarCurr (v1892(VarCurr)<->v1656(VarCurr,bitIndex2)|v1656(VarCurr,bitIndex3)).
% 121.72/120.70  all VarCurr (v1891(VarCurr)<->v1656(VarCurr,bitIndex0)|v1656(VarCurr,bitIndex1)).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1871(VarNext)-> (all B (range_31_0(B)-> (v1656(VarNext,B)<->v1656(VarCurr,B)))))).
% 121.72/120.70  all VarNext (v1871(VarNext)-> (all B (range_31_0(B)-> (v1656(VarNext,B)<->v1881(VarNext,B))))).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_31_0(B)-> (v1881(VarNext,B)<->v1879(VarCurr,B))))).
% 121.72/120.70  all VarCurr (-v1882(VarCurr)-> (all B (range_31_0(B)-> (v1879(VarCurr,B)<->v1659(VarCurr,B))))).
% 121.72/120.70  all VarCurr (v1882(VarCurr)-> (all B (range_31_0(B)-> (v1879(VarCurr,B)<->b11111111111111111111111111111110(B))))).
% 121.72/120.70  all VarCurr (-v1882(VarCurr)<->v1387(VarCurr)).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1871(VarNext)<->v1872(VarNext))).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1872(VarNext)<->v1873(VarNext)&v1868(VarNext))).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1873(VarNext)<->v1875(VarNext))).
% 121.72/120.70  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1875(VarNext)<->v1868(VarCurr))).
% 121.72/120.70  all VarCurr (v1868(VarCurr)<->v1624(VarCurr)).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1659(VarCurr,B)<->v1865(VarCurr,B)&v1866(VarCurr,B))).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1866(VarCurr,B)<-> -v1853(VarCurr,B))).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1865(VarCurr,B)<->v1661(VarCurr,B)|v1656(VarCurr,B))).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1853(VarCurr,B)<->v1855(VarCurr,B)&v1862(VarCurr,B))).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex0)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex1)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex2)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex3)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex4)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex5)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex6)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex7)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex8)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex9)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex10)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex11)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex12)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex13)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex14)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex15)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex16)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex17)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex18)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex19)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex20)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex21)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex22)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex23)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex24)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex25)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex26)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex27)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex28)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex29)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex30)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1862(VarCurr,bitIndex31)<->v1863(VarCurr)).
% 121.72/120.70  all VarCurr (v1863(VarCurr)<->v1389(VarCurr)).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1855(VarCurr,B)<->v1656(VarCurr,B)&v1860(VarCurr,B))).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1860(VarCurr,B)<-> -v1857(VarCurr,B))).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex1)<->v1857(VarCurr,bitIndex0)|v1656(VarCurr,bitIndex0)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex2)<->v1857(VarCurr,bitIndex1)|v1656(VarCurr,bitIndex1)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex3)<->v1857(VarCurr,bitIndex2)|v1656(VarCurr,bitIndex2)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex4)<->v1857(VarCurr,bitIndex3)|v1656(VarCurr,bitIndex3)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex5)<->v1857(VarCurr,bitIndex4)|v1656(VarCurr,bitIndex4)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex6)<->v1857(VarCurr,bitIndex5)|v1656(VarCurr,bitIndex5)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex7)<->v1857(VarCurr,bitIndex6)|v1656(VarCurr,bitIndex6)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex8)<->v1857(VarCurr,bitIndex7)|v1656(VarCurr,bitIndex7)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex9)<->v1857(VarCurr,bitIndex8)|v1656(VarCurr,bitIndex8)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex10)<->v1857(VarCurr,bitIndex9)|v1656(VarCurr,bitIndex9)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex11)<->v1857(VarCurr,bitIndex10)|v1656(VarCurr,bitIndex10)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex12)<->v1857(VarCurr,bitIndex11)|v1656(VarCurr,bitIndex11)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex13)<->v1857(VarCurr,bitIndex12)|v1656(VarCurr,bitIndex12)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex14)<->v1857(VarCurr,bitIndex13)|v1656(VarCurr,bitIndex13)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex15)<->v1857(VarCurr,bitIndex14)|v1656(VarCurr,bitIndex14)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex16)<->v1857(VarCurr,bitIndex15)|v1656(VarCurr,bitIndex15)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex17)<->v1857(VarCurr,bitIndex16)|v1656(VarCurr,bitIndex16)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex18)<->v1857(VarCurr,bitIndex17)|v1656(VarCurr,bitIndex17)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex19)<->v1857(VarCurr,bitIndex18)|v1656(VarCurr,bitIndex18)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex20)<->v1857(VarCurr,bitIndex19)|v1656(VarCurr,bitIndex19)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex21)<->v1857(VarCurr,bitIndex20)|v1656(VarCurr,bitIndex20)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex22)<->v1857(VarCurr,bitIndex21)|v1656(VarCurr,bitIndex21)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex23)<->v1857(VarCurr,bitIndex22)|v1656(VarCurr,bitIndex22)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex24)<->v1857(VarCurr,bitIndex23)|v1656(VarCurr,bitIndex23)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex25)<->v1857(VarCurr,bitIndex24)|v1656(VarCurr,bitIndex24)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex26)<->v1857(VarCurr,bitIndex25)|v1656(VarCurr,bitIndex25)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex27)<->v1857(VarCurr,bitIndex26)|v1656(VarCurr,bitIndex26)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex28)<->v1857(VarCurr,bitIndex27)|v1656(VarCurr,bitIndex27)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex29)<->v1857(VarCurr,bitIndex28)|v1656(VarCurr,bitIndex28)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex30)<->v1857(VarCurr,bitIndex29)|v1656(VarCurr,bitIndex29)).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex31)<->v1857(VarCurr,bitIndex30)|v1656(VarCurr,bitIndex30)).
% 121.72/120.70  v1656(constB0,bitIndex31).
% 121.72/120.70  v1656(constB0,bitIndex30).
% 121.72/120.70  v1656(constB0,bitIndex29).
% 121.72/120.70  v1656(constB0,bitIndex28).
% 121.72/120.70  v1656(constB0,bitIndex27).
% 121.72/120.70  v1656(constB0,bitIndex26).
% 121.72/120.70  v1656(constB0,bitIndex25).
% 121.72/120.70  v1656(constB0,bitIndex24).
% 121.72/120.70  v1656(constB0,bitIndex23).
% 121.72/120.70  v1656(constB0,bitIndex22).
% 121.72/120.70  v1656(constB0,bitIndex21).
% 121.72/120.70  v1656(constB0,bitIndex20).
% 121.72/120.70  v1656(constB0,bitIndex19).
% 121.72/120.70  v1656(constB0,bitIndex18).
% 121.72/120.70  v1656(constB0,bitIndex17).
% 121.72/120.70  v1656(constB0,bitIndex16).
% 121.72/120.70  v1656(constB0,bitIndex15).
% 121.72/120.70  v1656(constB0,bitIndex14).
% 121.72/120.70  v1656(constB0,bitIndex13).
% 121.72/120.70  v1656(constB0,bitIndex12).
% 121.72/120.70  v1656(constB0,bitIndex11).
% 121.72/120.70  v1656(constB0,bitIndex10).
% 121.72/120.70  v1656(constB0,bitIndex9).
% 121.72/120.70  v1656(constB0,bitIndex8).
% 121.72/120.70  v1656(constB0,bitIndex7).
% 121.72/120.70  v1656(constB0,bitIndex6).
% 121.72/120.70  v1656(constB0,bitIndex5).
% 121.72/120.70  v1656(constB0,bitIndex4).
% 121.72/120.70  v1656(constB0,bitIndex3).
% 121.72/120.70  v1656(constB0,bitIndex2).
% 121.72/120.70  v1656(constB0,bitIndex1).
% 121.72/120.70  -v1656(constB0,bitIndex0).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex31).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex30).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex29).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex28).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex27).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex26).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex25).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex24).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex23).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex22).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex21).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex20).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex19).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex18).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex17).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex16).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex15).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex14).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex13).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex12).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex11).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex10).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex9).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex8).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex7).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex6).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex5).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex4).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex3).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex2).
% 121.72/120.70  b11111111111111111111111111111110(bitIndex1).
% 121.72/120.70  -b11111111111111111111111111111110(bitIndex0).
% 121.72/120.70  all VarCurr (v1857(VarCurr,bitIndex0)<->$F).
% 121.72/120.70  all VarCurr (-v1848(VarCurr)-> (all B (range_31_0(B)-> (v1661(VarCurr,B)<->v1849(VarCurr,B))))).
% 121.72/120.70  all VarCurr (v1848(VarCurr)-> (all B (range_31_0(B)-> (v1661(VarCurr,B)<->$F)))).
% 121.72/120.70  all VarCurr B (range_31_0(B)-> (v1849(VarCurr,B)<->v1663(VarCurr,B)&v1850(VarCurr,B))).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex0)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex1)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex2)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex3)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex4)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex5)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex6)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex7)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex8)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex9)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex10)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex11)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex12)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex13)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex14)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex15)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex16)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex17)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex18)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex19)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex20)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex21)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex22)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex23)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex24)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex25)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex26)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex27)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex28)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex29)<->v1851(VarCurr)).
% 121.72/120.70  all VarCurr (v1850(VarCurr,bitIndex30)<->v1851(VarCurr)).
% 121.72/120.71  all VarCurr (v1850(VarCurr,bitIndex31)<->v1851(VarCurr)).
% 121.72/120.71  all VarCurr (v1851(VarCurr)<->v1410(VarCurr)).
% 121.72/120.71  all VarCurr (v1848(VarCurr)<->v1385(VarCurr)&v1410(VarCurr)).
% 121.72/120.71  v1385(constB0)<->$F.
% 121.72/120.71  all VarCurr (-v1846(VarCurr)-> (v1663(VarCurr,bitIndex31)<->$F)).
% 121.72/120.71  all VarCurr (v1846(VarCurr)-> (v1663(VarCurr,bitIndex31)<->v1782(VarCurr,bitIndex31))).
% 121.72/120.71  all VarCurr (v1846(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b11111(bitIndex4).
% 121.72/120.71  b11111(bitIndex3).
% 121.72/120.71  b11111(bitIndex2).
% 121.72/120.71  b11111(bitIndex1).
% 121.72/120.71  b11111(bitIndex0).
% 121.72/120.71  all VarCurr (-v1844(VarCurr)-> (v1663(VarCurr,bitIndex30)<->$F)).
% 121.72/120.71  all VarCurr (v1844(VarCurr)-> (v1663(VarCurr,bitIndex30)<->v1779(VarCurr,bitIndex30))).
% 121.72/120.71  all VarCurr (v1844(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b11110(bitIndex4).
% 121.72/120.71  b11110(bitIndex3).
% 121.72/120.71  b11110(bitIndex2).
% 121.72/120.71  b11110(bitIndex1).
% 121.72/120.71  -b11110(bitIndex0).
% 121.72/120.71  all VarCurr (-v1842(VarCurr)-> (v1663(VarCurr,bitIndex29)<->$F)).
% 121.72/120.71  all VarCurr (v1842(VarCurr)-> (v1663(VarCurr,bitIndex29)<->v1776(VarCurr,bitIndex29))).
% 121.72/120.71  all VarCurr (v1842(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b11101(bitIndex4).
% 121.72/120.71  b11101(bitIndex3).
% 121.72/120.71  b11101(bitIndex2).
% 121.72/120.71  -b11101(bitIndex1).
% 121.72/120.71  b11101(bitIndex0).
% 121.72/120.71  all VarCurr (-v1840(VarCurr)-> (v1663(VarCurr,bitIndex28)<->$F)).
% 121.72/120.71  all VarCurr (v1840(VarCurr)-> (v1663(VarCurr,bitIndex28)<->v1773(VarCurr,bitIndex28))).
% 121.72/120.71  all VarCurr (v1840(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b11100(bitIndex4).
% 121.72/120.71  b11100(bitIndex3).
% 121.72/120.71  b11100(bitIndex2).
% 121.72/120.71  -b11100(bitIndex1).
% 121.72/120.71  -b11100(bitIndex0).
% 121.72/120.71  all VarCurr (-v1838(VarCurr)-> (v1663(VarCurr,bitIndex27)<->$F)).
% 121.72/120.71  all VarCurr (v1838(VarCurr)-> (v1663(VarCurr,bitIndex27)<->v1770(VarCurr,bitIndex27))).
% 121.72/120.71  all VarCurr (v1838(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b11011(bitIndex4).
% 121.72/120.71  b11011(bitIndex3).
% 121.72/120.71  -b11011(bitIndex2).
% 121.72/120.71  b11011(bitIndex1).
% 121.72/120.71  b11011(bitIndex0).
% 121.72/120.71  all VarCurr (-v1836(VarCurr)-> (v1663(VarCurr,bitIndex26)<->$F)).
% 121.72/120.71  all VarCurr (v1836(VarCurr)-> (v1663(VarCurr,bitIndex26)<->v1767(VarCurr,bitIndex26))).
% 121.72/120.71  all VarCurr (v1836(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b11010(bitIndex4).
% 121.72/120.71  b11010(bitIndex3).
% 121.72/120.71  -b11010(bitIndex2).
% 121.72/120.71  b11010(bitIndex1).
% 121.72/120.71  -b11010(bitIndex0).
% 121.72/120.71  all VarCurr (-v1834(VarCurr)-> (v1663(VarCurr,bitIndex25)<->$F)).
% 121.72/120.71  all VarCurr (v1834(VarCurr)-> (v1663(VarCurr,bitIndex25)<->v1764(VarCurr,bitIndex25))).
% 121.72/120.71  all VarCurr (v1834(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b11001(bitIndex4).
% 121.72/120.71  b11001(bitIndex3).
% 121.72/120.71  -b11001(bitIndex2).
% 121.72/120.71  -b11001(bitIndex1).
% 121.72/120.71  b11001(bitIndex0).
% 121.72/120.71  all VarCurr (-v1832(VarCurr)-> (v1663(VarCurr,bitIndex24)<->$F)).
% 121.72/120.71  all VarCurr (v1832(VarCurr)-> (v1663(VarCurr,bitIndex24)<->v1761(VarCurr,bitIndex24))).
% 121.72/120.71  all VarCurr (v1832(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b11000(bitIndex4).
% 121.72/120.71  b11000(bitIndex3).
% 121.72/120.71  -b11000(bitIndex2).
% 121.72/120.71  -b11000(bitIndex1).
% 121.72/120.71  -b11000(bitIndex0).
% 121.72/120.71  all VarCurr (-v1830(VarCurr)-> (v1663(VarCurr,bitIndex23)<->$F)).
% 121.72/120.71  all VarCurr (v1830(VarCurr)-> (v1663(VarCurr,bitIndex23)<->v1758(VarCurr,bitIndex23))).
% 121.72/120.71  all VarCurr (v1830(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b10111(bitIndex4).
% 121.72/120.71  -b10111(bitIndex3).
% 121.72/120.71  b10111(bitIndex2).
% 121.72/120.71  b10111(bitIndex1).
% 121.72/120.71  b10111(bitIndex0).
% 121.72/120.71  all VarCurr (-v1828(VarCurr)-> (v1663(VarCurr,bitIndex22)<->$F)).
% 121.72/120.71  all VarCurr (v1828(VarCurr)-> (v1663(VarCurr,bitIndex22)<->v1755(VarCurr,bitIndex22))).
% 121.72/120.71  all VarCurr (v1828(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b10110(bitIndex4).
% 121.72/120.71  -b10110(bitIndex3).
% 121.72/120.71  b10110(bitIndex2).
% 121.72/120.71  b10110(bitIndex1).
% 121.72/120.71  -b10110(bitIndex0).
% 121.72/120.71  all VarCurr (-v1826(VarCurr)-> (v1663(VarCurr,bitIndex21)<->$F)).
% 121.72/120.71  all VarCurr (v1826(VarCurr)-> (v1663(VarCurr,bitIndex21)<->v1752(VarCurr,bitIndex21))).
% 121.72/120.71  all VarCurr (v1826(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b10101(bitIndex4).
% 121.72/120.71  -b10101(bitIndex3).
% 121.72/120.71  b10101(bitIndex2).
% 121.72/120.71  -b10101(bitIndex1).
% 121.72/120.71  b10101(bitIndex0).
% 121.72/120.71  all VarCurr (-v1824(VarCurr)-> (v1663(VarCurr,bitIndex20)<->$F)).
% 121.72/120.71  all VarCurr (v1824(VarCurr)-> (v1663(VarCurr,bitIndex20)<->v1749(VarCurr,bitIndex20))).
% 121.72/120.71  all VarCurr (v1824(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b10100(bitIndex4).
% 121.72/120.71  -b10100(bitIndex3).
% 121.72/120.71  b10100(bitIndex2).
% 121.72/120.71  -b10100(bitIndex1).
% 121.72/120.71  -b10100(bitIndex0).
% 121.72/120.71  all VarCurr (-v1822(VarCurr)-> (v1663(VarCurr,bitIndex19)<->$F)).
% 121.72/120.71  all VarCurr (v1822(VarCurr)-> (v1663(VarCurr,bitIndex19)<->v1746(VarCurr,bitIndex19))).
% 121.72/120.71  all VarCurr (v1822(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b10011(bitIndex4).
% 121.72/120.71  -b10011(bitIndex3).
% 121.72/120.71  -b10011(bitIndex2).
% 121.72/120.71  b10011(bitIndex1).
% 121.72/120.71  b10011(bitIndex0).
% 121.72/120.71  all VarCurr (-v1820(VarCurr)-> (v1663(VarCurr,bitIndex18)<->$F)).
% 121.72/120.71  all VarCurr (v1820(VarCurr)-> (v1663(VarCurr,bitIndex18)<->v1743(VarCurr,bitIndex18))).
% 121.72/120.71  all VarCurr (v1820(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b10010(bitIndex4).
% 121.72/120.71  -b10010(bitIndex3).
% 121.72/120.71  -b10010(bitIndex2).
% 121.72/120.71  b10010(bitIndex1).
% 121.72/120.71  -b10010(bitIndex0).
% 121.72/120.71  all VarCurr (-v1818(VarCurr)-> (v1663(VarCurr,bitIndex17)<->$F)).
% 121.72/120.71  all VarCurr (v1818(VarCurr)-> (v1663(VarCurr,bitIndex17)<->v1740(VarCurr,bitIndex17))).
% 121.72/120.71  all VarCurr (v1818(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  b10001(bitIndex4).
% 121.72/120.71  -b10001(bitIndex3).
% 121.72/120.71  -b10001(bitIndex2).
% 121.72/120.71  -b10001(bitIndex1).
% 121.72/120.71  b10001(bitIndex0).
% 121.72/120.71  all VarCurr (-v1816(VarCurr)-> (v1663(VarCurr,bitIndex16)<->$F)).
% 121.72/120.71  all VarCurr (v1816(VarCurr)-> (v1663(VarCurr,bitIndex16)<->v1737(VarCurr,bitIndex16))).
% 121.72/120.71  all VarCurr (v1816(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$T)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  b10000(bitIndex4).
% 121.72/120.71  -b10000(bitIndex3).
% 121.72/120.71  -b10000(bitIndex2).
% 121.72/120.71  -b10000(bitIndex1).
% 121.72/120.71  -b10000(bitIndex0).
% 121.72/120.71  all VarCurr (-v1814(VarCurr)-> (v1663(VarCurr,bitIndex15)<->$F)).
% 121.72/120.71  all VarCurr (v1814(VarCurr)-> (v1663(VarCurr,bitIndex15)<->v1734(VarCurr,bitIndex15))).
% 121.72/120.71  all VarCurr (v1814(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.71  -b01111(bitIndex4).
% 121.72/120.71  b01111(bitIndex3).
% 121.72/120.71  b01111(bitIndex2).
% 121.72/120.71  b01111(bitIndex1).
% 121.72/120.71  b01111(bitIndex0).
% 121.72/120.71  all VarCurr (-v1812(VarCurr)-> (v1663(VarCurr,bitIndex14)<->$F)).
% 121.72/120.71  all VarCurr (v1812(VarCurr)-> (v1663(VarCurr,bitIndex14)<->v1731(VarCurr,bitIndex14))).
% 121.72/120.71  all VarCurr (v1812(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.71  -b01110(bitIndex4).
% 121.72/120.71  b01110(bitIndex3).
% 121.72/120.72  b01110(bitIndex2).
% 121.72/120.72  b01110(bitIndex1).
% 121.72/120.72  -b01110(bitIndex0).
% 121.72/120.72  all VarCurr (-v1810(VarCurr)-> (v1663(VarCurr,bitIndex13)<->$F)).
% 121.72/120.72  all VarCurr (v1810(VarCurr)-> (v1663(VarCurr,bitIndex13)<->v1728(VarCurr,bitIndex13))).
% 121.72/120.72  all VarCurr (v1810(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b01101(bitIndex4).
% 121.72/120.72  b01101(bitIndex3).
% 121.72/120.72  b01101(bitIndex2).
% 121.72/120.72  -b01101(bitIndex1).
% 121.72/120.72  b01101(bitIndex0).
% 121.72/120.72  all VarCurr (-v1808(VarCurr)-> (v1663(VarCurr,bitIndex12)<->$F)).
% 121.72/120.72  all VarCurr (v1808(VarCurr)-> (v1663(VarCurr,bitIndex12)<->v1725(VarCurr,bitIndex12))).
% 121.72/120.72  all VarCurr (v1808(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b01100(bitIndex4).
% 121.72/120.72  b01100(bitIndex3).
% 121.72/120.72  b01100(bitIndex2).
% 121.72/120.72  -b01100(bitIndex1).
% 121.72/120.72  -b01100(bitIndex0).
% 121.72/120.72  all VarCurr (-v1806(VarCurr)-> (v1663(VarCurr,bitIndex11)<->$F)).
% 121.72/120.72  all VarCurr (v1806(VarCurr)-> (v1663(VarCurr,bitIndex11)<->v1722(VarCurr,bitIndex11))).
% 121.72/120.72  all VarCurr (v1806(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b01011(bitIndex4).
% 121.72/120.72  b01011(bitIndex3).
% 121.72/120.72  -b01011(bitIndex2).
% 121.72/120.72  b01011(bitIndex1).
% 121.72/120.72  b01011(bitIndex0).
% 121.72/120.72  all VarCurr (-v1804(VarCurr)-> (v1663(VarCurr,bitIndex10)<->$F)).
% 121.72/120.72  all VarCurr (v1804(VarCurr)-> (v1663(VarCurr,bitIndex10)<->v1719(VarCurr,bitIndex10))).
% 121.72/120.72  all VarCurr (v1804(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b01010(bitIndex4).
% 121.72/120.72  b01010(bitIndex3).
% 121.72/120.72  -b01010(bitIndex2).
% 121.72/120.72  b01010(bitIndex1).
% 121.72/120.72  -b01010(bitIndex0).
% 121.72/120.72  all VarCurr (-v1802(VarCurr)-> (v1663(VarCurr,bitIndex9)<->$F)).
% 121.72/120.72  all VarCurr (v1802(VarCurr)-> (v1663(VarCurr,bitIndex9)<->v1716(VarCurr,bitIndex9))).
% 121.72/120.72  all VarCurr (v1802(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b01001(bitIndex4).
% 121.72/120.72  b01001(bitIndex3).
% 121.72/120.72  -b01001(bitIndex2).
% 121.72/120.72  -b01001(bitIndex1).
% 121.72/120.72  b01001(bitIndex0).
% 121.72/120.72  all VarCurr (-v1800(VarCurr)-> (v1663(VarCurr,bitIndex8)<->$F)).
% 121.72/120.72  all VarCurr (v1800(VarCurr)-> (v1663(VarCurr,bitIndex8)<->v1713(VarCurr,bitIndex8))).
% 121.72/120.72  all VarCurr (v1800(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$T)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b01000(bitIndex4).
% 121.72/120.72  b01000(bitIndex3).
% 121.72/120.72  -b01000(bitIndex2).
% 121.72/120.72  -b01000(bitIndex1).
% 121.72/120.72  -b01000(bitIndex0).
% 121.72/120.72  all VarCurr (-v1798(VarCurr)-> (v1663(VarCurr,bitIndex7)<->$F)).
% 121.72/120.72  all VarCurr (v1798(VarCurr)-> (v1663(VarCurr,bitIndex7)<->v1710(VarCurr,bitIndex7))).
% 121.72/120.72  all VarCurr (v1798(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b00111(bitIndex4).
% 121.72/120.72  -b00111(bitIndex3).
% 121.72/120.72  b00111(bitIndex2).
% 121.72/120.72  b00111(bitIndex1).
% 121.72/120.72  b00111(bitIndex0).
% 121.72/120.72  all VarCurr (-v1796(VarCurr)-> (v1663(VarCurr,bitIndex6)<->$F)).
% 121.72/120.72  all VarCurr (v1796(VarCurr)-> (v1663(VarCurr,bitIndex6)<->v1707(VarCurr,bitIndex6))).
% 121.72/120.72  all VarCurr (v1796(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b00110(bitIndex4).
% 121.72/120.72  -b00110(bitIndex3).
% 121.72/120.72  b00110(bitIndex2).
% 121.72/120.72  b00110(bitIndex1).
% 121.72/120.72  -b00110(bitIndex0).
% 121.72/120.72  all VarCurr (-v1794(VarCurr)-> (v1663(VarCurr,bitIndex5)<->$F)).
% 121.72/120.72  all VarCurr (v1794(VarCurr)-> (v1663(VarCurr,bitIndex5)<->v1704(VarCurr,bitIndex5))).
% 121.72/120.72  all VarCurr (v1794(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b00101(bitIndex4).
% 121.72/120.72  -b00101(bitIndex3).
% 121.72/120.72  b00101(bitIndex2).
% 121.72/120.72  -b00101(bitIndex1).
% 121.72/120.72  b00101(bitIndex0).
% 121.72/120.72  all VarCurr (-v1792(VarCurr)-> (v1663(VarCurr,bitIndex4)<->$F)).
% 121.72/120.72  all VarCurr (v1792(VarCurr)-> (v1663(VarCurr,bitIndex4)<->v1701(VarCurr,bitIndex4))).
% 121.72/120.72  all VarCurr (v1792(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$T)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b00100(bitIndex4).
% 121.72/120.72  -b00100(bitIndex3).
% 121.72/120.72  b00100(bitIndex2).
% 121.72/120.72  -b00100(bitIndex1).
% 121.72/120.72  -b00100(bitIndex0).
% 121.72/120.72  all VarCurr (-v1790(VarCurr)-> (v1663(VarCurr,bitIndex3)<->$F)).
% 121.72/120.72  all VarCurr (v1790(VarCurr)-> (v1663(VarCurr,bitIndex3)<->v1698(VarCurr,bitIndex3))).
% 121.72/120.72  all VarCurr (v1790(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b00011(bitIndex4).
% 121.72/120.72  -b00011(bitIndex3).
% 121.72/120.72  -b00011(bitIndex2).
% 121.72/120.72  b00011(bitIndex1).
% 121.72/120.72  b00011(bitIndex0).
% 121.72/120.72  all VarCurr (-v1788(VarCurr)-> (v1663(VarCurr,bitIndex2)<->$F)).
% 121.72/120.72  all VarCurr (v1788(VarCurr)-> (v1663(VarCurr,bitIndex2)<->v1695(VarCurr,bitIndex2))).
% 121.72/120.72  all VarCurr (v1788(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$T)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  -b00010(bitIndex4).
% 121.72/120.72  -b00010(bitIndex3).
% 121.72/120.72  -b00010(bitIndex2).
% 121.72/120.72  b00010(bitIndex1).
% 121.72/120.72  -b00010(bitIndex0).
% 121.72/120.72  all VarCurr (-v1786(VarCurr)-> (v1663(VarCurr,bitIndex1)<->$F)).
% 121.72/120.72  all VarCurr (v1786(VarCurr)-> (v1663(VarCurr,bitIndex1)<->v1693(VarCurr,bitIndex1))).
% 121.72/120.72  all VarCurr (v1786(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$T)).
% 121.72/120.72  -b00001(bitIndex4).
% 121.72/120.72  -b00001(bitIndex3).
% 121.72/120.72  -b00001(bitIndex2).
% 121.72/120.72  -b00001(bitIndex1).
% 121.72/120.72  b00001(bitIndex0).
% 121.72/120.72  all VarCurr (-v1784(VarCurr)-> (v1663(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  all VarCurr (v1784(VarCurr)-> (v1663(VarCurr,bitIndex0)<->v1691(VarCurr,bitIndex0))).
% 121.72/120.72  all VarCurr (v1784(VarCurr)<-> (v1665(VarCurr,bitIndex4)<->$F)& (v1665(VarCurr,bitIndex3)<->$F)& (v1665(VarCurr,bitIndex2)<->$F)& (v1665(VarCurr,bitIndex1)<->$F)& (v1665(VarCurr,bitIndex0)<->$F)).
% 121.72/120.72  all VarCurr (v1782(VarCurr,bitIndex31)<->$T).
% 121.72/120.72  all VarCurr (v1779(VarCurr,bitIndex30)<->$T).
% 121.72/120.72  all VarCurr (v1776(VarCurr,bitIndex29)<->$T).
% 121.72/120.72  all VarCurr (v1773(VarCurr,bitIndex28)<->$T).
% 121.72/120.72  all VarCurr (v1770(VarCurr,bitIndex27)<->$T).
% 121.72/120.72  all VarCurr (v1767(VarCurr,bitIndex26)<->$T).
% 121.72/120.72  all VarCurr (v1764(VarCurr,bitIndex25)<->$T).
% 121.72/120.72  all VarCurr (v1761(VarCurr,bitIndex24)<->$T).
% 121.72/120.72  all VarCurr (v1758(VarCurr,bitIndex23)<->$T).
% 121.72/120.72  all VarCurr (v1755(VarCurr,bitIndex22)<->$T).
% 121.72/120.72  all VarCurr (v1752(VarCurr,bitIndex21)<->$T).
% 121.72/120.72  all VarCurr (v1749(VarCurr,bitIndex20)<->$T).
% 121.72/120.72  all VarCurr (v1746(VarCurr,bitIndex19)<->$T).
% 121.72/120.72  all VarCurr (v1743(VarCurr,bitIndex18)<->$T).
% 121.72/120.72  all VarCurr (v1740(VarCurr,bitIndex17)<->$T).
% 121.72/120.72  all VarCurr (v1737(VarCurr,bitIndex16)<->$T).
% 121.72/120.72  all VarCurr (v1734(VarCurr,bitIndex15)<->$T).
% 121.72/120.72  all VarCurr (v1731(VarCurr,bitIndex14)<->$T).
% 121.72/120.72  all VarCurr (v1728(VarCurr,bitIndex13)<->$T).
% 121.72/120.72  all VarCurr (v1725(VarCurr,bitIndex12)<->$T).
% 121.72/120.72  all VarCurr (v1722(VarCurr,bitIndex11)<->$T).
% 121.72/120.72  all VarCurr (v1719(VarCurr,bitIndex10)<->$T).
% 121.72/120.72  all VarCurr (v1716(VarCurr,bitIndex9)<->$T).
% 121.72/120.72  all VarCurr (v1713(VarCurr,bitIndex8)<->$T).
% 121.72/120.72  all VarCurr (v1710(VarCurr,bitIndex7)<->$T).
% 121.72/120.72  all VarCurr (v1707(VarCurr,bitIndex6)<->$T).
% 121.72/120.72  all VarCurr (v1704(VarCurr,bitIndex5)<->$T).
% 121.72/120.72  all VarCurr (v1701(VarCurr,bitIndex4)<->$T).
% 121.72/120.72  all VarCurr (v1698(VarCurr,bitIndex3)<->$T).
% 121.72/120.72  all VarCurr (v1695(VarCurr,bitIndex2)<->$T).
% 121.72/120.72  all VarCurr (v1693(VarCurr,bitIndex1)<->$T).
% 121.72/120.72  all VarCurr (v1691(VarCurr,bitIndex0)<->$T).
% 121.72/120.72  all VarCurr B (range_4_0(B)-> (v1665(VarCurr,B)<->v1667(VarCurr,B))).
% 121.72/120.72  all VarCurr B (range_4_0(B)-> (v1667(VarCurr,B)<->v1669(VarCurr,B))).
% 121.72/120.72  all VarCurr B (range_4_0(B)-> (v1669(VarCurr,B)<->v1671(VarCurr,B))).
% 121.72/120.72  all VarCurr (-v1687(VarCurr)-> (all B (range_4_0(B)-> (v1671(VarCurr,B)<->v1675(VarCurr,B))))).
% 121.72/120.72  all VarCurr (v1687(VarCurr)-> (all B (range_4_0(B)-> (v1671(VarCurr,B)<->v1673(VarCurr,B))))).
% 121.72/120.72  all VarCurr (v1687(VarCurr)<->v1688(VarCurr)|v1689(VarCurr)).
% 121.72/120.72  all VarCurr (v1689(VarCurr)<-> (v1579(VarCurr,bitIndex3)<->$T)& (v1579(VarCurr,bitIndex2)<->$T)& (v1579(VarCurr,bitIndex1)<->$F)& (v1579(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  all VarCurr (v1688(VarCurr)<-> (v1579(VarCurr,bitIndex3)<->$F)& (v1579(VarCurr,bitIndex2)<->$T)& (v1579(VarCurr,bitIndex1)<->$F)& (v1579(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1675(VarCurr,B)<->v1677(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1677(VarCurr,B)<->v1679(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1679(VarCurr,B)<->v1681(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1681(VarCurr,B)<->v1683(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1683(VarCurr,B)<->v1685(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_4_0(B)-> (v1673(VarCurr,B)<->$F)).
% 121.72/120.73  all VarCurr (v1410(VarCurr)<->v1412(VarCurr)).
% 121.72/120.73  all VarCurr (v1412(VarCurr)<->v1414(VarCurr)).
% 121.72/120.73  all VarCurr (-v1642(VarCurr)-> (v1414(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1642(VarCurr)-> (v1414(VarCurr)<->v1648(VarCurr))).
% 121.72/120.73  all VarCurr (-v1646(VarCurr)-> (v1648(VarCurr)<->$T)).
% 121.72/120.73  all VarCurr (v1646(VarCurr)-> (v1648(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1649(VarCurr)<->v1651(VarCurr)|v1616(VarCurr)).
% 121.72/120.73  all VarCurr (v1651(VarCurr)<->v1652(VarCurr)|v1615(VarCurr)).
% 121.72/120.73  all VarCurr (v1652(VarCurr)<->v1604(VarCurr)|v1605(VarCurr)).
% 121.72/120.73  all VarCurr (v1642(VarCurr)<->v1643(VarCurr)|v1616(VarCurr)).
% 121.72/120.73  all VarCurr (v1643(VarCurr)<->v1644(VarCurr)|v1615(VarCurr)).
% 121.72/120.73  all VarCurr (v1644(VarCurr)<->v1645(VarCurr)|v1605(VarCurr)).
% 121.72/120.73  all VarCurr (v1645(VarCurr)<->v1646(VarCurr)|v1604(VarCurr)).
% 121.72/120.73  all VarCurr (v1646(VarCurr)<->v1647(VarCurr)&v1597(VarCurr)).
% 121.72/120.73  all VarCurr (-v1647(VarCurr)<->v1416(VarCurr)).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1627(VarNext)-> (all B (range_3_0(B)-> (v1573(VarNext,B)<->v1573(VarCurr,B)))))).
% 121.72/120.73  all VarNext (v1627(VarNext)-> (all B (range_3_0(B)-> (v1573(VarNext,B)<->v1637(VarNext,B))))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1637(VarNext,B)<->v1635(VarCurr,B))))).
% 121.72/120.73  all VarCurr (-v1638(VarCurr)-> (all B (range_3_0(B)-> (v1635(VarCurr,B)<->v1577(VarCurr,B))))).
% 121.72/120.73  all VarCurr (v1638(VarCurr)-> (all B (range_3_0(B)-> (v1635(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (-v1638(VarCurr)<->v1575(VarCurr)).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1627(VarNext)<->v1628(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1628(VarNext)<->v1629(VarNext)&v1622(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1629(VarNext)<->v1631(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1631(VarNext)<->v1622(VarCurr))).
% 121.72/120.73  all VarCurr (v1622(VarCurr)<->v1624(VarCurr)).
% 121.72/120.73  all VarCurr (v1624(VarCurr)<->v1(VarCurr)).
% 121.72/120.73  all VarCurr (-v1597(VarCurr)& -v1599(VarCurr)& -v1607(VarCurr)& -v1610(VarCurr)& -v1618(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1618(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->v1619(VarCurr,B))))).
% 121.72/120.73  all VarCurr (v1610(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1607(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->v1608(VarCurr,B))))).
% 121.72/120.73  all VarCurr (v1599(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1597(VarCurr)-> (all B (range_3_0(B)-> (v1577(VarCurr,B)<->v1598(VarCurr,B))))).
% 121.72/120.73  all VarCurr (-v1620(VarCurr)-> (all B (range_3_0(B)-> (v1619(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1620(VarCurr)-> (all B (range_3_0(B)-> (v1619(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (-v1620(VarCurr)<->v1591(VarCurr)).
% 121.72/120.73  all VarCurr (v1618(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$T)& (v1573(VarCurr,bitIndex2)<->$T)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  all VarCurr (v1610(VarCurr)<->v1612(VarCurr)|v1617(VarCurr)).
% 121.72/120.73  all VarCurr (v1617(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$T)& (v1573(VarCurr,bitIndex2)<->$T)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  b1100(bitIndex3).
% 121.72/120.73  b1100(bitIndex2).
% 121.72/120.73  -b1100(bitIndex1).
% 121.72/120.73  -b1100(bitIndex0).
% 121.72/120.73  all VarCurr (v1612(VarCurr)<->v1613(VarCurr)|v1616(VarCurr)).
% 121.72/120.73  all VarCurr (v1616(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$T)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$T)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  b1011(bitIndex3).
% 121.72/120.73  -b1011(bitIndex2).
% 121.72/120.73  b1011(bitIndex1).
% 121.72/120.73  b1011(bitIndex0).
% 121.72/120.73  all VarCurr (v1613(VarCurr)<->v1614(VarCurr)|v1615(VarCurr)).
% 121.72/120.73  all VarCurr (v1615(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$T)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$T)& (v1573(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  b1010(bitIndex3).
% 121.72/120.73  -b1010(bitIndex2).
% 121.72/120.73  b1010(bitIndex1).
% 121.72/120.73  -b1010(bitIndex0).
% 121.72/120.73  all VarCurr (v1614(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$T)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  b1001(bitIndex3).
% 121.72/120.73  -b1001(bitIndex2).
% 121.72/120.73  -b1001(bitIndex1).
% 121.72/120.73  b1001(bitIndex0).
% 121.72/120.73  all VarCurr (-v1609(VarCurr)-> (all B (range_3_0(B)-> (v1608(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1609(VarCurr)-> (all B (range_3_0(B)-> (v1608(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (-v1609(VarCurr)<->v1591(VarCurr)).
% 121.72/120.73  v1591(constB0)<->$F.
% 121.72/120.73  all VarCurr (v1607(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$T)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  all VarCurr (v1599(VarCurr)<->v1601(VarCurr)|v1606(VarCurr)).
% 121.72/120.73  all VarCurr (v1606(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$T)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  -b0100(bitIndex3).
% 121.72/120.73  b0100(bitIndex2).
% 121.72/120.73  -b0100(bitIndex1).
% 121.72/120.73  -b0100(bitIndex0).
% 121.72/120.73  all VarCurr (v1601(VarCurr)<->v1602(VarCurr)|v1605(VarCurr)).
% 121.72/120.73  all VarCurr (v1605(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$T)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  -b0011(bitIndex3).
% 121.72/120.73  -b0011(bitIndex2).
% 121.72/120.73  b0011(bitIndex1).
% 121.72/120.73  b0011(bitIndex0).
% 121.72/120.73  all VarCurr (v1602(VarCurr)<->v1603(VarCurr)|v1604(VarCurr)).
% 121.72/120.73  all VarCurr (v1604(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$T)& (v1573(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  -b0010(bitIndex3).
% 121.72/120.73  -b0010(bitIndex2).
% 121.72/120.73  b0010(bitIndex1).
% 121.72/120.73  -b0010(bitIndex0).
% 121.72/120.73  all VarCurr (v1603(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$T)).
% 121.72/120.73  -b0001(bitIndex3).
% 121.72/120.73  -b0001(bitIndex2).
% 121.72/120.73  -b0001(bitIndex1).
% 121.72/120.73  b0001(bitIndex0).
% 121.72/120.73  all VarCurr (-v1416(VarCurr)-> (all B (range_3_0(B)-> (v1598(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1416(VarCurr)-> (all B (range_3_0(B)-> (v1598(VarCurr,B)<->v1579(VarCurr,B))))).
% 121.72/120.73  all VarCurr (v1597(VarCurr)<-> (v1573(VarCurr,bitIndex3)<->$F)& (v1573(VarCurr,bitIndex2)<->$F)& (v1573(VarCurr,bitIndex1)<->$F)& (v1573(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  all B (range_3_0(B)-> (v1573(constB0,B)<->$F)).
% 121.72/120.73  all VarCurr B (range_3_0(B)-> (v1579(VarCurr,B)<->v1581(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_3_0(B)-> (v1581(VarCurr,B)<->v1583(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_3_0(B)-> (v1583(VarCurr,B)<->v1585(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_3_0(B)-> (v1585(VarCurr,B)<->v1587(VarCurr,B))).
% 121.72/120.73  all VarCurr B (range_3_0(B)-> (v1587(VarCurr,B)<->v1589(VarCurr,B))).
% 121.72/120.73  all VarCurr (v1575(VarCurr)<->v1286(VarCurr)).
% 121.72/120.73  all VarCurr (v1416(VarCurr)<->v1418(VarCurr)).
% 121.72/120.73  all VarCurr (v1418(VarCurr)<->v1420(VarCurr)).
% 121.72/120.73  all VarCurr (v1420(VarCurr)<->v1422(VarCurr)).
% 121.72/120.73  all VarCurr (v1422(VarCurr)<->v1424(VarCurr)).
% 121.72/120.73  all VarCurr (v1424(VarCurr)<->v1426(VarCurr)).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1558(VarNext)-> (v1426(VarNext)<->v1426(VarCurr)))).
% 121.72/120.73  all VarNext (v1558(VarNext)-> (v1426(VarNext)<->v1568(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1568(VarNext)<->v1566(VarCurr))).
% 121.72/120.73  all VarCurr (-v1569(VarCurr)-> (v1566(VarCurr)<->v1432(VarCurr))).
% 121.72/120.73  all VarCurr (v1569(VarCurr)-> (v1566(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (-v1569(VarCurr)<->v1428(VarCurr)).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1558(VarNext)<->v1559(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1559(VarNext)<->v1560(VarNext)&v1553(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1560(VarNext)<->v1562(VarNext))).
% 121.72/120.73  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1562(VarNext)<->v1553(VarCurr))).
% 121.72/120.73  all VarCurr (v1553(VarCurr)<->v1555(VarCurr)).
% 121.72/120.73  all VarCurr (v1555(VarCurr)<->v805(VarCurr)).
% 121.72/120.73  all VarCurr (-v1533(VarCurr)-> (v1432(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1533(VarCurr)-> (v1432(VarCurr)<->v1540(VarCurr))).
% 121.72/120.73  all VarCurr (-v1536(VarCurr)& -v1537(VarCurr)-> (v1540(VarCurr)<->v1551(VarCurr))).
% 121.72/120.73  all VarCurr (v1537(VarCurr)-> (v1540(VarCurr)<->$T)).
% 121.72/120.73  all VarCurr (v1536(VarCurr)-> (v1540(VarCurr)<->v1541(VarCurr))).
% 121.72/120.73  all VarCurr (-v1521(VarCurr)-> (v1551(VarCurr)<->$T)).
% 121.72/120.73  all VarCurr (v1521(VarCurr)-> (v1551(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1550(VarCurr)<->v1538(VarCurr)|v1539(VarCurr)).
% 121.72/120.73  all VarCurr (-v1542(VarCurr)-> (v1541(VarCurr)<->$T)).
% 121.72/120.73  all VarCurr (v1542(VarCurr)-> (v1541(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1544(VarCurr)<->v1545(VarCurr)|v1548(VarCurr)).
% 121.72/120.73  all VarCurr (v1548(VarCurr)<->v1436(VarCurr)&v1512(VarCurr)).
% 121.72/120.73  all VarCurr (v1545(VarCurr)<->v1436(VarCurr)&v1547(VarCurr)).
% 121.72/120.73  all VarCurr (-v1547(VarCurr)<->v1512(VarCurr)).
% 121.72/120.73  all VarCurr (-v1542(VarCurr)<->v1436(VarCurr)).
% 121.72/120.73  all VarCurr (v1533(VarCurr)<->v1534(VarCurr)|v1539(VarCurr)).
% 121.72/120.73  all VarCurr (v1539(VarCurr)<-> ($T<->v1434(VarCurr,bitIndex1))).
% 121.72/120.73  all VarCurr (v1534(VarCurr)<->v1535(VarCurr)|v1538(VarCurr)).
% 121.72/120.73  all VarCurr (v1538(VarCurr)<-> ($T<->v1434(VarCurr,bitIndex3))).
% 121.72/120.73  all VarCurr (v1535(VarCurr)<->v1536(VarCurr)|v1537(VarCurr)).
% 121.72/120.73  all VarCurr (v1537(VarCurr)<-> ($T<->v1434(VarCurr,bitIndex2))).
% 121.72/120.73  all VarCurr (v1536(VarCurr)<-> ($T<->v1434(VarCurr,bitIndex0))).
% 121.72/120.73  all VarCurr (v1521(VarCurr)<->v1523(VarCurr)).
% 121.72/120.73  all VarCurr (v1523(VarCurr)<->v1525(VarCurr)).
% 121.72/120.73  all VarCurr (v1525(VarCurr)<->v1527(VarCurr)).
% 121.72/120.73  all VarCurr (v1527(VarCurr)<->v1529(VarCurr)).
% 121.72/120.73  all VarCurr (v1529(VarCurr)<->v1531(VarCurr)).
% 121.72/120.73  all VarCurr (v1512(VarCurr)<->v1514(VarCurr)).
% 121.72/120.73  all VarCurr (v1514(VarCurr)<->v1516(VarCurr)).
% 121.72/120.73  all VarCurr (v1516(VarCurr)<->v1518(VarCurr,bitIndex0)).
% 121.72/120.73  all VarCurr (v1518(VarCurr,bitIndex0)<->v1450(VarCurr,bitIndex49)).
% 121.72/120.73  all VarCurr (v1450(VarCurr,bitIndex49)<->v1452(VarCurr,bitIndex49)).
% 121.72/120.73  all VarCurr (v1452(VarCurr,bitIndex49)<->v1454(VarCurr,bitIndex539)).
% 121.72/120.73  all VarCurr (v1436(VarCurr)<->v1438(VarCurr)).
% 121.72/120.73  all VarCurr (v1438(VarCurr)<->v1440(VarCurr)).
% 121.72/120.73  all VarCurr (-v1505(VarCurr)-> (v1440(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1505(VarCurr)-> (v1440(VarCurr)<->v1510(VarCurr))).
% 121.72/120.73  all VarCurr (-v1499(VarCurr)-> (v1510(VarCurr)<->$F)).
% 121.72/120.73  all VarCurr (v1499(VarCurr)-> (v1510(VarCurr)<->$T)).
% 121.72/120.73  all VarCurr (v1505(VarCurr)<->v1506(VarCurr)&v1509(VarCurr)).
% 121.72/120.73  all VarCurr (v1509(VarCurr)<-> ($T<->v1497(VarCurr,bitIndex0))).
% 121.72/120.73  v1497(constB0,bitIndex0)<->$T.
% 121.72/120.73  all VarCurr (v1506(VarCurr)<->v1507(VarCurr)&v1508(VarCurr)).
% 121.72/120.73  all VarCurr (-v1508(VarCurr)<->v1444(VarCurr)).
% 121.72/120.73  all VarCurr (v1507(VarCurr)<-> (v1442(VarCurr,bitIndex1)<->$T)& (v1442(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  all VarCurr (v1499(VarCurr)<->v1501(VarCurr)).
% 121.72/120.73  all VarCurr (v1501(VarCurr)<->v1503(VarCurr)).
% 121.72/120.73  all VarCurr (v1503(VarCurr)<->v1434(VarCurr,bitIndex0)).
% 121.72/120.73  v1434(constB0,bitIndex3)<->$F.
% 121.72/120.73  v1434(constB0,bitIndex2)<->$F.
% 121.72/120.73  v1434(constB0,bitIndex1)<->$F.
% 121.72/120.73  v1434(constB0,bitIndex0)<->$T.
% 121.72/120.73  all VarCurr (-v1444(VarCurr)-> (all B (range_1_0(B)-> (v1442(VarCurr,B)<->v1468(VarCurr,B))))).
% 121.72/120.73  all VarCurr (v1444(VarCurr)-> (all B (range_1_0(B)-> (v1442(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (-v1469(VarCurr)& -v1489(VarCurr)& -v1490(VarCurr)-> (all B (range_1_0(B)-> (v1468(VarCurr,B)<->$T)))).
% 121.72/120.73  all VarCurr (v1490(VarCurr)-> (all B (range_1_0(B)-> (v1468(VarCurr,B)<->b10(B))))).
% 121.72/120.73  all VarCurr (v1489(VarCurr)-> (all B (range_1_0(B)-> (v1468(VarCurr,B)<->b01(B))))).
% 121.72/120.73  all VarCurr (v1469(VarCurr)-> (all B (range_1_0(B)-> (v1468(VarCurr,B)<->$F)))).
% 121.72/120.73  all VarCurr (v1490(VarCurr)<->v1492(VarCurr)|v1495(VarCurr)).
% 121.72/120.73  all VarCurr (v1495(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$T)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.73  b1001010(bitIndex6).
% 121.72/120.73  -b1001010(bitIndex5).
% 121.72/120.73  -b1001010(bitIndex4).
% 121.72/120.73  b1001010(bitIndex3).
% 121.72/120.73  -b1001010(bitIndex2).
% 121.72/120.73  b1001010(bitIndex1).
% 121.72/120.73  -b1001010(bitIndex0).
% 121.72/120.73  all VarCurr (v1492(VarCurr)<->v1493(VarCurr)|v1494(VarCurr)).
% 121.72/120.73  all VarCurr (v1494(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$T)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$T)).
% 121.72/120.74  -b0001011(bitIndex6).
% 121.72/120.74  -b0001011(bitIndex5).
% 121.72/120.74  -b0001011(bitIndex4).
% 121.72/120.74  b0001011(bitIndex3).
% 121.72/120.74  -b0001011(bitIndex2).
% 121.72/120.74  b0001011(bitIndex1).
% 121.72/120.74  b0001011(bitIndex0).
% 121.72/120.74  all VarCurr (v1493(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$T)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  -b0001010(bitIndex6).
% 121.72/120.74  -b0001010(bitIndex5).
% 121.72/120.74  -b0001010(bitIndex4).
% 121.72/120.74  b0001010(bitIndex3).
% 121.72/120.74  -b0001010(bitIndex2).
% 121.72/120.74  b0001010(bitIndex1).
% 121.72/120.74  -b0001010(bitIndex0).
% 121.72/120.74  all VarCurr (v1489(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$T)& (v1448(VarCurr,bitIndex4)<->$T)& (v1448(VarCurr,bitIndex3)<->$T)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  b1111010(bitIndex6).
% 121.72/120.74  b1111010(bitIndex5).
% 121.72/120.74  b1111010(bitIndex4).
% 121.72/120.74  b1111010(bitIndex3).
% 121.72/120.74  -b1111010(bitIndex2).
% 121.72/120.74  b1111010(bitIndex1).
% 121.72/120.74  -b1111010(bitIndex0).
% 121.72/120.74  all VarCurr (v1469(VarCurr)<->v1471(VarCurr)|v1488(VarCurr)).
% 121.72/120.74  all VarCurr (v1488(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$T)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$T)).
% 121.72/120.74  b1000101(bitIndex6).
% 121.72/120.74  -b1000101(bitIndex5).
% 121.72/120.74  -b1000101(bitIndex4).
% 121.72/120.74  -b1000101(bitIndex3).
% 121.72/120.74  b1000101(bitIndex2).
% 121.72/120.74  -b1000101(bitIndex1).
% 121.72/120.74  b1000101(bitIndex0).
% 121.72/120.74  all VarCurr (v1471(VarCurr)<->v1472(VarCurr)|v1487(VarCurr)).
% 121.72/120.74  all VarCurr (v1487(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$T)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  b1000100(bitIndex6).
% 121.72/120.74  -b1000100(bitIndex5).
% 121.72/120.74  -b1000100(bitIndex4).
% 121.72/120.74  -b1000100(bitIndex3).
% 121.72/120.74  b1000100(bitIndex2).
% 121.72/120.74  -b1000100(bitIndex1).
% 121.72/120.74  -b1000100(bitIndex0).
% 121.72/120.74  all VarCurr (v1472(VarCurr)<->v1473(VarCurr)|v1486(VarCurr)).
% 121.72/120.74  all VarCurr (v1486(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$T)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  b1100000(bitIndex6).
% 121.72/120.74  b1100000(bitIndex5).
% 121.72/120.74  -b1100000(bitIndex4).
% 121.72/120.74  -b1100000(bitIndex3).
% 121.72/120.74  -b1100000(bitIndex2).
% 121.72/120.74  -b1100000(bitIndex1).
% 121.72/120.74  -b1100000(bitIndex0).
% 121.72/120.74  all VarCurr (v1473(VarCurr)<->v1474(VarCurr)|v1485(VarCurr)).
% 121.72/120.74  all VarCurr (v1485(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  b1000000(bitIndex6).
% 121.72/120.74  -b1000000(bitIndex5).
% 121.72/120.74  -b1000000(bitIndex4).
% 121.72/120.74  -b1000000(bitIndex3).
% 121.72/120.74  -b1000000(bitIndex2).
% 121.72/120.74  -b1000000(bitIndex1).
% 121.72/120.74  -b1000000(bitIndex0).
% 121.72/120.74  all VarCurr (v1474(VarCurr)<->v1475(VarCurr)|v1484(VarCurr)).
% 121.72/120.74  all VarCurr (v1484(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$T)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  b1000010(bitIndex6).
% 121.72/120.74  -b1000010(bitIndex5).
% 121.72/120.74  -b1000010(bitIndex4).
% 121.72/120.74  -b1000010(bitIndex3).
% 121.72/120.74  -b1000010(bitIndex2).
% 121.72/120.74  b1000010(bitIndex1).
% 121.72/120.74  -b1000010(bitIndex0).
% 121.72/120.74  all VarCurr (v1475(VarCurr)<->v1476(VarCurr)|v1483(VarCurr)).
% 121.72/120.74  all VarCurr (v1483(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$T)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$T)).
% 121.72/120.74  -b0000101(bitIndex6).
% 121.72/120.74  -b0000101(bitIndex5).
% 121.72/120.74  -b0000101(bitIndex4).
% 121.72/120.74  -b0000101(bitIndex3).
% 121.72/120.74  b0000101(bitIndex2).
% 121.72/120.74  -b0000101(bitIndex1).
% 121.72/120.74  b0000101(bitIndex0).
% 121.72/120.74  all VarCurr (v1476(VarCurr)<->v1477(VarCurr)|v1482(VarCurr)).
% 121.72/120.74  all VarCurr (v1482(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$T)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  -b0000100(bitIndex6).
% 121.72/120.74  -b0000100(bitIndex5).
% 121.72/120.74  -b0000100(bitIndex4).
% 121.72/120.74  -b0000100(bitIndex3).
% 121.72/120.74  b0000100(bitIndex2).
% 121.72/120.74  -b0000100(bitIndex1).
% 121.72/120.74  -b0000100(bitIndex0).
% 121.72/120.74  all VarCurr (v1477(VarCurr)<->v1478(VarCurr)|v1481(VarCurr)).
% 121.72/120.74  all VarCurr (v1481(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$T)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  -b0000010(bitIndex6).
% 121.72/120.74  -b0000010(bitIndex5).
% 121.72/120.74  -b0000010(bitIndex4).
% 121.72/120.74  -b0000010(bitIndex3).
% 121.72/120.74  -b0000010(bitIndex2).
% 121.72/120.74  b0000010(bitIndex1).
% 121.72/120.74  -b0000010(bitIndex0).
% 121.72/120.74  all VarCurr (v1478(VarCurr)<->v1479(VarCurr)|v1480(VarCurr)).
% 121.72/120.74  all VarCurr (v1480(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$T)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  -b0100000(bitIndex6).
% 121.72/120.74  b0100000(bitIndex5).
% 121.72/120.74  -b0100000(bitIndex4).
% 121.72/120.74  -b0100000(bitIndex3).
% 121.72/120.74  -b0100000(bitIndex2).
% 121.72/120.74  -b0100000(bitIndex1).
% 121.72/120.74  -b0100000(bitIndex0).
% 121.72/120.74  all VarCurr (v1479(VarCurr)<-> (v1448(VarCurr,bitIndex6)<->$F)& (v1448(VarCurr,bitIndex5)<->$F)& (v1448(VarCurr,bitIndex4)<->$F)& (v1448(VarCurr,bitIndex3)<->$F)& (v1448(VarCurr,bitIndex2)<->$F)& (v1448(VarCurr,bitIndex1)<->$F)& (v1448(VarCurr,bitIndex0)<->$F)).
% 121.72/120.74  -b0000000(bitIndex6).
% 121.72/120.74  -b0000000(bitIndex5).
% 121.72/120.74  -b0000000(bitIndex4).
% 121.72/120.74  -b0000000(bitIndex3).
% 121.72/120.74  -b0000000(bitIndex2).
% 121.72/120.74  -b0000000(bitIndex1).
% 121.72/120.74  -b0000000(bitIndex0).
% 121.72/120.74  all VarCurr ((v1448(VarCurr,bitIndex6)<->v1450(VarCurr,bitIndex69))& (v1448(VarCurr,bitIndex5)<->v1450(VarCurr,bitIndex68))& (v1448(VarCurr,bitIndex4)<->v1450(VarCurr,bitIndex67))& (v1448(VarCurr,bitIndex3)<->v1450(VarCurr,bitIndex66))& (v1448(VarCurr,bitIndex2)<->v1450(VarCurr,bitIndex65))& (v1448(VarCurr,bitIndex1)<->v1450(VarCurr,bitIndex64))& (v1448(VarCurr,bitIndex0)<->v1450(VarCurr,bitIndex63))).
% 121.72/120.74  all VarCurr B (range_69_63(B)-> (v1450(VarCurr,B)<->v1452(VarCurr,B))).
% 121.72/120.74  all B (range_69_63(B)<->bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B|bitIndex67=B|bitIndex68=B|bitIndex69=B).
% 121.72/120.74  all VarCurr ((v1452(VarCurr,bitIndex69)<->v1454(VarCurr,bitIndex559))& (v1452(VarCurr,bitIndex68)<->v1454(VarCurr,bitIndex558))& (v1452(VarCurr,bitIndex67)<->v1454(VarCurr,bitIndex557))& (v1452(VarCurr,bitIndex66)<->v1454(VarCurr,bitIndex556))& (v1452(VarCurr,bitIndex65)<->v1454(VarCurr,bitIndex555))& (v1452(VarCurr,bitIndex64)<->v1454(VarCurr,bitIndex554))& (v1452(VarCurr,bitIndex63)<->v1454(VarCurr,bitIndex553))).
% 121.72/120.74  -v1454(constB0,bitIndex559).
% 121.72/120.74  -v1454(constB0,bitIndex558).
% 121.72/120.74  -v1454(constB0,bitIndex557).
% 121.72/120.74  -v1454(constB0,bitIndex556).
% 121.72/120.74  -v1454(constB0,bitIndex555).
% 121.72/120.74  -v1454(constB0,bitIndex554).
% 121.72/120.74  -v1454(constB0,bitIndex553).
% 121.72/120.74  -v1454(constB0,bitIndex539).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex69).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex68).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex67).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex66).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex65).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex64).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex63).
% 121.72/120.74  -b0000000xxxxxxxxxxxxx0xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx(bitIndex49).
% 121.72/120.74  all VarCurr (v1444(VarCurr)<->v1446(VarCurr)).
% 121.72/120.74  all VarCurr (v1428(VarCurr)<->v1430(VarCurr)).
% 121.72/120.74  all VarCurr (v1430(VarCurr)<->v12(VarCurr)).
% 121.72/120.74  all VarCurr (v1389(VarCurr)<->v1391(VarCurr)).
% 121.72/120.74  all VarCurr (v1391(VarCurr)<->v1393(VarCurr)).
% 121.72/120.74  all VarCurr (-v1396(VarCurr)-> (v1393(VarCurr)<->$F)).
% 121.72/120.74  all VarCurr (v1396(VarCurr)-> (v1393(VarCurr)<->v1407(VarCurr))).
% 121.72/120.74  all VarCurr (-v1398(VarCurr)-> (v1407(VarCurr)<->$T)).
% 121.72/120.74  all VarCurr (v1398(VarCurr)-> (v1407(VarCurr)<->$F)).
% 121.72/120.74  all VarCurr (v1408(VarCurr)<->v1401(VarCurr)|v1404(VarCurr)).
% 121.72/120.75  all VarCurr (v1396(VarCurr)<->v1397(VarCurr)|v1404(VarCurr)).
% 121.72/120.75  all VarCurr (v1404(VarCurr)<->v1405(VarCurr)&v1406(VarCurr)).
% 121.72/120.75  all VarCurr (v1406(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$T)& (v1290(VarCurr,bitIndex2)<->$T)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  b1101(bitIndex3).
% 121.72/120.75  b1101(bitIndex2).
% 121.72/120.75  -b1101(bitIndex1).
% 121.72/120.75  b1101(bitIndex0).
% 121.72/120.75  all VarCurr (-v1405(VarCurr)<->v1379(VarCurr)).
% 121.72/120.75  all VarCurr (v1397(VarCurr)<->v1398(VarCurr)|v1401(VarCurr)).
% 121.72/120.75  all VarCurr (v1401(VarCurr)<->v1402(VarCurr)&v1403(VarCurr)).
% 121.72/120.75  all VarCurr (v1403(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$T)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (-v1402(VarCurr)<->v1379(VarCurr)).
% 121.72/120.75  all VarCurr (v1398(VarCurr)<->v1399(VarCurr)&v1400(VarCurr)).
% 121.72/120.75  all VarCurr (v1400(VarCurr)<-> (v1290(VarCurr,bitIndex3)<->$F)& (v1290(VarCurr,bitIndex2)<->$F)& (v1290(VarCurr,bitIndex1)<->$F)& (v1290(VarCurr,bitIndex0)<->$F)).
% 121.72/120.75  all B (range_3_0(B)-> (v1290(constB0,B)<->$F)).
% 121.72/120.75  all VarCurr (-v1399(VarCurr)<->v1294(VarCurr)).
% 121.72/120.75  all VarCurr (v1387(VarCurr)<->v1286(VarCurr)).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1364(VarCurr,B)<->v1366(VarCurr,B))).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1366(VarCurr,B)<->v1368(VarCurr,B))).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1368(VarCurr,B)<->v1370(VarCurr,B))).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1370(VarCurr,B)<->v1372(VarCurr,B))).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1372(VarCurr,B)<->v1374(VarCurr,B))).
% 121.72/120.75  all VarCurr B (range_3_0(B)-> (v1374(VarCurr,B)<->b0101(B))).
% 121.72/120.75  -b0101(bitIndex3).
% 121.72/120.75  b0101(bitIndex2).
% 121.72/120.75  -b0101(bitIndex1).
% 121.72/120.75  b0101(bitIndex0).
% 121.72/120.75  all VarCurr (v1294(VarCurr)<->v1296(VarCurr)).
% 121.72/120.75  all VarCurr (v1296(VarCurr)<->v1298(VarCurr)).
% 121.72/120.75  all VarCurr (v1298(VarCurr)<->v1300(VarCurr)).
% 121.72/120.75  all VarCurr (v1300(VarCurr)<->v1302(VarCurr)).
% 121.72/120.75  all VarCurr (v1302(VarCurr)<->v1304(VarCurr)).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1351(VarNext)-> (v1304(VarNext)<->v1304(VarCurr)))).
% 121.72/120.75  all VarNext (v1351(VarNext)-> (v1304(VarNext)<->v1359(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1359(VarNext)<->v1357(VarCurr))).
% 121.72/120.75  all VarCurr (-v1360(VarCurr)-> (v1357(VarCurr)<->v1306(VarCurr))).
% 121.72/120.75  all VarCurr (v1360(VarCurr)-> (v1357(VarCurr)<->$F)).
% 121.72/120.75  all VarCurr (-v1360(VarCurr)<->v8(VarCurr)).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1351(VarNext)<->v1352(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1352(VarNext)<->v1353(VarNext)&v1252(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1353(VarNext)<->v1259(VarNext))).
% 121.72/120.75  all VarCurr (-v1311(VarCurr)-> (v1306(VarCurr)<->$F)).
% 121.72/120.75  all VarCurr (v1311(VarCurr)-> (v1306(VarCurr)<->$T)).
% 121.72/120.75  all VarCurr (v1311(VarCurr)<->v1312(VarCurr)|v1345(VarCurr)).
% 121.72/120.75  all VarCurr (v1345(VarCurr)<->v1346(VarCurr)&v1348(VarCurr)).
% 121.72/120.75  all VarCurr (v1348(VarCurr)<-> ($T<->v6(VarCurr,bitIndex4))).
% 121.72/120.75  all VarCurr (v1346(VarCurr)<-> (v1347(VarCurr,bitIndex1)<->$T)& (v1347(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1347(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.72/120.75  all VarCurr (v1347(VarCurr,bitIndex1)<->v1193(VarCurr)).
% 121.72/120.75  all VarCurr (v1312(VarCurr)<->v1313(VarCurr)|v1334(VarCurr)).
% 121.72/120.75  all VarCurr (v1334(VarCurr)<->v1335(VarCurr)&v1344(VarCurr)).
% 121.72/120.75  all VarCurr (v1344(VarCurr)<-> ($T<->v6(VarCurr,bitIndex3))).
% 121.72/120.75  all VarCurr (v1335(VarCurr)<->v1336(VarCurr)|v1341(VarCurr)).
% 121.72/120.75  all VarCurr (v1341(VarCurr)<->v1308(VarCurr)&v1342(VarCurr)).
% 121.72/120.75  all VarCurr (v1342(VarCurr)<-> (v1343(VarCurr,bitIndex1)<->$T)& (v1343(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1343(VarCurr,bitIndex0)<->v1193(VarCurr)).
% 121.72/120.75  all VarCurr (v1343(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1336(VarCurr)<->v1337(VarCurr)|v1339(VarCurr)).
% 121.72/120.75  all VarCurr (v1339(VarCurr)<-> (v1340(VarCurr,bitIndex1)<->$F)& (v1340(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1340(VarCurr,bitIndex0)<->v1193(VarCurr)).
% 121.72/120.75  all VarCurr (v1340(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1337(VarCurr)<-> (v1338(VarCurr,bitIndex1)<->$F)& (v1338(VarCurr,bitIndex0)<->$F)).
% 121.72/120.75  all VarCurr (v1338(VarCurr,bitIndex0)<->v1193(VarCurr)).
% 121.72/120.75  all VarCurr (v1338(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1313(VarCurr)<->v1314(VarCurr)|v1324(VarCurr)).
% 121.72/120.75  all VarCurr (v1324(VarCurr)<->v1325(VarCurr)&v1333(VarCurr)).
% 121.72/120.75  all VarCurr (v1333(VarCurr)<-> ($T<->v6(VarCurr,bitIndex2))).
% 121.72/120.75  all VarCurr (v1325(VarCurr)<->v1326(VarCurr)|v1331(VarCurr)).
% 121.72/120.75  all VarCurr (v1331(VarCurr)<-> (v1332(VarCurr,bitIndex1)<->$T)& (v1332(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1332(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.72/120.75  all VarCurr (v1332(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1326(VarCurr)<->v1327(VarCurr)|v1329(VarCurr)).
% 121.72/120.75  all VarCurr (v1329(VarCurr)<-> (v1330(VarCurr,bitIndex1)<->$F)& (v1330(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1330(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.72/120.75  all VarCurr (v1330(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1327(VarCurr)<-> (v1328(VarCurr,bitIndex1)<->$F)& (v1328(VarCurr,bitIndex0)<->$F)).
% 121.72/120.75  all VarCurr (v1328(VarCurr,bitIndex0)<->v1308(VarCurr)).
% 121.72/120.75  all VarCurr (v1328(VarCurr,bitIndex1)<->v1272(VarCurr)).
% 121.72/120.75  all VarCurr (v1314(VarCurr)<->v1315(VarCurr)|v1323(VarCurr)).
% 121.72/120.75  all VarCurr (v1323(VarCurr)<-> ($T<->v6(VarCurr,bitIndex1))).
% 121.72/120.75  all VarCurr (v1315(VarCurr)<->v1316(VarCurr)&v1322(VarCurr)).
% 121.72/120.75  all VarCurr (v1322(VarCurr)<-> ($T<->v6(VarCurr,bitIndex0))).
% 121.72/120.75  v6(constB0,bitIndex4)<->$F.
% 121.72/120.75  v6(constB0,bitIndex3)<->$F.
% 121.72/120.75  v6(constB0,bitIndex2)<->$F.
% 121.72/120.75  v6(constB0,bitIndex1)<->$F.
% 121.72/120.75  v6(constB0,bitIndex0)<->$T.
% 121.72/120.75  all VarCurr (v1316(VarCurr)<->v1317(VarCurr)&v1320(VarCurr)).
% 121.72/120.75  all VarCurr (v1320(VarCurr)<->v1321(VarCurr)&v1193(VarCurr)).
% 121.72/120.75  all VarCurr (-v1321(VarCurr)<->v23(VarCurr)).
% 121.72/120.75  all VarCurr (v1317(VarCurr)<->v1318(VarCurr)|v1319(VarCurr)).
% 121.72/120.75  all VarCurr (v1319(VarCurr)<-> (v21(VarCurr,bitIndex1)<->$T)& (v21(VarCurr,bitIndex0)<->$F)).
% 121.72/120.75  all VarCurr (v1318(VarCurr)<-> (v21(VarCurr,bitIndex1)<->$F)& (v21(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1308(VarCurr)<-> -(v21(VarCurr,bitIndex1)<->v21(VarCurr,bitIndex0))).
% 121.72/120.75  all VarCurr (v1284(VarCurr)<->v1286(VarCurr)).
% 121.72/120.75  all VarCurr (v1286(VarCurr)<->v14(VarCurr)).
% 121.72/120.75  all VarCurr (v1193(VarCurr)<->v1245(VarCurr)|v1195(VarCurr,bitIndex2)).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1255(VarNext)-> (all B (range_2_0(B)-> (v1195(VarNext,B)<->v1195(VarCurr,B)))))).
% 121.72/120.75  all VarNext (v1255(VarNext)-> (all B (range_2_0(B)-> (v1195(VarNext,B)<->v1265(VarNext,B))))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v1265(VarNext,B)<->v1263(VarCurr,B))))).
% 121.72/120.75  all VarCurr (-v1266(VarCurr)-> (all B (range_2_0(B)-> (v1263(VarCurr,B)<->v1198(VarCurr,B))))).
% 121.72/120.75  all VarCurr (v1266(VarCurr)-> (all B (range_2_0(B)-> (v1263(VarCurr,B)<->b100(B))))).
% 121.72/120.75  all VarCurr (-v1266(VarCurr)<->v8(VarCurr)).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1255(VarNext)<->v1256(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1256(VarNext)<->v1257(VarNext)&v1252(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1257(VarNext)<->v1259(VarNext))).
% 121.72/120.75  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1259(VarNext)<->v1252(VarCurr))).
% 121.72/120.75  all VarCurr (v1252(VarCurr)<->v803(VarCurr)).
% 121.72/120.75  all VarCurr (-v1218(VarCurr)& -v1234(VarCurr)-> (all B (range_2_0(B)-> (v1198(VarCurr,B)<->v1195(VarCurr,B))))).
% 121.72/120.75  all VarCurr (v1234(VarCurr)-> (all B (range_2_0(B)-> (v1198(VarCurr,B)<->v1236(VarCurr,B))))).
% 121.72/120.75  all VarCurr (v1218(VarCurr)-> (all B (range_2_0(B)-> (v1198(VarCurr,B)<->v1220(VarCurr,B))))).
% 121.72/120.75  all VarCurr (v1246(VarCurr)<->v1247(VarCurr)|v1249(VarCurr)).
% 121.72/120.75  all VarCurr (v1249(VarCurr)<-> (v1250(VarCurr,bitIndex1)<->$T)& (v1250(VarCurr,bitIndex0)<->$T)).
% 121.72/120.75  all VarCurr (v1250(VarCurr,bitIndex0)<->v1191(VarCurr)).
% 121.72/120.75  all VarCurr (v1250(VarCurr,bitIndex1)<->v1200(VarCurr)).
% 121.72/120.75  all VarCurr (v1247(VarCurr)<-> (v1248(VarCurr,bitIndex1)<->$F)& (v1248(VarCurr,bitIndex0)<->$F)).
% 121.72/120.75  all VarCurr (v1248(VarCurr,bitIndex0)<->v1191(VarCurr)).
% 121.72/120.75  all VarCurr (v1248(VarCurr,bitIndex1)<->v1200(VarCurr)).
% 121.72/120.75  all VarCurr (v1236(VarCurr,bitIndex0)<->v1232(VarCurr)).
% 121.72/120.75  all VarCurr (v1236(VarCurr,bitIndex1)<->v1243(VarCurr)).
% 121.72/120.75  all VarCurr (v1236(VarCurr,bitIndex2)<->v1238(VarCurr)).
% 121.72/120.75  all VarCurr (v1243(VarCurr)<->v1244(VarCurr)&v1245(VarCurr)).
% 121.72/120.75  all VarCurr (v1245(VarCurr)<->v1195(VarCurr,bitIndex0)|v1195(VarCurr,bitIndex1)).
% 121.72/120.75  all VarCurr (v1244(VarCurr)<->v1232(VarCurr)|v1227(VarCurr)).
% 121.72/120.76  all VarCurr (v1238(VarCurr)<->v1239(VarCurr)&v1242(VarCurr)).
% 121.72/120.76  all VarCurr (v1242(VarCurr)<->v1195(VarCurr,bitIndex2)|v1241(VarCurr)).
% 121.72/120.76  all VarCurr (v1239(VarCurr)<->v1229(VarCurr)|v1240(VarCurr)).
% 121.72/120.76  all VarCurr (-v1240(VarCurr)<->v1241(VarCurr)).
% 121.72/120.76  all VarCurr (v1241(VarCurr)<->v1195(VarCurr,bitIndex0)&v1195(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (v1234(VarCurr)<-> (v1235(VarCurr,bitIndex1)<->$T)& (v1235(VarCurr,bitIndex0)<->$F)).
% 121.72/120.76  all VarCurr (v1235(VarCurr,bitIndex0)<->v1191(VarCurr)).
% 121.72/120.76  all VarCurr (v1235(VarCurr,bitIndex1)<->v1200(VarCurr)).
% 121.72/120.76  all VarCurr (v1220(VarCurr,bitIndex0)<->v1232(VarCurr)).
% 121.72/120.76  all VarCurr (v1220(VarCurr,bitIndex1)<->v1230(VarCurr)).
% 121.72/120.76  all VarCurr (v1220(VarCurr,bitIndex2)<->v1222(VarCurr)).
% 121.72/120.76  all VarCurr (v1230(VarCurr)<->v1231(VarCurr)&v1233(VarCurr)).
% 121.72/120.76  all VarCurr (v1233(VarCurr)<->v1195(VarCurr,bitIndex0)|v1227(VarCurr)).
% 121.72/120.76  all VarCurr (v1231(VarCurr)<->v1232(VarCurr)|v1195(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (-v1232(VarCurr)<->v1195(VarCurr,bitIndex0)).
% 121.72/120.76  all VarCurr (v1222(VarCurr)<->v1223(VarCurr)&v1228(VarCurr)).
% 121.72/120.76  all VarCurr (v1228(VarCurr)<->v1225(VarCurr)|v1229(VarCurr)).
% 121.72/120.76  all VarCurr (-v1229(VarCurr)<->v1195(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (v1223(VarCurr)<->v1224(VarCurr)|v1195(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (-v1224(VarCurr)<->v1225(VarCurr)).
% 121.72/120.76  all VarCurr (v1225(VarCurr)<->v1195(VarCurr,bitIndex1)|v1226(VarCurr)).
% 121.72/120.76  all VarCurr (v1226(VarCurr)<->v1195(VarCurr,bitIndex0)&v1227(VarCurr)).
% 121.72/120.76  all VarCurr (-v1227(VarCurr)<->v1195(VarCurr,bitIndex1)).
% 121.72/120.76  v1195(constB0,bitIndex2).
% 121.72/120.76  -v1195(constB0,bitIndex1).
% 121.72/120.76  -v1195(constB0,bitIndex0).
% 121.72/120.76  b100(bitIndex2).
% 121.72/120.76  -b100(bitIndex1).
% 121.72/120.76  -b100(bitIndex0).
% 121.72/120.76  all VarCurr (v1218(VarCurr)<-> (v1219(VarCurr,bitIndex1)<->$F)& (v1219(VarCurr,bitIndex0)<->$T)).
% 121.72/120.76  all VarCurr (v1219(VarCurr,bitIndex0)<->v1191(VarCurr)).
% 121.72/120.76  all VarCurr (v1219(VarCurr,bitIndex1)<->v1200(VarCurr)).
% 121.72/120.76  all VarCurr (v1200(VarCurr)<->v1202(VarCurr)).
% 121.72/120.76  all VarCurr (v1202(VarCurr)<->v1204(VarCurr)).
% 121.72/120.76  all VarCurr (v1204(VarCurr)<->v1206(VarCurr)).
% 121.72/120.76  all VarCurr (-v1209(VarCurr)-> (v1206(VarCurr)<->$F)).
% 121.72/120.76  all VarCurr (v1209(VarCurr)-> (v1206(VarCurr)<->v1216(VarCurr))).
% 121.72/120.76  all VarCurr (-v1210(VarCurr)-> (v1216(VarCurr)<->$F)).
% 121.72/120.76  all VarCurr (v1210(VarCurr)-> (v1216(VarCurr)<->$T)).
% 121.72/120.76  all VarCurr (v1209(VarCurr)<->v1210(VarCurr)|v1212(VarCurr)).
% 121.72/120.76  all VarCurr (-v1212(VarCurr)<->v1213(VarCurr)).
% 121.72/120.76  all VarCurr (v1213(VarCurr)<->v1210(VarCurr)|v1214(VarCurr)).
% 121.72/120.76  all VarCurr (v1214(VarCurr)<-> (v1215(VarCurr,bitIndex2)<->$F)& (v1215(VarCurr,bitIndex1)<->$T)& (v1215(VarCurr,bitIndex0)<->$F)).
% 121.72/120.76  -b010(bitIndex2).
% 121.72/120.76  b010(bitIndex1).
% 121.72/120.76  -b010(bitIndex0).
% 121.72/120.76  all VarCurr (v1215(VarCurr,bitIndex0)<->v1173(VarCurr)).
% 121.72/120.76  all VarCurr (v1215(VarCurr,bitIndex1)<->v747(VarCurr)).
% 121.72/120.76  all VarCurr (v1215(VarCurr,bitIndex2)<->v681(VarCurr)).
% 121.72/120.76  all VarCurr (v1210(VarCurr)<-> (v1211(VarCurr,bitIndex2)<->$F)& (v1211(VarCurr,bitIndex1)<->$F)& (v1211(VarCurr,bitIndex0)<->$T)).
% 121.72/120.76  -b001(bitIndex2).
% 121.72/120.76  -b001(bitIndex1).
% 121.72/120.76  b001(bitIndex0).
% 121.72/120.76  all VarCurr (v1211(VarCurr,bitIndex0)<->v1173(VarCurr)).
% 121.72/120.76  all VarCurr (v1211(VarCurr,bitIndex1)<->v747(VarCurr)).
% 121.72/120.76  all VarCurr (v1211(VarCurr,bitIndex2)<->v681(VarCurr)).
% 121.72/120.76  all VarCurr (v1183(VarCurr)<->v743(VarCurr)).
% 121.72/120.76  all VarCurr (v747(VarCurr)<->v1170(VarCurr)&v1171(VarCurr)).
% 121.72/120.76  all VarCurr (-v1171(VarCurr)<->v1080(VarCurr)).
% 121.72/120.76  all VarCurr (v1170(VarCurr)<-> (v749(VarCurr,bitIndex7)<->v823(VarCurr,bitIndex7))& (v749(VarCurr,bitIndex6)<->v823(VarCurr,bitIndex6))& (v749(VarCurr,bitIndex5)<->v823(VarCurr,bitIndex5))& (v749(VarCurr,bitIndex4)<->v823(VarCurr,bitIndex4))& (v749(VarCurr,bitIndex3)<->v823(VarCurr,bitIndex3))& (v749(VarCurr,bitIndex2)<->v823(VarCurr,bitIndex2))& (v749(VarCurr,bitIndex1)<->v823(VarCurr,bitIndex1))& (v749(VarCurr,bitIndex0)<->v823(VarCurr,bitIndex0))).
% 121.72/120.76  all VarCurr (v1080(VarCurr)<->v1082(VarCurr)).
% 121.72/120.76  all VarCurr (v1082(VarCurr)<-> (v1084(VarCurr,bitIndex3)<->$F)& (v1084(VarCurr,bitIndex2)<->$F)& (v1084(VarCurr,bitIndex1)<->$F)& (v1084(VarCurr,bitIndex0)<->$F)).
% 121.72/120.76  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1156(VarNext)-> (all B (range_3_0(B)-> (v1084(VarNext,B)<->v1084(VarCurr,B)))))).
% 121.72/120.76  all VarNext (v1156(VarNext)-> (all B (range_3_0(B)-> (v1084(VarNext,B)<->v1164(VarNext,B))))).
% 121.72/120.76  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1164(VarNext,B)<->v1162(VarCurr,B))))).
% 121.72/120.76  all VarCurr (-v1165(VarCurr)-> (all B (range_3_0(B)-> (v1162(VarCurr,B)<->v1086(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1165(VarCurr)-> (all B (range_3_0(B)-> (v1162(VarCurr,B)<->$F)))).
% 121.72/120.76  all VarCurr (-v1165(VarCurr)<->v834(VarCurr)).
% 121.72/120.76  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1156(VarNext)<->v1157(VarNext))).
% 121.72/120.76  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1157(VarNext)<->v1158(VarNext)&v831(VarNext))).
% 121.72/120.76  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1158(VarNext)<->v1004(VarNext))).
% 121.72/120.76  all VarCurr (-v1091(VarCurr)& -v1093(VarCurr)& -v1129(VarCurr)-> (all B (range_3_0(B)-> (v1086(VarCurr,B)<->v1084(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1129(VarCurr)-> (all B (range_3_0(B)-> (v1086(VarCurr,B)<->v1131(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1093(VarCurr)-> (all B (range_3_0(B)-> (v1086(VarCurr,B)<->v1095(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1091(VarCurr)-> (all B (range_3_0(B)-> (v1086(VarCurr,B)<->v1084(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1152(VarCurr)<-> (v1153(VarCurr,bitIndex1)<->$T)& (v1153(VarCurr,bitIndex0)<->$T)).
% 121.72/120.76  all VarCurr (v1153(VarCurr,bitIndex0)<->v1049(VarCurr)).
% 121.72/120.76  all VarCurr (v1153(VarCurr,bitIndex1)<->v944(VarCurr)).
% 121.72/120.76  all VarCurr (-v1132(VarCurr)-> (all B (range_3_0(B)-> (v1131(VarCurr,B)<->v1133(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1132(VarCurr)-> (all B (range_3_0(B)-> (v1131(VarCurr,B)<->b1000(B))))).
% 121.72/120.76  all VarCurr (v1133(VarCurr,bitIndex0)<->v1149(VarCurr)).
% 121.72/120.76  all VarCurr (v1133(VarCurr,bitIndex1)<->v1147(VarCurr)).
% 121.72/120.76  all VarCurr (v1133(VarCurr,bitIndex2)<->v1142(VarCurr)).
% 121.72/120.76  all VarCurr (v1133(VarCurr,bitIndex3)<->v1135(VarCurr)).
% 121.72/120.76  all VarCurr (v1147(VarCurr)<->v1148(VarCurr)&v1151(VarCurr)).
% 121.72/120.76  all VarCurr (v1151(VarCurr)<->v1084(VarCurr,bitIndex0)|v1084(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (v1148(VarCurr)<->v1149(VarCurr)|v1150(VarCurr)).
% 121.72/120.76  all VarCurr (-v1150(VarCurr)<->v1084(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (-v1149(VarCurr)<->v1084(VarCurr,bitIndex0)).
% 121.72/120.76  all VarCurr (v1142(VarCurr)<->v1143(VarCurr)&v1146(VarCurr)).
% 121.72/120.76  all VarCurr (v1146(VarCurr)<->v1139(VarCurr)|v1084(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (v1143(VarCurr)<->v1144(VarCurr)|v1145(VarCurr)).
% 121.72/120.76  all VarCurr (-v1145(VarCurr)<->v1084(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (-v1144(VarCurr)<->v1139(VarCurr)).
% 121.72/120.76  all VarCurr (v1135(VarCurr)<->v1136(VarCurr)&v1141(VarCurr)).
% 121.72/120.76  all VarCurr (v1141(VarCurr)<->v1138(VarCurr)|v1084(VarCurr,bitIndex3)).
% 121.72/120.76  all VarCurr (v1136(VarCurr)<->v1137(VarCurr)|v1140(VarCurr)).
% 121.72/120.76  all VarCurr (-v1140(VarCurr)<->v1084(VarCurr,bitIndex3)).
% 121.72/120.76  all VarCurr (-v1137(VarCurr)<->v1138(VarCurr)).
% 121.72/120.76  all VarCurr (v1138(VarCurr)<->v1139(VarCurr)&v1084(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (v1139(VarCurr)<->v1084(VarCurr,bitIndex0)&v1084(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (v1132(VarCurr)<-> (v1084(VarCurr,bitIndex3)<->$T)& (v1084(VarCurr,bitIndex2)<->$F)& (v1084(VarCurr,bitIndex1)<->$F)& (v1084(VarCurr,bitIndex0)<->$F)).
% 121.72/120.76  b1000(bitIndex3).
% 121.72/120.76  -b1000(bitIndex2).
% 121.72/120.76  -b1000(bitIndex1).
% 121.72/120.76  -b1000(bitIndex0).
% 121.72/120.76  all VarCurr (v1129(VarCurr)<-> (v1130(VarCurr,bitIndex1)<->$T)& (v1130(VarCurr,bitIndex0)<->$F)).
% 121.72/120.76  all VarCurr (v1130(VarCurr,bitIndex0)<->v1049(VarCurr)).
% 121.72/120.76  all VarCurr (v1130(VarCurr,bitIndex1)<->v944(VarCurr)).
% 121.72/120.76  all VarCurr (-v1096(VarCurr)-> (all B (range_31_0(B)-> (v1095(VarCurr,B)<->v1097(VarCurr,B))))).
% 121.72/120.76  all VarCurr (v1096(VarCurr)-> (all B (range_31_0(B)-> (v1095(VarCurr,B)<->$F)))).
% 121.72/120.76  all B (range_31_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex31).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex30).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex29).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex28).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex27).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex26).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex25).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex24).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex23).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex22).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex21).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex20).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex19).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex18).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex17).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex16).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex15).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex14).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex13).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex12).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex11).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex10).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex9).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex8).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex7).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex6).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex5).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex4).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex3).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex2).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex1).
% 121.72/120.76  -b00000000000000000000000000000000(bitIndex0).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex5)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex6)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex7)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex8)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex9)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex10)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex11)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex12)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex13)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex14)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex15)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex16)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex17)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex18)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex19)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex20)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex21)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex22)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex23)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex24)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex25)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex26)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex27)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex28)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex29)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex30)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr (v1097(VarCurr,bitIndex31)<->v1098(VarCurr,bitIndex4)).
% 121.72/120.76  all VarCurr B (range_4_0(B)-> (v1097(VarCurr,B)<->v1098(VarCurr,B))).
% 121.72/120.76  all VarCurr (v1098(VarCurr,bitIndex0)<->v1126(VarCurr)).
% 121.72/120.76  all VarCurr (v1098(VarCurr,bitIndex1)<->v1124(VarCurr)).
% 121.72/120.76  all VarCurr (v1098(VarCurr,bitIndex2)<->v1120(VarCurr)).
% 121.72/120.76  all VarCurr (v1098(VarCurr,bitIndex3)<->v1116(VarCurr)).
% 121.72/120.76  all VarCurr (v1098(VarCurr,bitIndex4)<->v1100(VarCurr)).
% 121.72/120.76  all VarCurr (v1124(VarCurr)<->v1125(VarCurr)&v1127(VarCurr)).
% 121.72/120.76  all VarCurr (v1127(VarCurr)<->v1104(VarCurr,bitIndex0)|v1111(VarCurr)).
% 121.72/120.76  all VarCurr (v1125(VarCurr)<->v1126(VarCurr)|v1104(VarCurr,bitIndex1)).
% 121.72/120.76  all VarCurr (-v1126(VarCurr)<->v1104(VarCurr,bitIndex0)).
% 121.72/120.76  all VarCurr (v1120(VarCurr)<->v1121(VarCurr)&v1123(VarCurr)).
% 121.72/120.76  all VarCurr (v1123(VarCurr)<->v1109(VarCurr)|v1112(VarCurr)).
% 121.72/120.76  all VarCurr (v1121(VarCurr)<->v1122(VarCurr)|v1104(VarCurr,bitIndex2)).
% 121.72/120.76  all VarCurr (-v1122(VarCurr)<->v1109(VarCurr)).
% 121.72/120.76  all VarCurr (v1116(VarCurr)<->v1117(VarCurr)&v1119(VarCurr)).
% 121.72/120.76  all VarCurr (v1119(VarCurr)<->v1107(VarCurr)|v1113(VarCurr)).
% 121.72/120.77  all VarCurr (v1117(VarCurr)<->v1118(VarCurr)|v1104(VarCurr,bitIndex3)).
% 121.72/120.77  all VarCurr (-v1118(VarCurr)<->v1107(VarCurr)).
% 121.72/120.77  all VarCurr (v1100(VarCurr)<->v1101(VarCurr)&v1114(VarCurr)).
% 121.72/120.77  all VarCurr (v1114(VarCurr)<->v1103(VarCurr)|v1115(VarCurr)).
% 121.72/120.77  all VarCurr (-v1115(VarCurr)<->v1104(VarCurr,bitIndex4)).
% 121.72/120.77  all VarCurr (v1101(VarCurr)<->v1102(VarCurr)|v1104(VarCurr,bitIndex4)).
% 121.72/120.77  all VarCurr (-v1102(VarCurr)<->v1103(VarCurr)).
% 121.72/120.77  all VarCurr (v1103(VarCurr)<->v1104(VarCurr,bitIndex3)|v1106(VarCurr)).
% 121.72/120.77  all VarCurr (v1106(VarCurr)<->v1107(VarCurr)&v1113(VarCurr)).
% 121.72/120.77  all VarCurr (-v1113(VarCurr)<->v1104(VarCurr,bitIndex3)).
% 121.72/120.77  all VarCurr (v1107(VarCurr)<->v1104(VarCurr,bitIndex2)|v1108(VarCurr)).
% 121.72/120.77  all VarCurr (v1108(VarCurr)<->v1109(VarCurr)&v1112(VarCurr)).
% 121.72/120.77  all VarCurr (-v1112(VarCurr)<->v1104(VarCurr,bitIndex2)).
% 121.72/120.77  all VarCurr (v1109(VarCurr)<->v1104(VarCurr,bitIndex1)|v1110(VarCurr)).
% 121.72/120.77  all VarCurr (v1110(VarCurr)<->v1104(VarCurr,bitIndex0)&v1111(VarCurr)).
% 121.72/120.77  all VarCurr (-v1111(VarCurr)<->v1104(VarCurr,bitIndex1)).
% 121.72/120.77  all VarCurr (-v1104(VarCurr,bitIndex4)).
% 121.72/120.77  all VarCurr B (range_3_0(B)-> (v1104(VarCurr,B)<->v1084(VarCurr,B))).
% 121.72/120.77  all VarCurr (v1096(VarCurr)<-> (v1084(VarCurr,bitIndex3)<->$F)& (v1084(VarCurr,bitIndex2)<->$F)& (v1084(VarCurr,bitIndex1)<->$F)& (v1084(VarCurr,bitIndex0)<->$F)).
% 121.72/120.77  all VarCurr (v1093(VarCurr)<-> (v1094(VarCurr,bitIndex1)<->$F)& (v1094(VarCurr,bitIndex0)<->$T)).
% 121.72/120.77  all VarCurr (v1094(VarCurr,bitIndex0)<->v1049(VarCurr)).
% 121.72/120.77  all VarCurr (v1094(VarCurr,bitIndex1)<->v944(VarCurr)).
% 121.72/120.77  all B (range_3_0(B)-> (v1084(constB0,B)<->$F)).
% 121.72/120.77  all VarCurr (v1091(VarCurr)<-> (v1092(VarCurr,bitIndex1)<->$F)& (v1092(VarCurr,bitIndex0)<->$F)).
% 121.72/120.77  all VarCurr (v1092(VarCurr,bitIndex0)<->v1049(VarCurr)).
% 121.72/120.77  all VarCurr (v1092(VarCurr,bitIndex1)<->v944(VarCurr)).
% 121.72/120.77  all VarCurr B (range_7_0(B)-> (v823(VarCurr,B)<->v825(VarCurr,B))).
% 121.72/120.77  all VarCurr B (range_7_0(B)-> (v825(VarCurr,B)<->v827(VarCurr,B))).
% 121.72/120.77  all VarCurr B (range_7_0(B)-> (v827(VarCurr,B)<->v1043(VarCurr,B))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1068(VarNext)-> (all B (range_2_0(B)-> (v1045(VarNext,B)<->v1045(VarCurr,B)))))).
% 121.72/120.77  all VarNext (v1068(VarNext)-> (all B (range_2_0(B)-> (v1045(VarNext,B)<->v1076(VarNext,B))))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v1076(VarNext,B)<->v1074(VarCurr,B))))).
% 121.72/120.77  all VarCurr (-v1011(VarCurr)-> (all B (range_2_0(B)-> (v1074(VarCurr,B)<->v1047(VarCurr,B))))).
% 121.72/120.77  all VarCurr (v1011(VarCurr)-> (all B (range_2_0(B)-> (v1074(VarCurr,B)<->$F)))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1068(VarNext)<->v1069(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1069(VarNext)<->v1071(VarNext)&v831(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1071(VarNext)<->v1004(VarNext))).
% 121.72/120.77  all VarCurr (-v1049(VarCurr)-> (all B (range_2_0(B)-> (v1047(VarCurr,B)<->v1045(VarCurr,B))))).
% 121.72/120.77  all VarCurr (v1049(VarCurr)-> (all B (range_2_0(B)-> (v1047(VarCurr,B)<->v1051(VarCurr,B))))).
% 121.72/120.77  all VarCurr (-v1052(VarCurr)-> (all B (range_2_0(B)-> (v1051(VarCurr,B)<->v1053(VarCurr,B))))).
% 121.72/120.77  all VarCurr (v1052(VarCurr)-> (all B (range_2_0(B)-> (v1051(VarCurr,B)<->$F)))).
% 121.72/120.77  all VarCurr (v1053(VarCurr,bitIndex0)<->v1063(VarCurr)).
% 121.72/120.77  all VarCurr (v1053(VarCurr,bitIndex1)<->v1061(VarCurr)).
% 121.72/120.77  all VarCurr (v1053(VarCurr,bitIndex2)<->v1055(VarCurr)).
% 121.72/120.77  all VarCurr (v1061(VarCurr)<->v1062(VarCurr)&v1065(VarCurr)).
% 121.72/120.77  all VarCurr (v1065(VarCurr)<->v1045(VarCurr,bitIndex0)|v1045(VarCurr,bitIndex1)).
% 121.72/120.77  all VarCurr (v1062(VarCurr)<->v1063(VarCurr)|v1064(VarCurr)).
% 121.72/120.77  all VarCurr (-v1064(VarCurr)<->v1045(VarCurr,bitIndex1)).
% 121.72/120.77  all VarCurr (-v1063(VarCurr)<->v1045(VarCurr,bitIndex0)).
% 121.72/120.77  all VarCurr (v1055(VarCurr)<->v1056(VarCurr)&v1060(VarCurr)).
% 121.72/120.77  all VarCurr (v1060(VarCurr)<->v1058(VarCurr)|v1045(VarCurr,bitIndex2)).
% 121.72/120.77  all VarCurr (v1056(VarCurr)<->v1057(VarCurr)|v1059(VarCurr)).
% 121.72/120.77  all VarCurr (-v1059(VarCurr)<->v1045(VarCurr,bitIndex2)).
% 121.72/120.77  all VarCurr (-v1057(VarCurr)<->v1058(VarCurr)).
% 121.72/120.77  all VarCurr (v1058(VarCurr)<->v1045(VarCurr,bitIndex0)&v1045(VarCurr,bitIndex1)).
% 121.72/120.77  all VarCurr (v1052(VarCurr)<-> (v1045(VarCurr,bitIndex2)<->$T)& (v1045(VarCurr,bitIndex1)<->$T)& (v1045(VarCurr,bitIndex0)<->$T)).
% 121.72/120.77  all VarCurr (v1049(VarCurr)<->v677(VarCurr)).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v1045_range_2_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (address(A)-> (all B (A=AssociatedAddressVar-> (range_130_0(B)-> (v1043(VarNext,B)<->v829_array(VarNext,A,B)))))))))).
% 121.72/120.77  all B (range_2_0(B)-> (v1045(constB0,B)<->$F)).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (-v1035(VarNext)-> (all B (range_130_0(B)-> (v829_array(VarNext,A,B)<->v829_1__array(VarNext,A,B))))))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (v1035(VarNext)-> (all B (range_130_0(B)-> (v829_array(VarNext,A,B)<->b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(B))))))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1035(VarNext)<->v1036(VarNext)&v1041(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1041(VarNext)<->v1032(VarCurr))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1036(VarNext)<->v1038(VarNext)&v831(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1038(VarNext)<->v1004(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v977_range_2_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (-(A=AssociatedAddressVar&v1023(VarNext))-> (all B (range_130_0(B)-> (v829_1__array(VarNext,A,B)<->v829_array(VarCurr,A,B))))))))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v977_range_2_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (A=AssociatedAddressVar&v1023(VarNext)-> (all B (range_130_0(B)-> (v829_1__array(VarNext,A,B)<->v836(VarNext,B))))))))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1023(VarNext)<->v1024(VarNext)&v1030(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1030(VarNext)<->v1028(VarCurr))).
% 121.72/120.77  all VarCurr (v1028(VarCurr)<->v1031(VarCurr)&v944(VarCurr)).
% 121.72/120.77  all VarCurr (-v1031(VarCurr)<->v1032(VarCurr)).
% 121.72/120.77  all VarCurr (-v1032(VarCurr)<->v834(VarCurr)).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1024(VarNext)<->v1025(VarNext)&v831(VarNext))).
% 121.72/120.77  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1025(VarNext)<->v1004(VarNext))).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex0).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex1).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex2).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex3).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex4).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex5).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex6).
% 121.72/120.77  -v829_array(constB0,b111_address_term,bitIndex7).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex0).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex1).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex2).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex3).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex4).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex5).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex6).
% 121.72/120.77  -v829_array(constB0,b110_address_term,bitIndex7).
% 121.72/120.77  b110(bitIndex2).
% 121.72/120.77  b110(bitIndex1).
% 121.72/120.77  -b110(bitIndex0).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex0).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex1).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex2).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex3).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex4).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex5).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex6).
% 121.72/120.77  -v829_array(constB0,b101_address_term,bitIndex7).
% 121.72/120.77  b101(bitIndex2).
% 121.72/120.77  -b101(bitIndex1).
% 121.72/120.77  b101(bitIndex0).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex0).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex1).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex2).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex3).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex4).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex5).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex6).
% 121.72/120.77  -v829_array(constB0,b100_address_term,bitIndex7).
% 121.72/120.77  -v829_array(constB0,b011_address_term,bitIndex0).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex1).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex2).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex3).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex4).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex5).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex6).
% 121.72/120.78  -v829_array(constB0,b011_address_term,bitIndex7).
% 121.72/120.78  -b011(bitIndex2).
% 121.72/120.78  b011(bitIndex1).
% 121.72/120.78  b011(bitIndex0).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex0).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex1).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex2).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex3).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex4).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex5).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex6).
% 121.72/120.78  -v829_array(constB0,b010_address_term,bitIndex7).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex0).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex1).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex2).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex3).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex4).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex5).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex6).
% 121.72/120.78  -v829_array(constB0,b001_address_term,bitIndex7).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex0).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex1).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex2).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex3).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex4).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex5).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex6).
% 121.72/120.78  -v829_array(constB0,b000_address_term,bitIndex7).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1000(VarNext)-> (all B (range_2_0(B)-> (v977(VarNext,B)<->v977(VarCurr,B)))))).
% 121.72/120.78  all VarNext (v1000(VarNext)-> (all B (range_2_0(B)-> (v977(VarNext,B)<->v1010(VarNext,B))))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v1010(VarNext,B)<->v1008(VarCurr,B))))).
% 121.72/120.78  all VarCurr (-v1011(VarCurr)-> (all B (range_2_0(B)-> (v1008(VarCurr,B)<->v979(VarCurr,B))))).
% 121.72/120.78  all VarCurr (v1011(VarCurr)-> (all B (range_2_0(B)-> (v1008(VarCurr,B)<->$F)))).
% 121.72/120.78  all VarCurr (-v1011(VarCurr)<->v834(VarCurr)).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1000(VarNext)<->v1001(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1001(VarNext)<->v1002(VarNext)&v831(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1002(VarNext)<->v1004(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1004(VarNext)<->v831(VarCurr))).
% 121.72/120.78  all VarCurr (-v944(VarCurr)-> (all B (range_2_0(B)-> (v979(VarCurr,B)<->v977(VarCurr,B))))).
% 121.72/120.78  all VarCurr (v944(VarCurr)-> (all B (range_2_0(B)-> (v979(VarCurr,B)<->v983(VarCurr,B))))).
% 121.72/120.78  all VarCurr (-v984(VarCurr)-> (all B (range_2_0(B)-> (v983(VarCurr,B)<->v985(VarCurr,B))))).
% 121.72/120.78  all VarCurr (v984(VarCurr)-> (all B (range_2_0(B)-> (v983(VarCurr,B)<->$F)))).
% 121.72/120.78  all VarCurr (v985(VarCurr,bitIndex0)<->v995(VarCurr)).
% 121.72/120.78  all VarCurr (v985(VarCurr,bitIndex1)<->v993(VarCurr)).
% 121.72/120.78  all VarCurr (v985(VarCurr,bitIndex2)<->v987(VarCurr)).
% 121.72/120.78  all VarCurr (v993(VarCurr)<->v994(VarCurr)&v997(VarCurr)).
% 121.72/120.78  all VarCurr (v997(VarCurr)<->v977(VarCurr,bitIndex0)|v977(VarCurr,bitIndex1)).
% 121.72/120.78  all VarCurr (v994(VarCurr)<->v995(VarCurr)|v996(VarCurr)).
% 121.72/120.78  all VarCurr (-v996(VarCurr)<->v977(VarCurr,bitIndex1)).
% 121.72/120.78  all VarCurr (-v995(VarCurr)<->v977(VarCurr,bitIndex0)).
% 121.72/120.78  all VarCurr (v987(VarCurr)<->v988(VarCurr)&v992(VarCurr)).
% 121.72/120.78  all VarCurr (v992(VarCurr)<->v990(VarCurr)|v977(VarCurr,bitIndex2)).
% 121.72/120.78  all VarCurr (v988(VarCurr)<->v989(VarCurr)|v991(VarCurr)).
% 121.72/120.78  all VarCurr (-v991(VarCurr)<->v977(VarCurr,bitIndex2)).
% 121.72/120.78  all VarCurr (-v989(VarCurr)<->v990(VarCurr)).
% 121.72/120.78  all VarCurr (v990(VarCurr)<->v977(VarCurr,bitIndex0)&v977(VarCurr,bitIndex1)).
% 121.72/120.78  all VarCurr (v984(VarCurr)<-> (v977(VarCurr,bitIndex2)<->$T)& (v977(VarCurr,bitIndex1)<->$T)& (v977(VarCurr,bitIndex0)<->$T)).
% 121.72/120.78  b111(bitIndex2).
% 121.72/120.78  b111(bitIndex1).
% 121.72/120.78  b111(bitIndex0).
% 121.72/120.78  all B (range_2_0(B)-> (v977(constB0,B)<->$F)).
% 121.72/120.78  all B (range_2_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B).
% 121.72/120.78  -b000(bitIndex2).
% 121.72/120.78  -b000(bitIndex1).
% 121.72/120.78  -b000(bitIndex0).
% 121.72/120.78  all VarCurr (v944(VarCurr)<->v946(VarCurr)).
% 121.72/120.78  all VarCurr (v946(VarCurr)<->v948(VarCurr)).
% 121.72/120.78  all VarCurr (v948(VarCurr)<->v950(VarCurr)).
% 121.72/120.78  all VarCurr (v950(VarCurr)<->v952(VarCurr)).
% 121.72/120.78  all VarCurr (v952(VarCurr)<->v954(VarCurr)).
% 121.72/120.78  all VarCurr (v954(VarCurr)<->v956(VarCurr)).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v965(VarNext)-> (v956(VarNext)<->v956(VarCurr)))).
% 121.72/120.78  all VarNext (v965(VarNext)-> (v956(VarNext)<->v973(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v973(VarNext)<->v971(VarCurr))).
% 121.72/120.78  all VarCurr (-v939(VarCurr)-> (v971(VarCurr)<->v958(VarCurr))).
% 121.72/120.78  all VarCurr (v939(VarCurr)-> (v971(VarCurr)<->$F)).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v965(VarNext)<->v966(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v966(VarNext)<->v968(VarNext)&v925(VarNext))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v968(VarNext)<->v932(VarNext))).
% 121.72/120.78  all VarCurr (v958(VarCurr)<->v960(VarCurr)).
% 121.72/120.78  all VarCurr (v960(VarCurr)<->v962(VarCurr)).
% 121.72/120.78  v962(constB0)<->$F.
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v836(VarCurr,B)<->v838(VarCurr,B))).
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v838(VarCurr,B)<->v840(VarCurr,B))).
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v840(VarCurr,B)<->v842(VarCurr,B))).
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v842(VarCurr,B)<->v844(VarCurr,B))).
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v844(VarCurr,B)<->v846(VarCurr,B))).
% 121.72/120.78  all VarCurr B (range_130_0(B)-> (v846(VarCurr,B)<->v848(VarCurr,B))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v928(VarNext)-> (all B (range_130_0(B)-> (v848(VarNext,B)<->v848(VarCurr,B)))))).
% 121.72/120.78  all VarNext (v928(VarNext)-> (all B (range_130_0(B)-> (v848(VarNext,B)<->v938(VarNext,B))))).
% 121.72/120.78  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_130_0(B)-> (v938(VarNext,B)<->v936(VarCurr,B))))).
% 121.72/120.78  all VarCurr (-v939(VarCurr)-> (all B (range_130_0(B)-> (v936(VarCurr,B)<->v940(VarCurr,B))))).
% 121.72/120.78  all VarCurr (v939(VarCurr)-> (all B (range_130_0(B)-> (v936(VarCurr,B)<->$F)))).
% 121.72/120.78  all B (range_130_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B|bitIndex67=B|bitIndex68=B|bitIndex69=B|bitIndex70=B|bitIndex71=B|bitIndex72=B|bitIndex73=B|bitIndex74=B|bitIndex75=B|bitIndex76=B|bitIndex77=B|bitIndex78=B|bitIndex79=B|bitIndex80=B|bitIndex81=B|bitIndex82=B|bitIndex83=B|bitIndex84=B|bitIndex85=B|bitIndex86=B|bitIndex87=B|bitIndex88=B|bitIndex89=B|bitIndex90=B|bitIndex91=B|bitIndex92=B|bitIndex93=B|bitIndex94=B|bitIndex95=B|bitIndex96=B|bitIndex97=B|bitIndex98=B|bitIndex99=B|bitIndex100=B|bitIndex101=B|bitIndex102=B|bitIndex103=B|bitIndex104=B|bitIndex105=B|bitIndex106=B|bitIndex107=B|bitIndex108=B|bitIndex109=B|bitIndex110=B|bitIndex111=B|bitIndex112=B|bitIndex113=B|bitIndex114=B|bitIndex115=B|bitIndex116=B|bitIndex117=B|bitIndex118=B|bitIndex119=B|bitIndex120=B|bitIndex121=B|bitIndex122=B|bitIndex123=B|bitIndex124=B|bitIndex125=B|bitIndex126=B|bitIndex127=B|bitIndex128=B|bitIndex129=B|bitIndex130=B).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex130).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex129).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex128).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex127).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex126).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex125).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex124).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex123).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex122).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex121).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex120).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex119).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex118).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex117).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex116).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex115).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex114).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex113).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex112).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex111).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex110).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex109).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex108).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex107).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex106).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex105).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex104).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex103).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex102).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex101).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex100).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex99).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex98).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex97).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex96).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex95).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex94).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex93).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex92).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex91).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex90).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex89).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex88).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex87).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex86).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex85).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex84).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex83).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex82).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex81).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex80).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex79).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex78).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex77).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex76).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex75).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex74).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex73).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex72).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex71).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex70).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex69).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex68).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex67).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex66).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex65).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex64).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 121.72/120.78  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 121.83/120.79  -b00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 121.83/120.79  all VarCurr B (range_14_0(B)-> (v940(VarCurr,B)<->v870(VarCurr,B))).
% 121.83/120.79  all VarCurr ((v940(VarCurr,bitIndex76)<->v897(VarCurr,bitIndex61))& (v940(VarCurr,bitIndex75)<->v897(VarCurr,bitIndex60))& (v940(VarCurr,bitIndex74)<->v897(VarCurr,bitIndex59))& (v940(VarCurr,bitIndex73)<->v897(VarCurr,bitIndex58))& (v940(VarCurr,bitIndex72)<->v897(VarCurr,bitIndex57))& (v940(VarCurr,bitIndex71)<->v897(VarCurr,bitIndex56))& (v940(VarCurr,bitIndex70)<->v897(VarCurr,bitIndex55))& (v940(VarCurr,bitIndex69)<->v897(VarCurr,bitIndex54))& (v940(VarCurr,bitIndex68)<->v897(VarCurr,bitIndex53))& (v940(VarCurr,bitIndex67)<->v897(VarCurr,bitIndex52))& (v940(VarCurr,bitIndex66)<->v897(VarCurr,bitIndex51))& (v940(VarCurr,bitIndex65)<->v897(VarCurr,bitIndex50))& (v940(VarCurr,bitIndex64)<->v897(VarCurr,bitIndex49))& (v940(VarCurr,bitIndex63)<->v897(VarCurr,bitIndex48))& (v940(VarCurr,bitIndex62)<->v897(VarCurr,bitIndex47))& (v940(VarCurr,bitIndex61)<->v897(VarCurr,bitIndex46))& (v940(VarCurr,bitIndex60)<->v897(VarCurr,bitIndex45))& (v940(VarCurr,bitIndex59)<->v897(VarCurr,bitIndex44))& (v940(VarCurr,bitIndex58)<->v897(VarCurr,bitIndex43))& (v940(VarCurr,bitIndex57)<->v897(VarCurr,bitIndex42))& (v940(VarCurr,bitIndex56)<->v897(VarCurr,bitIndex41))& (v940(VarCurr,bitIndex55)<->v897(VarCurr,bitIndex40))& (v940(VarCurr,bitIndex54)<->v897(VarCurr,bitIndex39))& (v940(VarCurr,bitIndex53)<->v897(VarCurr,bitIndex38))& (v940(VarCurr,bitIndex52)<->v897(VarCurr,bitIndex37))& (v940(VarCurr,bitIndex51)<->v897(VarCurr,bitIndex36))& (v940(VarCurr,bitIndex50)<->v897(VarCurr,bitIndex35))& (v940(VarCurr,bitIndex49)<->v897(VarCurr,bitIndex34))& (v940(VarCurr,bitIndex48)<->v897(VarCurr,bitIndex33))& (v940(VarCurr,bitIndex47)<->v897(VarCurr,bitIndex32))& (v940(VarCurr,bitIndex46)<->v897(VarCurr,bitIndex31))& (v940(VarCurr,bitIndex45)<->v897(VarCurr,bitIndex30))& (v940(VarCurr,bitIndex44)<->v897(VarCurr,bitIndex29))& (v940(VarCurr,bitIndex43)<->v897(VarCurr,bitIndex28))& (v940(VarCurr,bitIndex42)<->v897(VarCurr,bitIndex27))& 
% 121.84/120.80  (v940(VarCurr,bitIndex41)<->v897(VarCurr,bitIndex26))& (v940(VarCurr,bitIndex40)<->v897(VarCurr,bitIndex25))& (v940(VarCurr,bitIndex39)<->v897(VarCurr,bitIndex24))& (v940(VarCurr,bitIndex38)<->v897(VarCurr,bitIndex23))& (v940(VarCurr,bitIndex37)<->v897(VarCurr,bitIndex22))& (v940(VarCurr,bitIndex36)<->v897(VarCurr,bitIndex21))& (v940(VarCurr,bitIndex35)<->v897(VarCurr,bitIndex20))& (v940(VarCurr,bitIndex34)<->v897(VarCurr,bitIndex19))& (v940(VarCurr,bitIndex33)<->v897(VarCurr,bitIndex18))& (v940(VarCurr,bitIndex32)<->v897(VarCurr,bitIndex17))& (v940(VarCurr,bitIndex31)<->v897(VarCurr,bitIndex16))& (v940(VarCurr,bitIndex30)<->v897(VarCurr,bitIndex15))& (v940(VarCurr,bitIndex29)<->v897(VarCurr,bitIndex14))& (v940(VarCurr,bitIndex28)<->v897(VarCurr,bitIndex13))& (v940(VarCurr,bitIndex27)<->v897(VarCurr,bitIndex12))& (v940(VarCurr,bitIndex26)<->v897(VarCurr,bitIndex11))& (v940(VarCurr,bitIndex25)<->v897(VarCurr,bitIndex10))& (v940(VarCurr,bitIndex24)<->v897(VarCurr,bitIndex9))& (v940(VarCurr,bitIndex23)<->v897(VarCurr,bitIndex8))& (v940(VarCurr,bitIndex22)<->v897(VarCurr,bitIndex7))& (v940(VarCurr,bitIndex21)<->v897(VarCurr,bitIndex6))& (v940(VarCurr,bitIndex20)<->v897(VarCurr,bitIndex5))& (v940(VarCurr,bitIndex19)<->v897(VarCurr,bitIndex4))& (v940(VarCurr,bitIndex18)<->v897(VarCurr,bitIndex3))& (v940(VarCurr,bitIndex17)<->v897(VarCurr,bitIndex2))& (v940(VarCurr,bitIndex16)<->v897(VarCurr,bitIndex1))& (v940(VarCurr,bitIndex15)<->v897(VarCurr,bitIndex0))).
% 121.84/120.80  all VarCurr B (range_123_77(B)-> (v940(VarCurr,B)<->v870(VarCurr,B))).
% 121.84/120.80  all VarCurr ((v940(VarCurr,bitIndex130)<->v852(VarCurr,bitIndex6))& (v940(VarCurr,bitIndex129)<->v852(VarCurr,bitIndex5))& (v940(VarCurr,bitIndex128)<->v852(VarCurr,bitIndex4))& (v940(VarCurr,bitIndex127)<->v852(VarCurr,bitIndex3))& (v940(VarCurr,bitIndex126)<->v852(VarCurr,bitIndex2))& (v940(VarCurr,bitIndex125)<->v852(VarCurr,bitIndex1))& (v940(VarCurr,bitIndex124)<->v852(VarCurr,bitIndex0))).
% 121.84/120.80  all VarCurr (-v939(VarCurr)<->v850(VarCurr)).
% 121.84/120.80  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v928(VarNext)<->v929(VarNext))).
% 121.84/120.80  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v929(VarNext)<->v930(VarNext)&v925(VarNext))).
% 121.84/120.80  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v930(VarNext)<->v932(VarNext))).
% 121.84/120.80  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v932(VarNext)<->v925(VarCurr))).
% 121.84/120.80  all VarCurr (v925(VarCurr)<->v641(VarCurr)).
% 121.84/120.80  all VarCurr B (range_14_0(B)-> (v870(VarCurr,B)<->v872(VarCurr,B))).
% 121.84/120.80  all VarCurr B (range_14_0(B)-> (v872(VarCurr,B)<->v874(VarCurr,B))).
% 121.84/120.80  all B (range_14_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B).
% 121.84/120.80  all VarCurr (-v911(VarCurr)-> (v897(VarCurr,bitIndex61)<->v913(VarCurr))).
% 121.84/120.80  all VarCurr (v911(VarCurr)-> (v897(VarCurr,bitIndex61)<->$T)).
% 121.84/120.80  all VarCurr (-v914(VarCurr)-> (v913(VarCurr)<->v918(VarCurr))).
% 121.84/120.80  all VarCurr (v914(VarCurr)-> (v913(VarCurr)<->$T)).
% 121.84/120.80  all VarCurr (-v919(VarCurr)-> (v918(VarCurr)<->v923(VarCurr))).
% 121.84/120.80  all VarCurr (v919(VarCurr)-> (v918(VarCurr)<->v904(VarCurr,bitIndex61))).
% 121.84/120.80  all VarCurr (-v854(VarCurr)-> (v923(VarCurr)<->v904(VarCurr,bitIndex61))).
% 121.84/120.80  all VarCurr (v854(VarCurr)-> (v923(VarCurr)<->v889(VarCurr,bitIndex61))).
% 121.84/120.80  all VarCurr (v919(VarCurr)<->v920(VarCurr)&v922(VarCurr)).
% 121.84/120.80  all VarCurr (v922(VarCurr)<-> (v889(VarCurr,bitIndex61)<->$F)).
% 121.84/120.80  all VarCurr (v920(VarCurr)<->v921(VarCurr)&v866(VarCurr)).
% 121.84/120.80  all VarCurr (v921(VarCurr)<->v899(VarCurr)&v854(VarCurr)).
% 121.84/120.80  all VarCurr (v914(VarCurr)<->v915(VarCurr)&v917(VarCurr)).
% 121.84/120.80  all VarCurr (v917(VarCurr)<-> (v889(VarCurr,bitIndex61)<->$T)).
% 121.84/120.80  all VarCurr (v915(VarCurr)<->v916(VarCurr)&v866(VarCurr)).
% 121.84/120.80  all VarCurr (v916(VarCurr)<->v899(VarCurr)&v854(VarCurr)).
% 121.84/120.80  all VarCurr (v911(VarCurr)<->v912(VarCurr)&v882(VarCurr)).
% 121.84/120.80  all VarCurr (v912(VarCurr)<->v899(VarCurr)&v854(VarCurr)).
% 121.84/120.80  all VarCurr (-v906(VarCurr)-> (all B (range_60_0(B)-> (v897(VarCurr,B)<->v909(VarCurr,B))))).
% 121.84/120.80  all VarCurr (v906(VarCurr)-> (all B (range_60_0(B)-> (v897(VarCurr,B)<->v908(VarCurr,B))))).
% 121.84/120.80  all VarCurr (-v854(VarCurr)-> (all B (range_60_0(B)-> (v909(VarCurr,B)<->v904(VarCurr,B))))).
% 121.84/120.80  all VarCurr (v854(VarCurr)-> (all B (range_60_0(B)-> (v909(VarCurr,B)<->v889(VarCurr,B))))).
% 121.84/120.80  all B (range_60_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B).
% 121.84/120.80  all VarCurr B (range_36_0(B)-> (v908(VarCurr,B)<->v889(VarCurr,B))).
% 121.84/120.80  all VarCurr ((v908(VarCurr,bitIndex60)<->$T)& (v908(VarCurr,bitIndex59)<->$T)& (v908(VarCurr,bitIndex58)<->$T)& (v908(VarCurr,bitIndex57)<->$T)& (v908(VarCurr,bitIndex56)<->$T)& (v908(VarCurr,bitIndex55)<->$T)& (v908(VarCurr,bitIndex54)<->$T)& (v908(VarCurr,bitIndex53)<->$T)& (v908(VarCurr,bitIndex52)<->$T)& (v908(VarCurr,bitIndex51)<->$T)& (v908(VarCurr,bitIndex50)<->$T)& (v908(VarCurr,bitIndex49)<->$T)& (v908(VarCurr,bitIndex48)<->$T)& (v908(VarCurr,bitIndex47)<->$F)& (v908(VarCurr,bitIndex46)<->$F)& (v908(VarCurr,bitIndex45)<->$F)& (v908(VarCurr,bitIndex44)<->$F)& (v908(VarCurr,bitIndex43)<->$F)& (v908(VarCurr,bitIndex42)<->$F)& (v908(VarCurr,bitIndex41)<->$F)& (v908(VarCurr,bitIndex40)<->$F)& (v908(VarCurr,bitIndex39)<->$F)& (v908(VarCurr,bitIndex38)<->$F)& (v908(VarCurr,bitIndex37)<->$F)).
% 121.84/120.80  b111111111111100000000000(bitIndex23).
% 121.84/120.80  b111111111111100000000000(bitIndex22).
% 121.84/120.80  b111111111111100000000000(bitIndex21).
% 121.84/120.80  b111111111111100000000000(bitIndex20).
% 121.84/120.80  b111111111111100000000000(bitIndex19).
% 121.84/120.80  b111111111111100000000000(bitIndex18).
% 121.84/120.80  b111111111111100000000000(bitIndex17).
% 121.84/120.80  b111111111111100000000000(bitIndex16).
% 121.84/120.80  b111111111111100000000000(bitIndex15).
% 121.84/120.80  b111111111111100000000000(bitIndex14).
% 121.84/120.80  b111111111111100000000000(bitIndex13).
% 121.84/120.80  b111111111111100000000000(bitIndex12).
% 121.84/120.80  b111111111111100000000000(bitIndex11).
% 121.84/120.80  -b111111111111100000000000(bitIndex10).
% 121.84/120.80  -b111111111111100000000000(bitIndex9).
% 121.84/120.80  -b111111111111100000000000(bitIndex8).
% 121.84/120.80  -b111111111111100000000000(bitIndex7).
% 121.84/120.80  -b111111111111100000000000(bitIndex6).
% 121.84/120.80  -b111111111111100000000000(bitIndex5).
% 121.84/120.80  -b111111111111100000000000(bitIndex4).
% 121.84/120.80  -b111111111111100000000000(bitIndex3).
% 121.84/120.80  -b111111111111100000000000(bitIndex2).
% 121.84/120.80  -b111111111111100000000000(bitIndex1).
% 121.84/120.80  -b111111111111100000000000(bitIndex0).
% 121.84/120.80  all VarCurr (v906(VarCurr)<->v907(VarCurr)&v882(VarCurr)).
% 121.84/120.80  all VarCurr (v907(VarCurr)<->v899(VarCurr)&v854(VarCurr)).
% 121.84/120.80  all VarCurr (v904(VarCurr,bitIndex61)<->v870(VarCurr,bitIndex76)).
% 121.84/120.80  all VarCurr (v870(VarCurr,bitIndex76)<->v872(VarCurr,bitIndex76)).
% 121.84/120.80  all VarCurr (v872(VarCurr,bitIndex76)<->v874(VarCurr,bitIndex76)).
% 121.84/120.80  all VarCurr ((v904(VarCurr,bitIndex60)<->v870(VarCurr,bitIndex75))& (v904(VarCurr,bitIndex59)<->v870(VarCurr,bitIndex74))& (v904(VarCurr,bitIndex58)<->v870(VarCurr,bitIndex73))& (v904(VarCurr,bitIndex57)<->v870(VarCurr,bitIndex72))& (v904(VarCurr,bitIndex56)<->v870(VarCurr,bitIndex71))& (v904(VarCurr,bitIndex55)<->v870(VarCurr,bitIndex70))& (v904(VarCurr,bitIndex54)<->v870(VarCurr,bitIndex69))& (v904(VarCurr,bitIndex53)<->v870(VarCurr,bitIndex68))& (v904(VarCurr,bitIndex52)<->v870(VarCurr,bitIndex67))& (v904(VarCurr,bitIndex51)<->v870(VarCurr,bitIndex66))& (v904(VarCurr,bitIndex50)<->v870(VarCurr,bitIndex65))& (v904(VarCurr,bitIndex49)<->v870(VarCurr,bitIndex64))& (v904(VarCurr,bitIndex48)<->v870(VarCurr,bitIndex63))& (v904(VarCurr,bitIndex47)<->v870(VarCurr,bitIndex62))& (v904(VarCurr,bitIndex46)<->v870(VarCurr,bitIndex61))& (v904(VarCurr,bitIndex45)<->v870(VarCurr,bitIndex60))& (v904(VarCurr,bitIndex44)<->v870(VarCurr,bitIndex59))& (v904(VarCurr,bitIndex43)<->v870(VarCurr,bitIndex58))& (v904(VarCurr,bitIndex42)<->v870(VarCurr,bitIndex57))& (v904(VarCurr,bitIndex41)<->v870(VarCurr,bitIndex56))& (v904(VarCurr,bitIndex40)<->v870(VarCurr,bitIndex55))& (v904(VarCurr,bitIndex39)<->v870(VarCurr,bitIndex54))& (v904(VarCurr,bitIndex38)<->v870(VarCurr,bitIndex53))& (v904(VarCurr,bitIndex37)<->v870(VarCurr,bitIndex52))& (v904(VarCurr,bitIndex36)<->v870(VarCurr,bitIndex51))& (v904(VarCurr,bitIndex35)<->v870(VarCurr,bitIndex50))& (v904(VarCurr,bitIndex34)<->v870(VarCurr,bitIndex49))& (v904(VarCurr,bitIndex33)<->v870(VarCurr,bitIndex48))& (v904(VarCurr,bitIndex32)<->v870(VarCurr,bitIndex47))& (v904(VarCurr,bitIndex31)<->v870(VarCurr,bitIndex46))& (v904(VarCurr,bitIndex30)<->v870(VarCurr,bitIndex45))& (v904(VarCurr,bitIndex29)<->v870(VarCurr,bitIndex44))& (v904(VarCurr,bitIndex28)<->v870(VarCurr,bitIndex43))& (v904(VarCurr,bitIndex27)<->v870(VarCurr,bitIndex42))& (v904(VarCurr,bitIndex26)<->v870(VarCurr,bitIndex41))& 
(v904(VarCurr,bitIndex25)<->v870(VarCurr,bitIndex40))& (v904(VarCurr,bitIndex24)<->v870(VarCurr,bitIndex39))& (v904(VarCurr,bitIndex23)<->v870(VarCurr,bitIndex38))& (v904(VarCurr,bitIndex22)<->v870(VarCurr,bitIndex37))& (v904(VarCurr,bitIndex21)<->v870(VarCurr,bitIndex36))& (v904(VarCurr,bitIndex20)<->v870(VarCurr,bitIndex35))& (v904(VarCurr,bitIndex19)<->v870(VarCurr,bitIndex34))& (v904(VarCurr,bitIndex18)<->v870(VarCurr,bitIndex33))& (v904(VarCurr,bitIndex17)<->v870(VarCurr,bitIndex32))& (v904(VarCurr,bitIndex16)<->v870(VarCurr,bitIndex31))& (v904(VarCurr,bitIndex15)<->v870(VarCurr,bitIndex30))& (v904(VarCurr,bitIndex14)<->v870(VarCurr,bitIndex29))& (v904(VarCurr,bitIndex13)<->v870(VarCurr,bitIndex28))& (v904(VarCurr,bitIndex12)<->v870(VarCurr,bitIndex27))& (v904(VarCurr,bitIndex11)<->v870(VarCurr,bitIndex26))& (v904(VarCurr,bitIndex10)<->v870(VarCurr,bitIndex25))& (v904(VarCurr,bitIndex9)<->v870(VarCurr,bitIndex24))& (v904(VarCurr,bitIndex8)<->v870(VarCurr,bitIndex23))& (v904(VarCurr,bitIndex7)<->v870(VarCurr,bitIndex22))& (v904(VarCurr,bitIndex6)<->v870(VarCurr,bitIndex21))& (v904(VarCurr,bitIndex5)<->v870(VarCurr,bitIndex20))& (v904(VarCurr,bitIndex4)<->v870(VarCurr,bitIndex19))& (v904(VarCurr,bitIndex3)<->v870(VarCurr,bitIndex18))& (v904(VarCurr,bitIndex2)<->v870(VarCurr,bitIndex17))& (v904(VarCurr,bitIndex1)<->v870(VarCurr,bitIndex16))& (v904(VarCurr,bitIndex0)<->v870(VarCurr,bitIndex15))).
% 121.84/120.80  all VarCurr B (range_75_15(B)-> (v870(VarCurr,B)<->v872(VarCurr,B))).
% 121.84/120.80  all VarCurr B (range_75_15(B)-> (v872(VarCurr,B)<->v874(VarCurr,B))).
% 121.84/120.80  all B (range_75_15(B)<->bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B|bitIndex67=B|bitIndex68=B|bitIndex69=B|bitIndex70=B|bitIndex71=B|bitIndex72=B|bitIndex73=B|bitIndex74=B|bitIndex75=B).
% 121.84/120.80  all VarCurr B (range_60_37(B)-> (v889(VarCurr,B)<->v891(VarCurr,B))).
% 121.84/120.80  all VarCurr B (range_60_37(B)-> (v891(VarCurr,B)<->v893(VarCurr,B))).
% 121.84/120.80  all B (range_60_37(B)<->bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B).
% 121.85/120.81  all VarCurr B (range_36_0(B)-> (v889(VarCurr,B)<->v891(VarCurr,B))).
% 121.85/120.81  all VarCurr B (range_36_0(B)-> (v891(VarCurr,B)<->v893(VarCurr,B))).
% 121.85/120.81  all B (range_36_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B).
% 121.85/120.81  all VarCurr (v899(VarCurr)<->v901(VarCurr)).
% 121.85/120.81  all VarCurr (v901(VarCurr)<->v181(VarCurr)).
% 121.85/120.81  all VarCurr B (range_123_77(B)-> (v870(VarCurr,B)<->v872(VarCurr,B))).
% 121.85/120.81  all VarCurr B (range_123_77(B)-> (v872(VarCurr,B)<->v874(VarCurr,B))).
% 121.85/120.81  all B (range_123_77(B)<->bitIndex77=B|bitIndex78=B|bitIndex79=B|bitIndex80=B|bitIndex81=B|bitIndex82=B|bitIndex83=B|bitIndex84=B|bitIndex85=B|bitIndex86=B|bitIndex87=B|bitIndex88=B|bitIndex89=B|bitIndex90=B|bitIndex91=B|bitIndex92=B|bitIndex93=B|bitIndex94=B|bitIndex95=B|bitIndex96=B|bitIndex97=B|bitIndex98=B|bitIndex99=B|bitIndex100=B|bitIndex101=B|bitIndex102=B|bitIndex103=B|bitIndex104=B|bitIndex105=B|bitIndex106=B|bitIndex107=B|bitIndex108=B|bitIndex109=B|bitIndex110=B|bitIndex111=B|bitIndex112=B|bitIndex113=B|bitIndex114=B|bitIndex115=B|bitIndex116=B|bitIndex117=B|bitIndex118=B|bitIndex119=B|bitIndex120=B|bitIndex121=B|bitIndex122=B|bitIndex123=B).
% 121.85/120.81  all VarCurr (-v854(VarCurr)-> (all B (range_6_0(B)-> (v852(VarCurr,B)<->v868(VarCurr,B))))).
% 121.85/120.81  all VarCurr (v854(VarCurr)-> (all B (range_6_0(B)-> (v852(VarCurr,B)<->v895(VarCurr,B))))).
% 121.85/120.81  all B (range_6_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B).
% 121.85/120.81  all VarCurr B (range_4_0(B)-> (v895(VarCurr,B)<->v868(VarCurr,B))).
% 121.85/120.81  all VarCurr (v895(VarCurr,bitIndex5)<->v887(VarCurr)).
% 121.85/120.81  all VarCurr (v895(VarCurr,bitIndex6)<->v868(VarCurr,bitIndex6)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex5)<->v870(VarCurr,bitIndex129)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex129)<->v872(VarCurr,bitIndex129)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex129)<->v874(VarCurr,bitIndex129)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex4)<->v870(VarCurr,bitIndex128)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex0)<->v870(VarCurr,bitIndex124)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex128)<->v872(VarCurr,bitIndex128)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex128)<->v874(VarCurr,bitIndex128)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex124)<->v872(VarCurr,bitIndex124)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex124)<->v874(VarCurr,bitIndex124)).
% 121.85/120.81  all VarCurr (v887(VarCurr)<->v889(VarCurr,bitIndex61)).
% 121.85/120.81  all VarCurr (v889(VarCurr,bitIndex61)<->v891(VarCurr,bitIndex61)).
% 121.85/120.81  all VarCurr (v891(VarCurr,bitIndex61)<->v893(VarCurr,bitIndex61)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex6)<->v870(VarCurr,bitIndex130)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex130)<->v872(VarCurr,bitIndex130)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex130)<->v874(VarCurr,bitIndex130)).
% 121.85/120.81  all VarCurr (v854(VarCurr)<->v856(VarCurr)&v864(VarCurr)).
% 121.85/120.81  all VarCurr (v864(VarCurr)<->v866(VarCurr)|v882(VarCurr)).
% 121.85/120.81  all VarCurr (-v882(VarCurr)<->v868(VarCurr,bitIndex3)).
% 121.85/120.81  all VarCurr (v866(VarCurr)<->v878(VarCurr)&v880(VarCurr)).
% 121.85/120.81  all VarCurr (-v880(VarCurr)<->v868(VarCurr,bitIndex1)).
% 121.85/120.81  all VarCurr (v878(VarCurr)<->v868(VarCurr,bitIndex3)&v879(VarCurr)).
% 121.85/120.81  all VarCurr (-v879(VarCurr)<->v868(VarCurr,bitIndex2)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex1)<->v870(VarCurr,bitIndex125)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex125)<->v872(VarCurr,bitIndex125)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex125)<->v874(VarCurr,bitIndex125)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex2)<->v870(VarCurr,bitIndex126)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex126)<->v872(VarCurr,bitIndex126)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex126)<->v874(VarCurr,bitIndex126)).
% 121.85/120.81  all VarCurr (v868(VarCurr,bitIndex3)<->v870(VarCurr,bitIndex127)).
% 121.85/120.81  all VarCurr (v870(VarCurr,bitIndex127)<->v872(VarCurr,bitIndex127)).
% 121.85/120.81  all VarCurr (v872(VarCurr,bitIndex127)<->v874(VarCurr,bitIndex127)).
% 121.85/120.81  -v874(constB0,bitIndex7).
% 121.85/120.81  -v874(constB0,bitIndex6).
% 121.85/120.81  -v874(constB0,bitIndex5).
% 121.85/120.81  -v874(constB0,bitIndex4).
% 121.85/120.81  -v874(constB0,bitIndex3).
% 121.85/120.81  -v874(constB0,bitIndex2).
% 121.85/120.81  -v874(constB0,bitIndex1).
% 121.85/120.81  -v874(constB0,bitIndex0).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex7).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex6).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex5).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex4).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex3).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex2).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex1).
% 121.85/120.81  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00000000(bitIndex0).
% 121.85/120.81  all VarCurr (v856(VarCurr)<->v858(VarCurr)).
% 121.85/120.81  all VarCurr (v858(VarCurr)<->v860(VarCurr)).
% 121.85/120.81  all VarCurr (v860(VarCurr)<->v862(VarCurr)).
% 121.85/120.81  all VarCurr (v850(VarCurr)<->v593(VarCurr)).
% 121.85/120.81  all VarCurr (v834(VarCurr)<->v743(VarCurr)).
% 121.85/120.81  all VarCurr (v831(VarCurr)<->v801(VarCurr)).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v808(VarNext)-> (all B (range_7_0(B)-> (v749(VarNext,B)<->v749(VarCurr,B)))))).
% 121.85/120.81  all VarNext (v808(VarNext)-> (all B (range_7_0(B)-> (v749(VarNext,B)<->v818(VarNext,B))))).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_7_0(B)-> (v818(VarNext,B)<->v816(VarCurr,B))))).
% 121.85/120.81  all VarCurr (-v819(VarCurr)-> (all B (range_7_0(B)-> (v816(VarCurr,B)<->v752(VarCurr,B))))).
% 121.85/120.81  all VarCurr (v819(VarCurr)-> (all B (range_7_0(B)-> (v816(VarCurr,B)<->$F)))).
% 121.85/120.81  all VarCurr (-v819(VarCurr)<->v743(VarCurr)).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v808(VarNext)<->v809(VarNext))).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v809(VarNext)<->v810(VarNext)&v801(VarNext))).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v810(VarNext)<->v812(VarNext))).
% 121.85/120.81  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v812(VarNext)<->v801(VarCurr))).
% 121.85/120.81  all VarCurr (v801(VarCurr)<->v803(VarCurr)).
% 121.85/120.81  all VarCurr (v803(VarCurr)<->v805(VarCurr)).
% 121.85/120.81  all VarCurr (v805(VarCurr)<->v1(VarCurr)).
% 121.85/120.81  all VarCurr (-v745(VarCurr)-> (all B (range_7_0(B)-> (v752(VarCurr,B)<->v749(VarCurr,B))))).
% 121.85/120.81  all VarCurr (v745(VarCurr)-> (all B (range_7_0(B)-> (v752(VarCurr,B)<->v754(VarCurr,B))))).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex0)<->v796(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex1)<->v794(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex2)<->v789(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex3)<->v784(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex4)<->v779(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex5)<->v774(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex6)<->v769(VarCurr)).
% 121.85/120.81  all VarCurr (v754(VarCurr,bitIndex7)<->v758(VarCurr)).
% 121.85/120.81  all VarCurr (v794(VarCurr)<->v795(VarCurr)&v798(VarCurr)).
% 121.85/120.81  all VarCurr (v798(VarCurr)<->v749(VarCurr,bitIndex0)|v749(VarCurr,bitIndex1)).
% 121.85/120.81  all VarCurr (v795(VarCurr)<->v796(VarCurr)|v797(VarCurr)).
% 121.85/120.81  all VarCurr (-v797(VarCurr)<->v749(VarCurr,bitIndex1)).
% 121.85/120.81  all VarCurr (-v796(VarCurr)<->v749(VarCurr,bitIndex0)).
% 121.85/120.81  all VarCurr (v789(VarCurr)<->v790(VarCurr)&v793(VarCurr)).
% 121.85/120.81  all VarCurr (v793(VarCurr)<->v766(VarCurr)|v749(VarCurr,bitIndex2)).
% 121.85/120.81  all VarCurr (v790(VarCurr)<->v791(VarCurr)|v792(VarCurr)).
% 121.85/120.81  all VarCurr (-v792(VarCurr)<->v749(VarCurr,bitIndex2)).
% 121.85/120.81  all VarCurr (-v791(VarCurr)<->v766(VarCurr)).
% 121.85/120.81  all VarCurr (v784(VarCurr)<->v785(VarCurr)&v788(VarCurr)).
% 121.85/120.82  all VarCurr (v788(VarCurr)<->v765(VarCurr)|v749(VarCurr,bitIndex3)).
% 121.85/120.82  all VarCurr (v785(VarCurr)<->v786(VarCurr)|v787(VarCurr)).
% 121.85/120.82  all VarCurr (-v787(VarCurr)<->v749(VarCurr,bitIndex3)).
% 121.85/120.82  all VarCurr (-v786(VarCurr)<->v765(VarCurr)).
% 121.85/120.82  all VarCurr (v779(VarCurr)<->v780(VarCurr)&v783(VarCurr)).
% 121.85/120.82  all VarCurr (v783(VarCurr)<->v764(VarCurr)|v749(VarCurr,bitIndex4)).
% 121.85/120.82  all VarCurr (v780(VarCurr)<->v781(VarCurr)|v782(VarCurr)).
% 121.85/120.82  all VarCurr (-v782(VarCurr)<->v749(VarCurr,bitIndex4)).
% 121.85/120.82  all VarCurr (-v781(VarCurr)<->v764(VarCurr)).
% 121.85/120.82  all VarCurr (v774(VarCurr)<->v775(VarCurr)&v778(VarCurr)).
% 121.85/120.82  all VarCurr (v778(VarCurr)<->v763(VarCurr)|v749(VarCurr,bitIndex5)).
% 121.85/120.82  all VarCurr (v775(VarCurr)<->v776(VarCurr)|v777(VarCurr)).
% 121.85/120.82  all VarCurr (-v777(VarCurr)<->v749(VarCurr,bitIndex5)).
% 121.85/120.82  all VarCurr (-v776(VarCurr)<->v763(VarCurr)).
% 121.85/120.82  all VarCurr (v769(VarCurr)<->v770(VarCurr)&v773(VarCurr)).
% 121.85/120.82  all VarCurr (v773(VarCurr)<->v762(VarCurr)|v749(VarCurr,bitIndex6)).
% 121.85/120.82  all VarCurr (v770(VarCurr)<->v771(VarCurr)|v772(VarCurr)).
% 121.85/120.82  all VarCurr (-v772(VarCurr)<->v749(VarCurr,bitIndex6)).
% 121.85/120.82  all VarCurr (-v771(VarCurr)<->v762(VarCurr)).
% 121.85/120.82  all VarCurr (v758(VarCurr)<->v759(VarCurr)&v768(VarCurr)).
% 121.85/120.82  all VarCurr (v768(VarCurr)<->v761(VarCurr)|v749(VarCurr,bitIndex7)).
% 121.85/120.82  all VarCurr (v759(VarCurr)<->v760(VarCurr)|v767(VarCurr)).
% 121.85/120.82  all VarCurr (-v767(VarCurr)<->v749(VarCurr,bitIndex7)).
% 121.85/120.82  all VarCurr (-v760(VarCurr)<->v761(VarCurr)).
% 121.85/120.82  all VarCurr (v761(VarCurr)<->v762(VarCurr)&v749(VarCurr,bitIndex6)).
% 121.85/120.82  all VarCurr (v762(VarCurr)<->v763(VarCurr)&v749(VarCurr,bitIndex5)).
% 121.85/120.82  all VarCurr (v763(VarCurr)<->v764(VarCurr)&v749(VarCurr,bitIndex4)).
% 121.85/120.82  all VarCurr (v764(VarCurr)<->v765(VarCurr)&v749(VarCurr,bitIndex3)).
% 121.85/120.82  all VarCurr (v765(VarCurr)<->v766(VarCurr)&v749(VarCurr,bitIndex2)).
% 121.85/120.82  all VarCurr (v766(VarCurr)<->v749(VarCurr,bitIndex0)&v749(VarCurr,bitIndex1)).
% 121.85/120.82  all B (range_7_0(B)-> (v749(constB0,B)<->$F)).
% 121.85/120.82  all B (range_7_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B).
% 121.85/120.82  -b00000000(bitIndex7).
% 121.85/120.82  -b00000000(bitIndex6).
% 121.85/120.82  -b00000000(bitIndex5).
% 121.85/120.82  -b00000000(bitIndex4).
% 121.85/120.82  -b00000000(bitIndex3).
% 121.85/120.82  -b00000000(bitIndex2).
% 121.85/120.82  -b00000000(bitIndex1).
% 121.85/120.82  -b00000000(bitIndex0).
% 121.85/120.82  all VarCurr (v743(VarCurr)<->v10(VarCurr)).
% 121.85/120.82  all VarCurr (v709(VarCurr)<->v711(VarCurr)).
% 121.85/120.82  all VarCurr (v711(VarCurr)<->v713(VarCurr)).
% 121.85/120.82  all VarCurr (v713(VarCurr)<->v14(VarCurr)).
% 121.85/120.82  all VarCurr (v571(VarCurr)<->v573(VarCurr)).
% 121.85/120.82  all VarCurr (v573(VarCurr)<->v575(VarCurr)).
% 121.85/120.82  all VarCurr (v575(VarCurr)<->v577(VarCurr)).
% 121.85/120.82  all VarCurr (v577(VarCurr)<->v579(VarCurr)).
% 121.85/120.82  all VarCurr (v579(VarCurr)<->v581(VarCurr)).
% 121.85/120.82  all VarCurr (v581(VarCurr)<->v583(VarCurr)).
% 121.85/120.82  all VarCurr (v583(VarCurr)<->v585(VarCurr)).
% 121.85/120.82  all VarCurr (v585(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$T)& (v587(VarCurr,bitIndex0)<->$T)).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v644(VarNext)-> (all B (range_1_0(B)-> (v587(VarNext,B)<->v587(VarCurr,B)))))).
% 121.85/120.82  all VarNext (v644(VarNext)-> (all B (range_1_0(B)-> (v587(VarNext,B)<->v654(VarNext,B))))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_1_0(B)-> (v654(VarNext,B)<->v652(VarCurr,B))))).
% 121.85/120.82  all VarCurr (-v655(VarCurr)-> (all B (range_1_0(B)-> (v652(VarCurr,B)<->v595(VarCurr,B))))).
% 121.85/120.82  all VarCurr (v655(VarCurr)-> (all B (range_1_0(B)-> (v652(VarCurr,B)<->$F)))).
% 121.85/120.82  all VarCurr (-v655(VarCurr)<->v589(VarCurr)).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v644(VarNext)<->v645(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v645(VarNext)<->v646(VarNext)&v637(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v646(VarNext)<->v648(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v648(VarNext)<->v637(VarCurr))).
% 121.85/120.82  all VarCurr (v637(VarCurr)<->v639(VarCurr)).
% 121.85/120.82  all VarCurr (v639(VarCurr)<->v641(VarCurr)).
% 121.85/120.82  all VarCurr (v641(VarCurr)<->v1(VarCurr)).
% 121.85/120.82  all VarCurr (-v629(VarCurr)& -v631(VarCurr)& -v632(VarCurr)-> (all B (range_1_0(B)-> (v595(VarCurr,B)<->v635(VarCurr,B))))).
% 121.85/120.82  all VarCurr (v632(VarCurr)-> (all B (range_1_0(B)-> (v595(VarCurr,B)<->v633(VarCurr,B))))).
% 121.85/120.82  all VarCurr (v631(VarCurr)-> (all B (range_1_0(B)-> (v595(VarCurr,B)<->b10(B))))).
% 121.85/120.82  all VarCurr (v629(VarCurr)-> (all B (range_1_0(B)-> (v595(VarCurr,B)<->v630(VarCurr,B))))).
% 121.85/120.82  all VarCurr (-v597(VarCurr)-> (all B (range_1_0(B)-> (v635(VarCurr,B)<->$F)))).
% 121.85/120.82  all VarCurr (v597(VarCurr)-> (all B (range_1_0(B)-> (v635(VarCurr,B)<->b01(B))))).
% 121.85/120.82  all VarCurr (v634(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$T)& (v587(VarCurr,bitIndex0)<->$T)).
% 121.85/120.82  all VarCurr (-v606(VarCurr)-> (all B (range_1_0(B)-> (v633(VarCurr,B)<->b10(B))))).
% 121.85/120.82  all VarCurr (v606(VarCurr)-> (all B (range_1_0(B)-> (v633(VarCurr,B)<->$T)))).
% 121.85/120.82  b11(bitIndex1).
% 121.85/120.82  b11(bitIndex0).
% 121.85/120.82  all VarCurr (v632(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$T)& (v587(VarCurr,bitIndex0)<->$F)).
% 121.85/120.82  all VarCurr (v631(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$F)& (v587(VarCurr,bitIndex0)<->$T)).
% 121.85/120.82  all VarCurr (-v597(VarCurr)-> (all B (range_1_0(B)-> (v630(VarCurr,B)<->$F)))).
% 121.85/120.82  all VarCurr (v597(VarCurr)-> (all B (range_1_0(B)-> (v630(VarCurr,B)<->b01(B))))).
% 121.85/120.82  all VarCurr (v629(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$F)& (v587(VarCurr,bitIndex0)<->$F)).
% 121.85/120.82  all VarCurr (v606(VarCurr)<->v608(VarCurr)).
% 121.85/120.82  all VarCurr (v608(VarCurr)<->v610(VarCurr)).
% 121.85/120.82  all VarCurr (v610(VarCurr)<->v612(VarCurr)).
% 121.85/120.82  all VarCurr (v612(VarCurr)<->v614(VarCurr)&v625(VarCurr)).
% 121.85/120.82  v625(constB0)<->$F.
% 121.85/120.82  all VarCurr (v614(VarCurr)<->v616(VarCurr)).
% 121.85/120.82  all VarCurr (v616(VarCurr)<->v618(VarCurr)).
% 121.85/120.82  all VarCurr (v618(VarCurr)<->v620(VarCurr)).
% 121.85/120.82  all VarCurr (v620(VarCurr)<->v622(VarCurr)|v623(VarCurr)).
% 121.85/120.82  all VarCurr (v623(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$T)& (v587(VarCurr,bitIndex0)<->$F)).
% 121.85/120.82  all VarCurr (v622(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$F)& (v587(VarCurr,bitIndex0)<->$T)).
% 121.85/120.82  -b01(bitIndex1).
% 121.85/120.82  b01(bitIndex0).
% 121.85/120.82  all B (range_1_0(B)-> (v587(constB0,B)<->$F)).
% 121.85/120.82  all VarCurr (v597(VarCurr)<->v599(VarCurr)).
% 121.85/120.82  all VarCurr (v599(VarCurr)<->v601(VarCurr)).
% 121.85/120.82  all VarCurr (v601(VarCurr)<-> (v603(VarCurr,bitIndex1)<->$T)& (v603(VarCurr,bitIndex0)<->$F)).
% 121.85/120.82  b10(bitIndex1).
% 121.85/120.82  -b10(bitIndex0).
% 121.85/120.82  all B (range_1_0(B)-> (v603(constB0,B)<->$F)).
% 121.85/120.82  all VarCurr (v589(VarCurr)<->v591(VarCurr)).
% 121.85/120.82  all VarCurr (v591(VarCurr)<->v593(VarCurr)).
% 121.85/120.82  all VarCurr (v593(VarCurr)<->v14(VarCurr)).
% 121.85/120.82  all VarCurr (v566(VarCurr)<->v45(VarCurr)).
% 121.85/120.82  all VarCurr (v47(VarCurr)<->v49(VarCurr)).
% 121.85/120.82  all VarCurr (v49(VarCurr)<->v51(VarCurr)).
% 121.85/120.82  all VarCurr (v51(VarCurr)<->v53(VarCurr)).
% 121.85/120.82  all VarCurr (v53(VarCurr)<->v55(VarCurr)).
% 121.85/120.82  all VarCurr (v55(VarCurr)<->v57(VarCurr)).
% 121.85/120.82  all VarCurr (v57(VarCurr)<->v59(VarCurr)).
% 121.85/120.82  all VarCurr (v59(VarCurr)<->v61(VarCurr)).
% 121.85/120.82  all VarCurr (v61(VarCurr)<->v63(VarCurr,bitIndex1)).
% 121.85/120.82  all VarNext (v63(VarNext,bitIndex1)<->v538(VarNext,bitIndex0)).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v539(VarNext)-> (v538(VarNext,bitIndex1)<->v63(VarCurr,bitIndex2))& (v538(VarNext,bitIndex0)<->v63(VarCurr,bitIndex1)))).
% 121.85/120.82  all VarNext (v539(VarNext)-> (all B (range_1_0(B)-> (v538(VarNext,B)<->v549(VarNext,B))))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_1_0(B)-> (v549(VarNext,B)<->v547(VarCurr,B))))).
% 121.85/120.82  all VarCurr (-v550(VarCurr)-> (v547(VarCurr,bitIndex1)<->v72(VarCurr,bitIndex2))& (v547(VarCurr,bitIndex0)<->v72(VarCurr,bitIndex1))).
% 121.85/120.82  all VarCurr (v550(VarCurr)-> (all B (range_1_0(B)-> (v547(VarCurr,B)<->$F)))).
% 121.85/120.82  all B (range_1_0(B)<->bitIndex0=B|bitIndex1=B).
% 121.85/120.82  all VarCurr (-v550(VarCurr)<->v65(VarCurr)).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v539(VarNext)<->v540(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v540(VarNext)<->v541(VarNext)&v532(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v541(VarNext)<->v543(VarNext))).
% 121.85/120.82  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v543(VarNext)<->v532(VarCurr))).
% 121.85/120.82  all VarCurr (v532(VarCurr)<->v534(VarCurr)).
% 121.85/120.82  all VarCurr (v534(VarCurr)<->v536(VarCurr)).
% 121.85/120.82  all VarCurr (v536(VarCurr)<->v1(VarCurr)).
% 121.85/120.82  all VarCurr (-v507(VarCurr)-> (v72(VarCurr,bitIndex1)<->$F)).
% 121.85/120.82  all VarCurr (v507(VarCurr)-> (v72(VarCurr,bitIndex1)<->$T)).
% 121.85/120.82  all VarCurr (v507(VarCurr)<->v508(VarCurr)|v526(VarCurr)).
% 121.85/120.82  all VarCurr (v526(VarCurr)<->v527(VarCurr)&v528(VarCurr)).
% 121.85/120.82  all VarCurr (v528(VarCurr)<->v529(VarCurr)&v530(VarCurr)).
% 121.85/120.82  all VarCurr (v530(VarCurr)<-> ($T<->v63(VarCurr,bitIndex2))).
% 121.85/120.82  all VarCurr (-v529(VarCurr)<->v464(VarCurr)).
% 121.85/120.83  all VarCurr (-v527(VarCurr)<->v523(VarCurr)).
% 121.85/120.83  all VarCurr (v508(VarCurr)<->v509(VarCurr)&v523(VarCurr)).
% 121.85/120.83  all VarCurr (v523(VarCurr)<->v524(VarCurr)|v525(VarCurr)).
% 121.85/120.83  all VarCurr (v525(VarCurr)<-> ($T<->v63(VarCurr,bitIndex1))).
% 121.85/120.83  all VarCurr (v524(VarCurr)<-> ($T<->v63(VarCurr,bitIndex0))).
% 121.85/120.83  v63(constB0,bitIndex2)<->$F.
% 121.85/120.83  v63(constB0,bitIndex1)<->$F.
% 121.85/120.83  -b00(bitIndex1).
% 121.85/120.83  -b00(bitIndex0).
% 121.85/120.83  v63(constB0,bitIndex0)<->$T.
% 121.85/120.83  all VarCurr (v509(VarCurr)<->v510(VarCurr)|v519(VarCurr)).
% 121.85/120.83  all VarCurr (v519(VarCurr)<->v521(VarCurr)&v518(VarCurr)).
% 121.85/120.83  all VarCurr (v521(VarCurr)<->v514(VarCurr)&v522(VarCurr)).
% 121.85/120.83  all VarCurr (-v522(VarCurr)<->v484(VarCurr)).
% 121.85/120.83  all VarCurr (v510(VarCurr)<->v512(VarCurr)&v518(VarCurr)).
% 121.85/120.83  all VarCurr (-v518(VarCurr)<->v93(VarCurr)).
% 121.85/120.83  all VarCurr (v512(VarCurr)<->v513(VarCurr)&v484(VarCurr)).
% 121.85/120.83  all VarCurr (v513(VarCurr)<->v514(VarCurr)&v517(VarCurr)).
% 121.85/120.83  all VarCurr (-v517(VarCurr)<->v503(VarCurr)).
% 121.85/120.83  v503(constB0)<->$F.
% 121.85/120.83  all VarCurr (v514(VarCurr)<->v515(VarCurr)&v516(VarCurr)).
% 121.85/120.83  all VarCurr (-v516(VarCurr)<->v74(VarCurr)).
% 121.85/120.83  all VarCurr (-v515(VarCurr)<->v464(VarCurr)).
% 121.85/120.83  all VarCurr (v484(VarCurr)<->v501(VarCurr)|v95(VarCurr)).
% 121.85/120.83  all VarCurr (v501(VarCurr)<->v419(VarCurr)|v486(VarCurr)).
% 121.85/120.83  all VarCurr (v486(VarCurr)<->v488(VarCurr)).
% 121.85/120.83  all VarCurr (v488(VarCurr)<->v490(VarCurr)).
% 121.85/120.83  all VarCurr (-v493(VarCurr)-> (v490(VarCurr)<->$F)).
% 121.85/120.83  all VarCurr (v493(VarCurr)-> (v490(VarCurr)<->$T)).
% 121.85/120.83  all VarCurr (v493(VarCurr)<->v495(VarCurr)&v170(VarCurr,bitIndex6)).
% 121.85/120.83  all VarCurr (v495(VarCurr)<->v496(VarCurr)&v367(VarCurr)).
% 121.85/120.83  all VarCurr (v496(VarCurr)<->v497(VarCurr)&v366(VarCurr)).
% 121.85/120.83  all VarCurr (v497(VarCurr)<->v498(VarCurr)&v170(VarCurr,bitIndex3)).
% 121.85/120.83  all VarCurr (v498(VarCurr)<->v499(VarCurr)&v364(VarCurr)).
% 121.85/120.83  all VarCurr (v499(VarCurr)<->v362(VarCurr)&v170(VarCurr,bitIndex1)).
% 121.85/120.83  all VarCurr (v464(VarCurr)<->v466(VarCurr)).
% 121.85/120.83  all VarCurr (v466(VarCurr)<->v468(VarCurr)).
% 121.85/120.83  all VarCurr (v468(VarCurr)<->v470(VarCurr)).
% 121.85/120.83  all VarCurr (v470(VarCurr)<->v472(VarCurr)).
% 121.85/120.83  all VarCurr (v472(VarCurr)<->v474(VarCurr)).
% 121.85/120.83  all VarCurr (v474(VarCurr)<->v476(VarCurr)).
% 121.85/120.83  all VarCurr (v476(VarCurr)<->v478(VarCurr)).
% 121.85/120.83  all VarCurr (v478(VarCurr)<->v480(VarCurr,bitIndex4)).
% 121.85/120.83  -v480(constB0,bitIndex4).
% 121.85/120.83  -v480(constB0,bitIndex2).
% 121.85/120.83  -v480(constB0,bitIndex1).
% 121.85/120.83  -v480(constB0,bitIndex0).
% 121.85/120.83  -bx0x000(bitIndex4).
% 121.85/120.83  -bx0x000(bitIndex2).
% 121.85/120.83  -bx0x000(bitIndex1).
% 121.85/120.83  -bx0x000(bitIndex0).
% 121.85/120.83  all VarCurr (v93(VarCurr)<->v460(VarCurr)|v461(VarCurr)).
% 121.85/120.83  all VarCurr (v461(VarCurr)<->v462(VarCurr)&v429(VarCurr)).
% 121.85/120.83  all VarCurr (v462(VarCurr)<->v406(VarCurr)|v419(VarCurr)).
% 121.85/120.83  all VarCurr (v460(VarCurr)<->v95(VarCurr)&v374(VarCurr)).
% 121.85/120.83  all VarCurr (v429(VarCurr)<->v431(VarCurr)).
% 121.85/120.83  all VarCurr (v431(VarCurr)<->v433(VarCurr)).
% 121.85/120.83  all VarCurr (v433(VarCurr)<->v457(VarCurr)&v458(VarCurr)).
% 121.85/120.83  all VarCurr (v458(VarCurr)<-> -(v435(VarCurr,bitIndex4)<->v455(VarCurr,bitIndex4))).
% 121.85/120.83  all VarCurr (v457(VarCurr)<-> (v435(VarCurr,bitIndex3)<->v455(VarCurr,bitIndex3))& (v435(VarCurr,bitIndex2)<->v455(VarCurr,bitIndex2))& (v435(VarCurr,bitIndex1)<->v455(VarCurr,bitIndex1))& (v435(VarCurr,bitIndex0)<->v455(VarCurr,bitIndex0))).
% 121.85/120.83  v455(constB0,bitIndex4)<->$F.
% 121.85/120.83  all B (range_3_0(B)-> (v455(constB0,B)<->$F)).
% 121.85/120.83  all VarCurr (v435(VarCurr,bitIndex4)<->v437(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v437(VarCurr,bitIndex4)<->v439(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v439(VarCurr,bitIndex4)<->v441(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v441(VarCurr,bitIndex4)<->v443(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v443(VarCurr,bitIndex4)<->v445(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v445(VarCurr,bitIndex4)<->v447(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v447(VarCurr,bitIndex4)<->v450(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v435(VarCurr,B)<->v437(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v437(VarCurr,B)<->v439(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v439(VarCurr,B)<->v441(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v441(VarCurr,B)<->v443(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v443(VarCurr,B)<->v445(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v445(VarCurr,B)<->v447(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v447(VarCurr,B)<->v450(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_3_0(B)-> (v450(VarCurr,B)<->v449(VarCurr,B))).
% 121.85/120.83  all VarCurr (v450(VarCurr,bitIndex4)<->v451(VarCurr)).
% 121.85/120.83  all B (range_3_0(B)-> (v449(constB0,B)<->$F)).
% 121.85/120.83  all B (range_3_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B).
% 121.85/120.83  -b0000(bitIndex3).
% 121.85/120.83  -b0000(bitIndex2).
% 121.85/120.83  -b0000(bitIndex1).
% 121.85/120.83  -b0000(bitIndex0).
% 121.85/120.83  v451(constB0)<->$F.
% 121.85/120.83  all VarCurr (v419(VarCurr)<->v421(VarCurr)).
% 121.85/120.83  all VarCurr (v421(VarCurr)<->v423(VarCurr)).
% 121.85/120.83  all VarCurr (-v425(VarCurr)-> (v423(VarCurr)<->$F)).
% 121.85/120.83  all VarCurr (v425(VarCurr)-> (v423(VarCurr)<->$T)).
% 121.85/120.83  all VarCurr (v425(VarCurr)<->v426(VarCurr)|v427(VarCurr)).
% 121.85/120.83  all VarCurr (v427(VarCurr)<->v173(VarCurr)&v370(VarCurr)).
% 121.85/120.83  all VarCurr (v426(VarCurr)<->v101(VarCurr)&v355(VarCurr)).
% 121.85/120.83  all VarCurr (v406(VarCurr)<->v408(VarCurr)).
% 121.85/120.83  all VarCurr (v408(VarCurr)<->v410(VarCurr)).
% 121.85/120.83  all VarCurr (-v413(VarCurr)-> (v410(VarCurr)<->$F)).
% 121.85/120.83  all VarCurr (v413(VarCurr)-> (v410(VarCurr)<->$T)).
% 121.85/120.83  all VarCurr (v413(VarCurr)<->v415(VarCurr)&v417(VarCurr)).
% 121.85/120.83  all VarCurr (-v417(VarCurr)<->v170(VarCurr,bitIndex6)).
% 121.85/120.83  all VarCurr (v415(VarCurr)<->v416(VarCurr)&v170(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v416(VarCurr)<->v365(VarCurr)&v170(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v374(VarCurr)<->v376(VarCurr)).
% 121.85/120.83  all VarCurr (v376(VarCurr)<->v378(VarCurr)).
% 121.85/120.83  all VarCurr (v378(VarCurr)<->v402(VarCurr)&v404(VarCurr)).
% 121.85/120.83  all VarCurr (v404(VarCurr)<-> -(v380(VarCurr,bitIndex5)<->v400(VarCurr,bitIndex5))).
% 121.85/120.83  all VarCurr (v402(VarCurr)<-> (v380(VarCurr,bitIndex4)<->v400(VarCurr,bitIndex4))& (v380(VarCurr,bitIndex3)<->v400(VarCurr,bitIndex3))& (v380(VarCurr,bitIndex2)<->v400(VarCurr,bitIndex2))& (v380(VarCurr,bitIndex1)<->v400(VarCurr,bitIndex1))& (v380(VarCurr,bitIndex0)<->v400(VarCurr,bitIndex0))).
% 121.85/120.83  all B (range_5_0(B)-> (v400(constB0,B)<->$F)).
% 121.85/120.83  all B (range_5_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B).
% 121.85/120.83  -b000000(bitIndex5).
% 121.85/120.83  -b000000(bitIndex4).
% 121.85/120.83  -b000000(bitIndex3).
% 121.85/120.83  -b000000(bitIndex2).
% 121.85/120.83  -b000000(bitIndex1).
% 121.85/120.83  -b000000(bitIndex0).
% 121.85/120.83  all VarCurr (v380(VarCurr,bitIndex5)<->v382(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v382(VarCurr,bitIndex5)<->v384(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v384(VarCurr,bitIndex5)<->v386(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v386(VarCurr,bitIndex5)<->v388(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v388(VarCurr,bitIndex5)<->v390(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v390(VarCurr,bitIndex5)<->v392(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v392(VarCurr,bitIndex5)<->v395(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v380(VarCurr,B)<->v382(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v382(VarCurr,B)<->v384(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v384(VarCurr,B)<->v386(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v386(VarCurr,B)<->v388(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v388(VarCurr,B)<->v390(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v390(VarCurr,B)<->v392(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v392(VarCurr,B)<->v395(VarCurr,B))).
% 121.85/120.83  all VarCurr B (range_4_0(B)-> (v395(VarCurr,B)<->v394(VarCurr,B))).
% 121.85/120.83  all VarCurr (v395(VarCurr,bitIndex5)<->v396(VarCurr)).
% 121.85/120.83  all B (range_4_0(B)-> (v394(constB0,B)<->$F)).
% 121.85/120.83  all B (range_4_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B).
% 121.85/120.83  -b00000(bitIndex4).
% 121.85/120.83  -b00000(bitIndex3).
% 121.85/120.83  -b00000(bitIndex2).
% 121.85/120.83  -b00000(bitIndex1).
% 121.85/120.83  -b00000(bitIndex0).
% 121.85/120.83  v396(constB0)<->$F.
% 121.85/120.83  all VarCurr (v95(VarCurr)<->v97(VarCurr)).
% 121.85/120.83  all VarCurr (v97(VarCurr)<->v99(VarCurr)).
% 121.85/120.83  all VarCurr (-v352(VarCurr)-> (v99(VarCurr)<->$F)).
% 121.85/120.83  all VarCurr (v352(VarCurr)-> (v99(VarCurr)<->$T)).
% 121.85/120.83  all VarCurr (v352(VarCurr)<->v353(VarCurr)|v368(VarCurr)).
% 121.85/120.83  all VarCurr (v368(VarCurr)<->v369(VarCurr)&v370(VarCurr)).
% 121.85/120.83  all VarCurr (v370(VarCurr)<->v372(VarCurr)&v170(VarCurr,bitIndex6)).
% 121.85/120.83  all VarCurr (v372(VarCurr)<->v358(VarCurr)&v170(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (-v369(VarCurr)<->v173(VarCurr)).
% 121.85/120.83  all VarCurr (v353(VarCurr)<->v354(VarCurr)&v355(VarCurr)).
% 121.85/120.83  all VarCurr (v355(VarCurr)<->v357(VarCurr)&v170(VarCurr,bitIndex6)).
% 121.85/120.83  all VarCurr (v357(VarCurr)<->v358(VarCurr)&v367(VarCurr)).
% 121.85/120.83  all VarCurr (-v367(VarCurr)<->v170(VarCurr,bitIndex5)).
% 121.85/120.83  all VarCurr (v358(VarCurr)<->v359(VarCurr)&v366(VarCurr)).
% 121.85/120.83  all VarCurr (-v366(VarCurr)<->v170(VarCurr,bitIndex4)).
% 121.85/120.83  all VarCurr (v359(VarCurr)<->v360(VarCurr)&v365(VarCurr)).
% 121.85/120.84  all VarCurr (-v365(VarCurr)<->v170(VarCurr,bitIndex3)).
% 121.85/120.84  all VarCurr (v360(VarCurr)<->v361(VarCurr)&v364(VarCurr)).
% 121.85/120.84  all VarCurr (-v364(VarCurr)<->v170(VarCurr,bitIndex2)).
% 121.85/120.84  all VarCurr (v361(VarCurr)<->v362(VarCurr)&v363(VarCurr)).
% 121.85/120.84  all VarCurr (-v363(VarCurr)<->v170(VarCurr,bitIndex1)).
% 121.85/120.84  all VarCurr (-v362(VarCurr)<->v170(VarCurr,bitIndex0)).
% 121.85/120.84  all VarCurr (-v354(VarCurr)<->v101(VarCurr)).
% 121.85/120.84  all VarCurr (-v345(VarCurr)-> (v173(VarCurr)<->v348(VarCurr))).
% 121.85/120.84  all VarCurr (v345(VarCurr)-> (v173(VarCurr)<->v347(VarCurr))).
% 121.85/120.84  all VarCurr (v348(VarCurr)<-> (v103(VarCurr,bitIndex61)<->v227(VarCurr,bitIndex47))& (v103(VarCurr,bitIndex60)<->v227(VarCurr,bitIndex46))& (v103(VarCurr,bitIndex59)<->v227(VarCurr,bitIndex45))& (v103(VarCurr,bitIndex58)<->v227(VarCurr,bitIndex44))& (v103(VarCurr,bitIndex57)<->v227(VarCurr,bitIndex43))& (v103(VarCurr,bitIndex56)<->v227(VarCurr,bitIndex42))& (v103(VarCurr,bitIndex55)<->v227(VarCurr,bitIndex41))& (v103(VarCurr,bitIndex54)<->v227(VarCurr,bitIndex40))& (v103(VarCurr,bitIndex53)<->v227(VarCurr,bitIndex39))& (v103(VarCurr,bitIndex52)<->v227(VarCurr,bitIndex38))& (v103(VarCurr,bitIndex51)<->v227(VarCurr,bitIndex37))& (v103(VarCurr,bitIndex50)<->v227(VarCurr,bitIndex36))& (v103(VarCurr,bitIndex49)<->v227(VarCurr,bitIndex35))& (v103(VarCurr,bitIndex48)<->v227(VarCurr,bitIndex34))& (v103(VarCurr,bitIndex47)<->v227(VarCurr,bitIndex33))& (v103(VarCurr,bitIndex46)<->v227(VarCurr,bitIndex32))& (v103(VarCurr,bitIndex45)<->v227(VarCurr,bitIndex31))& (v103(VarCurr,bitIndex44)<->v227(VarCurr,bitIndex30))& (v103(VarCurr,bitIndex43)<->v227(VarCurr,bitIndex29))& (v103(VarCurr,bitIndex42)<->v227(VarCurr,bitIndex28))& (v103(VarCurr,bitIndex41)<->v227(VarCurr,bitIndex27))& (v103(VarCurr,bitIndex40)<->v227(VarCurr,bitIndex26))& (v103(VarCurr,bitIndex39)<->v227(VarCurr,bitIndex25))& (v103(VarCurr,bitIndex38)<->v227(VarCurr,bitIndex24))& (v103(VarCurr,bitIndex37)<->v227(VarCurr,bitIndex23))& (v103(VarCurr,bitIndex36)<->v227(VarCurr,bitIndex22))& (v103(VarCurr,bitIndex35)<->v227(VarCurr,bitIndex21))& (v103(VarCurr,bitIndex34)<->v227(VarCurr,bitIndex20))& (v103(VarCurr,bitIndex33)<->v227(VarCurr,bitIndex19))& (v103(VarCurr,bitIndex32)<->v227(VarCurr,bitIndex18))& (v103(VarCurr,bitIndex31)<->v227(VarCurr,bitIndex17))& (v103(VarCurr,bitIndex30)<->v227(VarCurr,bitIndex16))& (v103(VarCurr,bitIndex29)<->v227(VarCurr,bitIndex15))& (v103(VarCurr,bitIndex28)<->v227(VarCurr,bitIndex14))& (v103(VarCurr,bitIndex27)<->v227(VarCurr,bitIndex13))& 
(v103(VarCurr,bitIndex26)<->v227(VarCurr,bitIndex12))& (v103(VarCurr,bitIndex25)<->v227(VarCurr,bitIndex11))& (v103(VarCurr,bitIndex24)<->v227(VarCurr,bitIndex10))& (v103(VarCurr,bitIndex23)<->v227(VarCurr,bitIndex9))& (v103(VarCurr,bitIndex22)<->v227(VarCurr,bitIndex8))& (v103(VarCurr,bitIndex21)<->v227(VarCurr,bitIndex7))& (v103(VarCurr,bitIndex20)<->v227(VarCurr,bitIndex6))& (v103(VarCurr,bitIndex19)<->v227(VarCurr,bitIndex5))& (v103(VarCurr,bitIndex18)<->v227(VarCurr,bitIndex4))& (v103(VarCurr,bitIndex17)<->v227(VarCurr,bitIndex3))& (v103(VarCurr,bitIndex16)<->v227(VarCurr,bitIndex2))& (v103(VarCurr,bitIndex15)<->v227(VarCurr,bitIndex1))& (v103(VarCurr,bitIndex14)<->v227(VarCurr,bitIndex0))).
% 121.85/120.84  all VarCurr (v347(VarCurr)<-> (v103(VarCurr,bitIndex60)<->v227(VarCurr,bitIndex46))& (v103(VarCurr,bitIndex59)<->v227(VarCurr,bitIndex45))& (v103(VarCurr,bitIndex58)<->v227(VarCurr,bitIndex44))& (v103(VarCurr,bitIndex57)<->v227(VarCurr,bitIndex43))& (v103(VarCurr,bitIndex56)<->v227(VarCurr,bitIndex42))& (v103(VarCurr,bitIndex55)<->v227(VarCurr,bitIndex41))& (v103(VarCurr,bitIndex54)<->v227(VarCurr,bitIndex40))& (v103(VarCurr,bitIndex53)<->v227(VarCurr,bitIndex39))& (v103(VarCurr,bitIndex52)<->v227(VarCurr,bitIndex38))& (v103(VarCurr,bitIndex51)<->v227(VarCurr,bitIndex37))& (v103(VarCurr,bitIndex50)<->v227(VarCurr,bitIndex36))& (v103(VarCurr,bitIndex49)<->v227(VarCurr,bitIndex35))& (v103(VarCurr,bitIndex48)<->v227(VarCurr,bitIndex34))& (v103(VarCurr,bitIndex47)<->v227(VarCurr,bitIndex33))& (v103(VarCurr,bitIndex46)<->v227(VarCurr,bitIndex32))& (v103(VarCurr,bitIndex45)<->v227(VarCurr,bitIndex31))& (v103(VarCurr,bitIndex44)<->v227(VarCurr,bitIndex30))& (v103(VarCurr,bitIndex43)<->v227(VarCurr,bitIndex29))& (v103(VarCurr,bitIndex42)<->v227(VarCurr,bitIndex28))& (v103(VarCurr,bitIndex41)<->v227(VarCurr,bitIndex27))& (v103(VarCurr,bitIndex40)<->v227(VarCurr,bitIndex26))& (v103(VarCurr,bitIndex39)<->v227(VarCurr,bitIndex25))& (v103(VarCurr,bitIndex38)<->v227(VarCurr,bitIndex24))& (v103(VarCurr,bitIndex37)<->v227(VarCurr,bitIndex23))& (v103(VarCurr,bitIndex36)<->v227(VarCurr,bitIndex22))& (v103(VarCurr,bitIndex35)<->v227(VarCurr,bitIndex21))& (v103(VarCurr,bitIndex34)<->v227(VarCurr,bitIndex20))& (v103(VarCurr,bitIndex33)<->v227(VarCurr,bitIndex19))& (v103(VarCurr,bitIndex32)<->v227(VarCurr,bitIndex18))& (v103(VarCurr,bitIndex31)<->v227(VarCurr,bitIndex17))& (v103(VarCurr,bitIndex30)<->v227(VarCurr,bitIndex16))& (v103(VarCurr,bitIndex29)<->v227(VarCurr,bitIndex15))& (v103(VarCurr,bitIndex28)<->v227(VarCurr,bitIndex14))& (v103(VarCurr,bitIndex27)<->v227(VarCurr,bitIndex13))& (v103(VarCurr,bitIndex26)<->v227(VarCurr,bitIndex12))& 
(v103(VarCurr,bitIndex25)<->v227(VarCurr,bitIndex11))& (v103(VarCurr,bitIndex24)<->v227(VarCurr,bitIndex10))& (v103(VarCurr,bitIndex23)<->v227(VarCurr,bitIndex9))& (v103(VarCurr,bitIndex22)<->v227(VarCurr,bitIndex8))& (v103(VarCurr,bitIndex21)<->v227(VarCurr,bitIndex7))& (v103(VarCurr,bitIndex20)<->v227(VarCurr,bitIndex6))& (v103(VarCurr,bitIndex19)<->v227(VarCurr,bitIndex5))& (v103(VarCurr,bitIndex18)<->v227(VarCurr,bitIndex4))& (v103(VarCurr,bitIndex17)<->v227(VarCurr,bitIndex3))& (v103(VarCurr,bitIndex16)<->v227(VarCurr,bitIndex2))& (v103(VarCurr,bitIndex15)<->v227(VarCurr,bitIndex1))& (v103(VarCurr,bitIndex14)<->v227(VarCurr,bitIndex0))).
% 121.85/120.84  all VarCurr (v345(VarCurr)<->v175(VarCurr)&v346(VarCurr)).
% 121.85/120.84  all VarCurr (-v346(VarCurr)<->v199(VarCurr)).
% 121.85/120.84  all VarCurr (v227(VarCurr,bitIndex47)<->v229(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v229(VarCurr,bitIndex47)<->v231(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v231(VarCurr,bitIndex47)<->v233(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v233(VarCurr,bitIndex47)<->v235(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v235(VarCurr,bitIndex47)<->v237(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v237(VarCurr,bitIndex47)<->v239(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v239(VarCurr,bitIndex47)<->v241(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v241(VarCurr,bitIndex47)<->v243(VarCurr,bitIndex47)).
% 121.85/120.84  all VarCurr (v243(VarCurr,bitIndex47)<->v245(VarCurr,bitIndex63)).
% 121.85/120.84  all VarCurr (v245(VarCurr,bitIndex63)<->v247(VarCurr,bitIndex63)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex63)<->v343(VarCurr)).
% 121.85/120.84  all VarCurr (v103(VarCurr,bitIndex61)<->v105(VarCurr,bitIndex61)).
% 121.85/120.84  all VarCurr (v105(VarCurr,bitIndex61)<->v107(VarCurr,bitIndex61)).
% 121.85/120.84  all VarCurr (v107(VarCurr,bitIndex61)<->v109(VarCurr,bitIndex61)).
% 121.85/120.84  all VarCurr (v109(VarCurr,bitIndex61)<->v111(VarCurr,bitIndex641)).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v227(VarCurr,B)<->v229(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v229(VarCurr,B)<->v231(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v231(VarCurr,B)<->v233(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v233(VarCurr,B)<->v235(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v235(VarCurr,B)<->v237(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v237(VarCurr,B)<->v239(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v239(VarCurr,B)<->v241(VarCurr,B))).
% 121.85/120.84  all VarCurr B (range_46_0(B)-> (v241(VarCurr,B)<->v243(VarCurr,B))).
% 121.85/120.84  all B (range_46_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B).
% 121.85/120.84  all VarCurr ((v243(VarCurr,bitIndex46)<->v245(VarCurr,bitIndex62))& (v243(VarCurr,bitIndex45)<->v245(VarCurr,bitIndex61))& (v243(VarCurr,bitIndex44)<->v245(VarCurr,bitIndex60))& (v243(VarCurr,bitIndex43)<->v245(VarCurr,bitIndex59))& (v243(VarCurr,bitIndex42)<->v245(VarCurr,bitIndex58))& (v243(VarCurr,bitIndex41)<->v245(VarCurr,bitIndex57))& (v243(VarCurr,bitIndex40)<->v245(VarCurr,bitIndex56))& (v243(VarCurr,bitIndex39)<->v245(VarCurr,bitIndex55))& (v243(VarCurr,bitIndex38)<->v245(VarCurr,bitIndex54))& (v243(VarCurr,bitIndex37)<->v245(VarCurr,bitIndex53))& (v243(VarCurr,bitIndex36)<->v245(VarCurr,bitIndex52))& (v243(VarCurr,bitIndex35)<->v245(VarCurr,bitIndex51))& (v243(VarCurr,bitIndex34)<->v245(VarCurr,bitIndex50))& (v243(VarCurr,bitIndex33)<->v245(VarCurr,bitIndex49))& (v243(VarCurr,bitIndex32)<->v245(VarCurr,bitIndex48))& (v243(VarCurr,bitIndex31)<->v245(VarCurr,bitIndex47))& (v243(VarCurr,bitIndex30)<->v245(VarCurr,bitIndex46))& (v243(VarCurr,bitIndex29)<->v245(VarCurr,bitIndex45))& (v243(VarCurr,bitIndex28)<->v245(VarCurr,bitIndex44))& (v243(VarCurr,bitIndex27)<->v245(VarCurr,bitIndex43))& (v243(VarCurr,bitIndex26)<->v245(VarCurr,bitIndex42))& (v243(VarCurr,bitIndex25)<->v245(VarCurr,bitIndex41))& (v243(VarCurr,bitIndex24)<->v245(VarCurr,bitIndex40))& (v243(VarCurr,bitIndex23)<->v245(VarCurr,bitIndex39))& (v243(VarCurr,bitIndex22)<->v245(VarCurr,bitIndex38))& (v243(VarCurr,bitIndex21)<->v245(VarCurr,bitIndex37))& (v243(VarCurr,bitIndex20)<->v245(VarCurr,bitIndex36))& (v243(VarCurr,bitIndex19)<->v245(VarCurr,bitIndex35))& (v243(VarCurr,bitIndex18)<->v245(VarCurr,bitIndex34))& (v243(VarCurr,bitIndex17)<->v245(VarCurr,bitIndex33))& (v243(VarCurr,bitIndex16)<->v245(VarCurr,bitIndex32))& (v243(VarCurr,bitIndex15)<->v245(VarCurr,bitIndex31))& (v243(VarCurr,bitIndex14)<->v245(VarCurr,bitIndex30))& (v243(VarCurr,bitIndex13)<->v245(VarCurr,bitIndex29))& (v243(VarCurr,bitIndex12)<->v245(VarCurr,bitIndex28))& 
(v243(VarCurr,bitIndex11)<->v245(VarCurr,bitIndex27))& (v243(VarCurr,bitIndex10)<->v245(VarCurr,bitIndex26))& (v243(VarCurr,bitIndex9)<->v245(VarCurr,bitIndex25))& (v243(VarCurr,bitIndex8)<->v245(VarCurr,bitIndex24))& (v243(VarCurr,bitIndex7)<->v245(VarCurr,bitIndex23))& (v243(VarCurr,bitIndex6)<->v245(VarCurr,bitIndex22))& (v243(VarCurr,bitIndex5)<->v245(VarCurr,bitIndex21))& (v243(VarCurr,bitIndex4)<->v245(VarCurr,bitIndex20))& (v243(VarCurr,bitIndex3)<->v245(VarCurr,bitIndex19))& (v243(VarCurr,bitIndex2)<->v245(VarCurr,bitIndex18))& (v243(VarCurr,bitIndex1)<->v245(VarCurr,bitIndex17))& (v243(VarCurr,bitIndex0)<->v245(VarCurr,bitIndex16))).
% 121.85/120.84  all VarCurr B (range_62_16(B)-> (v245(VarCurr,B)<->v247(VarCurr,B))).
% 121.85/120.84  all B (range_62_16(B)<->bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex62)<->v341(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex61)<->v339(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex60)<->v337(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex59)<->v335(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex58)<->v333(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex57)<->v331(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex56)<->v329(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex55)<->v327(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex54)<->v325(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex53)<->v323(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex52)<->v321(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex51)<->v319(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex50)<->v317(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex49)<->v315(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex48)<->v313(VarCurr)).
% 121.85/120.84  all VarCurr (v247(VarCurr,bitIndex47)<->v311(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex46)<->v309(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex45)<->v307(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex44)<->v305(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex43)<->v303(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex42)<->v301(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex41)<->v299(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex40)<->v297(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex39)<->v295(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex38)<->v293(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex37)<->v291(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex36)<->v289(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex35)<->v287(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex34)<->v285(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex33)<->v283(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex32)<->v281(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex31)<->v279(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex30)<->v277(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex29)<->v275(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex28)<->v273(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex27)<->v271(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex26)<->v269(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex25)<->v267(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex24)<->v265(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex23)<->v263(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex22)<->v261(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex21)<->v259(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex20)<->v257(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex19)<->v255(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex18)<->v253(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex17)<->v251(VarCurr)).
% 121.85/120.85  all VarCurr (v247(VarCurr,bitIndex16)<->v249(VarCurr)).
% 121.85/120.85  all VarCurr B (range_60_30(B)-> (v103(VarCurr,B)<->v105(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_60_30(B)-> (v105(VarCurr,B)<->v107(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_60_30(B)-> (v107(VarCurr,B)<->v109(VarCurr,B))).
% 121.85/120.85  all B (range_60_30(B)<->bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B).
% 121.85/120.85  all VarCurr ((v109(VarCurr,bitIndex60)<->v111(VarCurr,bitIndex640))& (v109(VarCurr,bitIndex59)<->v111(VarCurr,bitIndex639))& (v109(VarCurr,bitIndex58)<->v111(VarCurr,bitIndex638))& (v109(VarCurr,bitIndex57)<->v111(VarCurr,bitIndex637))& (v109(VarCurr,bitIndex56)<->v111(VarCurr,bitIndex636))& (v109(VarCurr,bitIndex55)<->v111(VarCurr,bitIndex635))& (v109(VarCurr,bitIndex54)<->v111(VarCurr,bitIndex634))& (v109(VarCurr,bitIndex53)<->v111(VarCurr,bitIndex633))& (v109(VarCurr,bitIndex52)<->v111(VarCurr,bitIndex632))& (v109(VarCurr,bitIndex51)<->v111(VarCurr,bitIndex631))& (v109(VarCurr,bitIndex50)<->v111(VarCurr,bitIndex630))& (v109(VarCurr,bitIndex49)<->v111(VarCurr,bitIndex629))& (v109(VarCurr,bitIndex48)<->v111(VarCurr,bitIndex628))& (v109(VarCurr,bitIndex47)<->v111(VarCurr,bitIndex627))& (v109(VarCurr,bitIndex46)<->v111(VarCurr,bitIndex626))& (v109(VarCurr,bitIndex45)<->v111(VarCurr,bitIndex625))& (v109(VarCurr,bitIndex44)<->v111(VarCurr,bitIndex624))& (v109(VarCurr,bitIndex43)<->v111(VarCurr,bitIndex623))& (v109(VarCurr,bitIndex42)<->v111(VarCurr,bitIndex622))& (v109(VarCurr,bitIndex41)<->v111(VarCurr,bitIndex621))& (v109(VarCurr,bitIndex40)<->v111(VarCurr,bitIndex620))& (v109(VarCurr,bitIndex39)<->v111(VarCurr,bitIndex619))& (v109(VarCurr,bitIndex38)<->v111(VarCurr,bitIndex618))& (v109(VarCurr,bitIndex37)<->v111(VarCurr,bitIndex617))& (v109(VarCurr,bitIndex36)<->v111(VarCurr,bitIndex616))& (v109(VarCurr,bitIndex35)<->v111(VarCurr,bitIndex615))& (v109(VarCurr,bitIndex34)<->v111(VarCurr,bitIndex614))& (v109(VarCurr,bitIndex33)<->v111(VarCurr,bitIndex613))& (v109(VarCurr,bitIndex32)<->v111(VarCurr,bitIndex612))& (v109(VarCurr,bitIndex31)<->v111(VarCurr,bitIndex611))& (v109(VarCurr,bitIndex30)<->v111(VarCurr,bitIndex610))).
% 121.85/120.85  all VarCurr (v199(VarCurr)<->v201(VarCurr)).
% 121.85/120.85  all VarCurr (v201(VarCurr)<->v203(VarCurr)).
% 121.85/120.85  all VarCurr (v203(VarCurr)<->v205(VarCurr)).
% 121.85/120.85  all VarCurr (v205(VarCurr)<->v207(VarCurr)).
% 121.85/120.85  all VarCurr (v207(VarCurr)<->v209(VarCurr)).
% 121.85/120.85  all VarCurr (v209(VarCurr)<->v211(VarCurr)).
% 121.85/120.85  all VarCurr (v211(VarCurr)<->v213(VarCurr,bitIndex44)).
% 121.85/120.85  all VarCurr (v213(VarCurr,bitIndex44)<->v215(VarCurr,bitIndex44)).
% 121.85/120.85  all VarCurr (v215(VarCurr,bitIndex44)<->v217(VarCurr,bitIndex44)).
% 121.85/120.85  all VarCurr (v217(VarCurr,bitIndex44)<->v219(VarCurr,bitIndex44)).
% 121.85/120.85  all VarCurr (v219(VarCurr,bitIndex44)<->v221(VarCurr,bitIndex63)).
% 121.85/120.85  all VarCurr (v221(VarCurr,bitIndex63)<->v223(VarCurr,bitIndex63)).
% 121.85/120.85  all VarCurr (v223(VarCurr,bitIndex63)<->v225(VarCurr)).
% 121.85/120.85  all VarCurr (v175(VarCurr)<->v177(VarCurr)).
% 121.85/120.85  all VarCurr (v177(VarCurr)<->v179(VarCurr)).
% 121.85/120.85  all VarCurr (v179(VarCurr)<->v181(VarCurr)).
% 121.85/120.85  all VarCurr (v181(VarCurr)<->v183(VarCurr)).
% 121.85/120.85  all VarCurr (v183(VarCurr)<->v185(VarCurr)).
% 121.85/120.85  all VarCurr (v185(VarCurr)<->v187(VarCurr)).
% 121.85/120.85  all VarCurr (v187(VarCurr)<->v189(VarCurr)).
% 121.85/120.85  all VarCurr (v189(VarCurr)<->v191(VarCurr)).
% 121.85/120.85  all VarCurr (v191(VarCurr)<->v193(VarCurr,bitIndex2)).
% 121.85/120.85  all VarCurr (v193(VarCurr,bitIndex2)<->v195(VarCurr,bitIndex2)).
% 121.85/120.85  all VarCurr (v195(VarCurr,bitIndex2)<->v197(VarCurr)).
% 121.85/120.85  all VarCurr ((v170(VarCurr,bitIndex6)<->v105(VarCurr,bitIndex115))& (v170(VarCurr,bitIndex5)<->v105(VarCurr,bitIndex114))& (v170(VarCurr,bitIndex4)<->v105(VarCurr,bitIndex113))& (v170(VarCurr,bitIndex3)<->v105(VarCurr,bitIndex112))& (v170(VarCurr,bitIndex2)<->v105(VarCurr,bitIndex111))& (v170(VarCurr,bitIndex1)<->v105(VarCurr,bitIndex110))& (v170(VarCurr,bitIndex0)<->v105(VarCurr,bitIndex109))).
% 121.85/120.85  all VarCurr B (range_115_109(B)-> (v105(VarCurr,B)<->v107(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_115_109(B)-> (v107(VarCurr,B)<->v109(VarCurr,B))).
% 121.85/120.85  all B (range_115_109(B)<->bitIndex109=B|bitIndex110=B|bitIndex111=B|bitIndex112=B|bitIndex113=B|bitIndex114=B|bitIndex115=B).
% 121.85/120.85  all VarCurr ((v109(VarCurr,bitIndex115)<->v111(VarCurr,bitIndex695))& (v109(VarCurr,bitIndex114)<->v111(VarCurr,bitIndex694))& (v109(VarCurr,bitIndex113)<->v111(VarCurr,bitIndex693))& (v109(VarCurr,bitIndex112)<->v111(VarCurr,bitIndex692))& (v109(VarCurr,bitIndex111)<->v111(VarCurr,bitIndex691))& (v109(VarCurr,bitIndex110)<->v111(VarCurr,bitIndex690))& (v109(VarCurr,bitIndex109)<->v111(VarCurr,bitIndex689))).
% 121.85/120.85  all VarCurr (v101(VarCurr)<-> (v103(VarCurr,bitIndex29)<->v115(VarCurr,bitIndex15))& (v103(VarCurr,bitIndex28)<->v115(VarCurr,bitIndex14))& (v103(VarCurr,bitIndex27)<->v115(VarCurr,bitIndex13))& (v103(VarCurr,bitIndex26)<->v115(VarCurr,bitIndex12))& (v103(VarCurr,bitIndex25)<->v115(VarCurr,bitIndex11))& (v103(VarCurr,bitIndex24)<->v115(VarCurr,bitIndex10))& (v103(VarCurr,bitIndex23)<->v115(VarCurr,bitIndex9))& (v103(VarCurr,bitIndex22)<->v115(VarCurr,bitIndex8))& (v103(VarCurr,bitIndex21)<->v115(VarCurr,bitIndex7))& (v103(VarCurr,bitIndex20)<->v115(VarCurr,bitIndex6))& (v103(VarCurr,bitIndex19)<->v115(VarCurr,bitIndex5))& (v103(VarCurr,bitIndex18)<->v115(VarCurr,bitIndex4))& (v103(VarCurr,bitIndex17)<->v115(VarCurr,bitIndex3))& (v103(VarCurr,bitIndex16)<->v115(VarCurr,bitIndex2))& (v103(VarCurr,bitIndex15)<->v115(VarCurr,bitIndex1))& (v103(VarCurr,bitIndex14)<->v115(VarCurr,bitIndex0))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v115(VarCurr,B)<->v117(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v117(VarCurr,B)<->v119(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v119(VarCurr,B)<->v121(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v121(VarCurr,B)<->v123(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v123(VarCurr,B)<->v125(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v125(VarCurr,B)<->v127(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v127(VarCurr,B)<->v129(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_15_0(B)-> (v129(VarCurr,B)<->v131(VarCurr,B))).
% 121.85/120.85  all B (range_15_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B).
% 121.85/120.85  all VarCurr ((v131(VarCurr,bitIndex15)<->v133(VarCurr,bitIndex31))& (v131(VarCurr,bitIndex14)<->v133(VarCurr,bitIndex30))& (v131(VarCurr,bitIndex13)<->v133(VarCurr,bitIndex29))& (v131(VarCurr,bitIndex12)<->v133(VarCurr,bitIndex28))& (v131(VarCurr,bitIndex11)<->v133(VarCurr,bitIndex27))& (v131(VarCurr,bitIndex10)<->v133(VarCurr,bitIndex26))& (v131(VarCurr,bitIndex9)<->v133(VarCurr,bitIndex25))& (v131(VarCurr,bitIndex8)<->v133(VarCurr,bitIndex24))& (v131(VarCurr,bitIndex7)<->v133(VarCurr,bitIndex23))& (v131(VarCurr,bitIndex6)<->v133(VarCurr,bitIndex22))& (v131(VarCurr,bitIndex5)<->v133(VarCurr,bitIndex21))& (v131(VarCurr,bitIndex4)<->v133(VarCurr,bitIndex20))& (v131(VarCurr,bitIndex3)<->v133(VarCurr,bitIndex19))& (v131(VarCurr,bitIndex2)<->v133(VarCurr,bitIndex18))& (v131(VarCurr,bitIndex1)<->v133(VarCurr,bitIndex17))& (v131(VarCurr,bitIndex0)<->v133(VarCurr,bitIndex16))).
% 121.85/120.85  all VarCurr B (range_31_16(B)-> (v133(VarCurr,B)<->v135(VarCurr,B))).
% 121.85/120.85  all B (range_31_16(B)<->bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex31)<->v167(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex30)<->v165(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex29)<->v163(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex28)<->v161(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex27)<->v159(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex26)<->v157(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex25)<->v155(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex24)<->v153(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex23)<->v151(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex22)<->v149(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex21)<->v147(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex20)<->v145(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex19)<->v143(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex18)<->v141(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex17)<->v139(VarCurr)).
% 121.85/120.85  all VarCurr (v135(VarCurr,bitIndex16)<->v137(VarCurr)).
% 121.85/120.85  all VarCurr B (range_29_14(B)-> (v103(VarCurr,B)<->v105(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_29_14(B)-> (v105(VarCurr,B)<->v107(VarCurr,B))).
% 121.85/120.85  all VarCurr B (range_29_14(B)-> (v107(VarCurr,B)<->v109(VarCurr,B))).
% 121.85/120.85  all B (range_29_14(B)<->bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B).
% 121.85/120.85  all VarCurr ((v109(VarCurr,bitIndex29)<->v111(VarCurr,bitIndex609))& (v109(VarCurr,bitIndex28)<->v111(VarCurr,bitIndex608))& (v109(VarCurr,bitIndex27)<->v111(VarCurr,bitIndex607))& (v109(VarCurr,bitIndex26)<->v111(VarCurr,bitIndex606))& (v109(VarCurr,bitIndex25)<->v111(VarCurr,bitIndex605))& (v109(VarCurr,bitIndex24)<->v111(VarCurr,bitIndex604))& (v109(VarCurr,bitIndex23)<->v111(VarCurr,bitIndex603))& (v109(VarCurr,bitIndex22)<->v111(VarCurr,bitIndex602))& (v109(VarCurr,bitIndex21)<->v111(VarCurr,bitIndex601))& (v109(VarCurr,bitIndex20)<->v111(VarCurr,bitIndex600))& (v109(VarCurr,bitIndex19)<->v111(VarCurr,bitIndex599))& (v109(VarCurr,bitIndex18)<->v111(VarCurr,bitIndex598))& (v109(VarCurr,bitIndex17)<->v111(VarCurr,bitIndex597))& (v109(VarCurr,bitIndex16)<->v111(VarCurr,bitIndex596))& (v109(VarCurr,bitIndex15)<->v111(VarCurr,bitIndex595))& (v109(VarCurr,bitIndex14)<->v111(VarCurr,bitIndex594))).
% 121.85/120.85  -v111(constB0,bitIndex695).
% 121.85/120.85  -v111(constB0,bitIndex694).
% 121.85/120.85  -v111(constB0,bitIndex693).
% 121.85/120.85  -v111(constB0,bitIndex692).
% 121.85/120.85  -v111(constB0,bitIndex691).
% 121.85/120.85  -v111(constB0,bitIndex690).
% 121.85/120.85  -v111(constB0,bitIndex689).
% 121.85/120.85  -v111(constB0,bitIndex654).
% 121.85/120.85  -v111(constB0,bitIndex641).
% 121.85/120.85  -v111(constB0,bitIndex640).
% 121.85/120.85  -v111(constB0,bitIndex639).
% 121.85/120.85  -v111(constB0,bitIndex638).
% 121.85/120.85  -v111(constB0,bitIndex637).
% 121.85/120.85  -v111(constB0,bitIndex636).
% 121.85/120.85  -v111(constB0,bitIndex635).
% 121.85/120.85  -v111(constB0,bitIndex634).
% 121.85/120.85  -v111(constB0,bitIndex633).
% 121.85/120.85  -v111(constB0,bitIndex632).
% 121.85/120.85  -v111(constB0,bitIndex631).
% 121.85/120.85  -v111(constB0,bitIndex630).
% 121.85/120.85  -v111(constB0,bitIndex629).
% 121.85/120.85  -v111(constB0,bitIndex628).
% 121.85/120.85  -v111(constB0,bitIndex627).
% 121.85/120.85  -v111(constB0,bitIndex626).
% 121.85/120.85  -v111(constB0,bitIndex625).
% 121.85/120.85  -v111(constB0,bitIndex624).
% 121.85/120.85  -v111(constB0,bitIndex623).
% 121.85/120.85  -v111(constB0,bitIndex622).
% 121.85/120.85  -v111(constB0,bitIndex621).
% 121.85/120.85  -v111(constB0,bitIndex620).
% 121.85/120.85  -v111(constB0,bitIndex619).
% 121.85/120.85  -v111(constB0,bitIndex618).
% 121.85/120.85  -v111(constB0,bitIndex617).
% 121.85/120.85  -v111(constB0,bitIndex616).
% 121.85/120.85  -v111(constB0,bitIndex615).
% 121.85/120.85  -v111(constB0,bitIndex614).
% 121.85/120.85  -v111(constB0,bitIndex613).
% 121.85/120.85  -v111(constB0,bitIndex612).
% 121.85/120.85  -v111(constB0,bitIndex611).
% 121.85/120.85  -v111(constB0,bitIndex610).
% 121.85/120.85  -v111(constB0,bitIndex609).
% 121.85/120.85  -v111(constB0,bitIndex608).
% 121.85/120.85  -v111(constB0,bitIndex607).
% 121.85/120.85  -v111(constB0,bitIndex606).
% 121.85/120.85  -v111(constB0,bitIndex605).
% 121.85/120.85  -v111(constB0,bitIndex604).
% 121.85/120.85  -v111(constB0,bitIndex603).
% 121.85/120.85  -v111(constB0,bitIndex602).
% 121.85/120.85  -v111(constB0,bitIndex601).
% 121.85/120.85  -v111(constB0,bitIndex600).
% 121.85/120.85  -v111(constB0,bitIndex599).
% 121.85/120.85  -v111(constB0,bitIndex598).
% 121.85/120.85  -v111(constB0,bitIndex597).
% 121.85/120.85  -v111(constB0,bitIndex596).
% 121.85/120.85  -v111(constB0,bitIndex595).
% 121.85/120.85  -v111(constB0,bitIndex594).
% 121.85/120.85  -v111(constB0,bitIndex590).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex115).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex114).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex113).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex112).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex111).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex110).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex109).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex74).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex61).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex60).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex59).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex58).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex57).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex56).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex55).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex54).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex53).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex52).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex51).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex50).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex49).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex48).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex47).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex46).
% 121.85/120.85  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex45).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex44).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex43).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex42).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex41).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex40).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex39).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex38).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex37).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex36).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex35).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex34).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex33).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex32).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex31).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex30).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex29).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex28).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex27).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex26).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex25).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex24).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex23).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex22).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex21).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex20).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex19).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex18).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex17).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex16).
% 131.45/130.45  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex15).
% 131.45/130.46  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex14).
% 131.45/130.46  -b0000000xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx0xxxxxxxxxxxx000000000000000000000000000000000000000000000000xxx0xxxxxxxxxx(bitIndex10).
% 131.45/130.46  all VarCurr (v74(VarCurr)<->v76(VarCurr)).
% 131.45/130.46  all VarCurr (v76(VarCurr)<->v78(VarCurr)).
% 131.45/130.46  all VarCurr (v78(VarCurr)<->v80(VarCurr)).
% 131.45/130.46  all VarCurr (v80(VarCurr)<->v82(VarCurr)).
% 131.45/130.46  all VarCurr (-v82(VarCurr)<->v84(VarCurr,bitIndex0)).
% 131.45/130.46  all VarCurr (v84(VarCurr,bitIndex0)<->v86(VarCurr,bitIndex0)).
% 131.45/130.46  all VarCurr (v86(VarCurr,bitIndex0)<->v88(VarCurr,bitIndex0)).
% 131.45/130.46  -v88(constB0,bitIndex0).
% 131.45/130.46  -bxxxxx0(bitIndex0).
% 131.45/130.46  all VarCurr (v65(VarCurr)<->v67(VarCurr)).
% 131.45/130.46  all VarCurr (v67(VarCurr)<->v69(VarCurr)).
% 131.45/130.46  all VarCurr (v69(VarCurr)<->v14(VarCurr)).
% 131.45/130.46  all VarCurr (v43(VarCurr)<->v45(VarCurr)).
% 131.45/130.46  all VarCurr (v45(VarCurr)<->v10(VarCurr)).
% 131.45/130.46  all VarCurr (v27(VarCurr)<->v8(VarCurr)).
% 131.45/130.46  all VarCurr (v8(VarCurr)<->v10(VarCurr)).
% 131.45/130.46  all VarCurr (v10(VarCurr)<->v12(VarCurr)).
% 131.45/130.46  all VarCurr (v12(VarCurr)<->v14(VarCurr)).
% 131.45/130.46  all VarCurr (v14(VarCurr)<->v16(VarCurr)).
% 131.45/130.46  end_of_list.
% 131.45/130.46  
% 131.45/130.46  Search stopped in tp_alloc by max_mem option.
% 131.45/130.46  
% 131.45/130.46  ============ end of search ============
% 131.45/130.46  
% 131.45/130.46  -------------- statistics -------------
% 131.45/130.46  clauses given                  0
% 131.45/130.46  clauses generated              0
% 131.45/130.46  clauses kept                   0
% 131.45/130.46  clauses forward subsumed       0
% 131.45/130.46  clauses back subsumed          0
% 131.45/130.46  Kbytes malloced            11718
% 131.45/130.46  
% 131.45/130.46  ----------- times (seconds) -----------
% 131.45/130.46  user CPU time         14.09          (0 hr, 0 min, 14 sec)
% 131.45/130.46  system CPU time        0.01          (0 hr, 0 min, 0 sec)
% 131.45/130.46  wall-clock time      130             (0 hr, 2 min, 10 sec)
% 131.45/130.46  
% 131.45/130.46  Process 27546 finished Wed Jul 27 06:44:42 2022
% 131.45/130.46  Otter interrupted
% 131.45/130.46  PROOF NOT FOUND
%------------------------------------------------------------------------------