TSTP Solution File: HWV089+1 by Otter---3.3

%------------------------------------------------------------------------------
% File     : Otter---3.3
% Problem  : HWV089+1 : TPTP v8.1.0. Released v6.1.0.
% Transfm  : none
% Format   : tptp:raw
% Command  : otter-tptp-script %s

% Computer : n021.cluster.edu
% Model    : x86_64 x86_64
% CPU      : Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10GHz
% Memory   : 8042.1875MB
% OS       : Linux 3.10.0-693.el7.x86_64
% CPULimit : 300s
% WCLimit  : 300s
% DateTime : Wed Jul 27 12:58:34 EDT 2022

% Result   : Unknown 95.04s 94.41s
% Output   : None
% Verified : 
% SZS Type : -

% Comments : 
%------------------------------------------------------------------------------
%----No solution output by system
%------------------------------------------------------------------------------
%----ORIGINAL SYSTEM OUTPUT
% 0.07/0.30  % Problem  : HWV089+1 : TPTP v8.1.0. Released v6.1.0.
% 0.07/0.31  % Command  : otter-tptp-script %s
% 0.16/0.52  % Computer : n021.cluster.edu
% 0.16/0.52  % Model    : x86_64 x86_64
% 0.16/0.52  % CPU      : Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10GHz
% 0.16/0.52  % Memory   : 8042.1875MB
% 0.16/0.52  % OS       : Linux 3.10.0-693.el7.x86_64
% 0.16/0.52  % CPULimit : 300
% 0.16/0.52  % WCLimit  : 300
% 0.16/0.52  % DateTime : Wed Jul 27 06:34:08 EDT 2022
% 0.16/0.52  % CPUTime  : 
% 93.85/93.24  ----- Otter 3.3f, August 2004 -----
% 93.85/93.24  The process was started by sandbox on n021.cluster.edu,
% 93.85/93.24  Wed Jul 27 06:34:08 2022
% 93.85/93.24  The command was "./otter".  The process ID is 8614.
% 93.85/93.24  
% 93.85/93.24  set(prolog_style_variables).
% 93.85/93.24  set(auto).
% 93.85/93.24     dependent: set(auto1).
% 93.85/93.24     dependent: set(process_input).
% 93.85/93.24     dependent: clear(print_kept).
% 93.85/93.24     dependent: clear(print_new_demod).
% 93.85/93.24     dependent: clear(print_back_demod).
% 93.85/93.24     dependent: clear(print_back_sub).
% 93.85/93.24     dependent: set(control_memory).
% 93.85/93.24     dependent: assign(max_mem, 12000).
% 93.85/93.24     dependent: assign(pick_given_ratio, 4).
% 93.85/93.24     dependent: assign(stats_level, 1).
% 93.85/93.24     dependent: assign(max_seconds, 10800).
% 93.85/93.24  clear(print_given).
% 93.85/93.24  
% 93.85/93.24  formula_list(usable).
% 93.85/93.24  all A (A=A).
% 93.85/93.24  nextState(constB8,constB9).
% 93.85/93.24  nextState(constB7,constB8).
% 93.85/93.24  nextState(constB6,constB7).
% 93.85/93.24  nextState(constB5,constB6).
% 93.85/93.24  nextState(constB4,constB5).
% 93.85/93.24  nextState(constB3,constB4).
% 93.85/93.24  nextState(constB2,constB3).
% 93.85/93.24  nextState(constB1,constB2).
% 93.85/93.24  nextState(constB0,constB1).
% 93.85/93.24  all VarNext VarCurr (nextState(VarCurr,VarNext)->reachableState(VarCurr)&reachableState(VarNext)).
% 93.85/93.24  all VarState (reachableState(VarState)->constB0=VarState|constB1=VarState|constB2=VarState|constB3=VarState|constB4=VarState|constB5=VarState|constB6=VarState|constB7=VarState|constB8=VarState|constB9=VarState|constB10=VarState|constB11=VarState|constB12=VarState|constB13=VarState|constB14=VarState|constB15=VarState|constB16=VarState|constB17=VarState|constB18=VarState|constB19=VarState|constB20=VarState).
% 93.85/93.24  reachableState(constB20).
% 93.85/93.24  reachableState(constB19).
% 93.85/93.24  reachableState(constB18).
% 93.85/93.24  reachableState(constB17).
% 93.85/93.24  reachableState(constB16).
% 93.85/93.24  reachableState(constB15).
% 93.85/93.24  reachableState(constB14).
% 93.85/93.24  reachableState(constB13).
% 93.85/93.24  reachableState(constB12).
% 93.85/93.24  reachableState(constB11).
% 93.85/93.24  reachableState(constB10).
% 93.85/93.24  reachableState(constB9).
% 93.85/93.24  reachableState(constB8).
% 93.85/93.24  reachableState(constB7).
% 93.85/93.24  reachableState(constB6).
% 93.85/93.24  reachableState(constB5).
% 93.85/93.24  reachableState(constB4).
% 93.85/93.24  reachableState(constB3).
% 93.85/93.24  reachableState(constB2).
% 93.85/93.24  reachableState(constB1).
% 93.85/93.24  reachableState(constB0).
% 93.85/93.24  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1(VarCurr)<-> -v1(VarNext))).
% 93.85/93.24  -v1(constB0).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_20,B)<->v1019(constB20,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_20).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB20,v1019_range_3_to_0_address_term_bound_20).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_19,B)<->v1019(constB19,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_19).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB19,v1019_range_3_to_0_address_term_bound_19).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_18,B)<->v1019(constB18,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_18).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB18,v1019_range_3_to_0_address_term_bound_18).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_17,B)<->v1019(constB17,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_17).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB17,v1019_range_3_to_0_address_term_bound_17).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_16,B)<->v1019(constB16,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_16).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB16,v1019_range_3_to_0_address_term_bound_16).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_15,B)<->v1019(constB15,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_15).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB15,v1019_range_3_to_0_address_term_bound_15).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_14,B)<->v1019(constB14,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_14).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB14,v1019_range_3_to_0_address_term_bound_14).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_13,B)<->v1019(constB13,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_13).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB13,v1019_range_3_to_0_address_term_bound_13).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_12,B)<->v1019(constB12,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_12).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB12,v1019_range_3_to_0_address_term_bound_12).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_11,B)<->v1019(constB11,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_11).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB11,v1019_range_3_to_0_address_term_bound_11).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_10,B)<->v1019(constB10,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_10).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB10,v1019_range_3_to_0_address_term_bound_10).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_9,B)<->v1019(constB9,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_9).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB9,v1019_range_3_to_0_address_term_bound_9).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_8,B)<->v1019(constB8,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_8).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB8,v1019_range_3_to_0_address_term_bound_8).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_7,B)<->v1019(constB7,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_7).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB7,v1019_range_3_to_0_address_term_bound_7).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_6,B)<->v1019(constB6,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_6).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB6,v1019_range_3_to_0_address_term_bound_6).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_5,B)<->v1019(constB5,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_5).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB5,v1019_range_3_to_0_address_term_bound_5).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_4,B)<->v1019(constB4,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_4).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB4,v1019_range_3_to_0_address_term_bound_4).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_3,B)<->v1019(constB3,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_3).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB3,v1019_range_3_to_0_address_term_bound_3).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_2,B)<->v1019(constB2,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_2).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB2,v1019_range_3_to_0_address_term_bound_2).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_1,B)<->v1019(constB1,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_1).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB1,v1019_range_3_to_0_address_term_bound_1).
% 93.85/93.24  all B (addressVal(v1019_range_3_to_0_address_term_bound_0,B)<->v1019(constB0,B)).
% 93.85/93.24  address(v1019_range_3_to_0_address_term_bound_0).
% 93.85/93.24  v1019_range_3_to_0_address_association(constB0,v1019_range_3_to_0_address_term_bound_0).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_20,B)<->v953(constB20,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_20).
% 93.85/93.24  v953_range_3_to_0_address_association(constB20,v953_range_3_to_0_address_term_bound_20).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_19,B)<->v953(constB19,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_19).
% 93.85/93.24  v953_range_3_to_0_address_association(constB19,v953_range_3_to_0_address_term_bound_19).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_18,B)<->v953(constB18,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_18).
% 93.85/93.24  v953_range_3_to_0_address_association(constB18,v953_range_3_to_0_address_term_bound_18).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_17,B)<->v953(constB17,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_17).
% 93.85/93.24  v953_range_3_to_0_address_association(constB17,v953_range_3_to_0_address_term_bound_17).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_16,B)<->v953(constB16,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_16).
% 93.85/93.24  v953_range_3_to_0_address_association(constB16,v953_range_3_to_0_address_term_bound_16).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_15,B)<->v953(constB15,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_15).
% 93.85/93.24  v953_range_3_to_0_address_association(constB15,v953_range_3_to_0_address_term_bound_15).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_14,B)<->v953(constB14,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_14).
% 93.85/93.24  v953_range_3_to_0_address_association(constB14,v953_range_3_to_0_address_term_bound_14).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_13,B)<->v953(constB13,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_13).
% 93.85/93.24  v953_range_3_to_0_address_association(constB13,v953_range_3_to_0_address_term_bound_13).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_12,B)<->v953(constB12,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_12).
% 93.85/93.24  v953_range_3_to_0_address_association(constB12,v953_range_3_to_0_address_term_bound_12).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_11,B)<->v953(constB11,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_11).
% 93.85/93.24  v953_range_3_to_0_address_association(constB11,v953_range_3_to_0_address_term_bound_11).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_10,B)<->v953(constB10,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_10).
% 93.85/93.24  v953_range_3_to_0_address_association(constB10,v953_range_3_to_0_address_term_bound_10).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_9,B)<->v953(constB9,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_9).
% 93.85/93.24  v953_range_3_to_0_address_association(constB9,v953_range_3_to_0_address_term_bound_9).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_8,B)<->v953(constB8,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_8).
% 93.85/93.24  v953_range_3_to_0_address_association(constB8,v953_range_3_to_0_address_term_bound_8).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_7,B)<->v953(constB7,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_7).
% 93.85/93.24  v953_range_3_to_0_address_association(constB7,v953_range_3_to_0_address_term_bound_7).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_6,B)<->v953(constB6,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_6).
% 93.85/93.24  v953_range_3_to_0_address_association(constB6,v953_range_3_to_0_address_term_bound_6).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_5,B)<->v953(constB5,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_5).
% 93.85/93.24  v953_range_3_to_0_address_association(constB5,v953_range_3_to_0_address_term_bound_5).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_4,B)<->v953(constB4,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_4).
% 93.85/93.24  v953_range_3_to_0_address_association(constB4,v953_range_3_to_0_address_term_bound_4).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_3,B)<->v953(constB3,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_3).
% 93.85/93.24  v953_range_3_to_0_address_association(constB3,v953_range_3_to_0_address_term_bound_3).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_2,B)<->v953(constB2,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_2).
% 93.85/93.24  v953_range_3_to_0_address_association(constB2,v953_range_3_to_0_address_term_bound_2).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_1,B)<->v953(constB1,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_1).
% 93.85/93.24  v953_range_3_to_0_address_association(constB1,v953_range_3_to_0_address_term_bound_1).
% 93.85/93.24  all B (addressVal(v953_range_3_to_0_address_term_bound_0,B)<->v953(constB0,B)).
% 93.85/93.24  address(v953_range_3_to_0_address_term_bound_0).
% 93.85/93.24  v953_range_3_to_0_address_association(constB0,v953_range_3_to_0_address_term_bound_0).
% 93.85/93.24  all B (addressVal(v869_range_3_to_0_address_term_bound_20,B)<->v869(constB20,B)).
% 93.85/93.24  address(v869_range_3_to_0_address_term_bound_20).
% 93.85/93.24  v869_range_3_to_0_address_association(constB20,v869_range_3_to_0_address_term_bound_20).
% 93.85/93.24  all B (addressVal(v869_range_3_to_0_address_term_bound_19,B)<->v869(constB19,B)).
% 93.85/93.24  address(v869_range_3_to_0_address_term_bound_19).
% 93.85/93.24  v869_range_3_to_0_address_association(constB19,v869_range_3_to_0_address_term_bound_19).
% 93.85/93.24  all B (addressVal(v869_range_3_to_0_address_term_bound_18,B)<->v869(constB18,B)).
% 93.85/93.24  address(v869_range_3_to_0_address_term_bound_18).
% 93.85/93.24  v869_range_3_to_0_address_association(constB18,v869_range_3_to_0_address_term_bound_18).
% 93.85/93.24  all B (addressVal(v869_range_3_to_0_address_term_bound_17,B)<->v869(constB17,B)).
% 93.85/93.24  address(v869_range_3_to_0_address_term_bound_17).
% 93.85/93.24  v869_range_3_to_0_address_association(constB17,v869_range_3_to_0_address_term_bound_17).
% 93.85/93.24  all B (addressVal(v869_range_3_to_0_address_term_bound_16,B)<->v869(constB16,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_16).
% 93.85/93.25  v869_range_3_to_0_address_association(constB16,v869_range_3_to_0_address_term_bound_16).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_15,B)<->v869(constB15,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_15).
% 93.85/93.25  v869_range_3_to_0_address_association(constB15,v869_range_3_to_0_address_term_bound_15).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_14,B)<->v869(constB14,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_14).
% 93.85/93.25  v869_range_3_to_0_address_association(constB14,v869_range_3_to_0_address_term_bound_14).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_13,B)<->v869(constB13,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_13).
% 93.85/93.25  v869_range_3_to_0_address_association(constB13,v869_range_3_to_0_address_term_bound_13).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_12,B)<->v869(constB12,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_12).
% 93.85/93.25  v869_range_3_to_0_address_association(constB12,v869_range_3_to_0_address_term_bound_12).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_11,B)<->v869(constB11,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_11).
% 93.85/93.25  v869_range_3_to_0_address_association(constB11,v869_range_3_to_0_address_term_bound_11).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_10,B)<->v869(constB10,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_10).
% 93.85/93.25  v869_range_3_to_0_address_association(constB10,v869_range_3_to_0_address_term_bound_10).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_9,B)<->v869(constB9,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_9).
% 93.85/93.25  v869_range_3_to_0_address_association(constB9,v869_range_3_to_0_address_term_bound_9).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_8,B)<->v869(constB8,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_8).
% 93.85/93.25  v869_range_3_to_0_address_association(constB8,v869_range_3_to_0_address_term_bound_8).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_7,B)<->v869(constB7,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_7).
% 93.85/93.25  v869_range_3_to_0_address_association(constB7,v869_range_3_to_0_address_term_bound_7).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_6,B)<->v869(constB6,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_6).
% 93.85/93.25  v869_range_3_to_0_address_association(constB6,v869_range_3_to_0_address_term_bound_6).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_5,B)<->v869(constB5,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_5).
% 93.85/93.25  v869_range_3_to_0_address_association(constB5,v869_range_3_to_0_address_term_bound_5).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_4,B)<->v869(constB4,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_4).
% 93.85/93.25  v869_range_3_to_0_address_association(constB4,v869_range_3_to_0_address_term_bound_4).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_3,B)<->v869(constB3,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_3).
% 93.85/93.25  v869_range_3_to_0_address_association(constB3,v869_range_3_to_0_address_term_bound_3).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_2,B)<->v869(constB2,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_2).
% 93.85/93.25  v869_range_3_to_0_address_association(constB2,v869_range_3_to_0_address_term_bound_2).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_1,B)<->v869(constB1,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_1).
% 93.85/93.25  v869_range_3_to_0_address_association(constB1,v869_range_3_to_0_address_term_bound_1).
% 93.85/93.25  all B (addressVal(v869_range_3_to_0_address_term_bound_0,B)<->v869(constB0,B)).
% 93.85/93.25  address(v869_range_3_to_0_address_term_bound_0).
% 93.85/93.25  v869_range_3_to_0_address_association(constB0,v869_range_3_to_0_address_term_bound_0).
% 93.85/93.25  address(b1110_address_term).
% 93.85/93.25  all B (addressVal(b1110_address_term,B)<->b1110(B)).
% 93.85/93.25  address(b1101_address_term).
% 93.85/93.25  all B (addressVal(b1101_address_term,B)<->b1101(B)).
% 93.85/93.25  address(b1100_address_term).
% 93.85/93.25  all B (addressVal(b1100_address_term,B)<->b1100(B)).
% 93.85/93.25  address(b1011_address_term).
% 93.85/93.25  all B (addressVal(b1011_address_term,B)<->b1011(B)).
% 93.85/93.25  address(b1010_address_term).
% 93.85/93.25  all B (addressVal(b1010_address_term,B)<->b1010(B)).
% 93.85/93.25  address(b1001_address_term).
% 93.85/93.25  all B (addressVal(b1001_address_term,B)<->b1001(B)).
% 93.85/93.25  address(b1000_address_term).
% 93.85/93.25  all B (addressVal(b1000_address_term,B)<->b1000(B)).
% 93.85/93.25  address(b0111_address_term).
% 93.85/93.25  all B (addressVal(b0111_address_term,B)<->b0111(B)).
% 93.85/93.25  address(b0100_address_term).
% 93.85/93.25  all B (addressVal(b0100_address_term,B)<->b0100(B)).
% 93.85/93.25  address(b0011_address_term).
% 93.85/93.25  all B (addressVal(b0011_address_term,B)<->b0011(B)).
% 93.85/93.25  address(b0010_address_term).
% 93.85/93.25  all B (addressVal(b0010_address_term,B)<->b0010(B)).
% 93.85/93.25  address(b1111_address_term).
% 93.85/93.25  all B (addressVal(b1111_address_term,B)<->b1111(B)).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_20,B)<->v791(constB20,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_20).
% 93.85/93.25  v791_range_3_to_0_address_association(constB20,v791_range_3_to_0_address_term_bound_20).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_19,B)<->v791(constB19,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_19).
% 93.85/93.25  v791_range_3_to_0_address_association(constB19,v791_range_3_to_0_address_term_bound_19).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_18,B)<->v791(constB18,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_18).
% 93.85/93.25  v791_range_3_to_0_address_association(constB18,v791_range_3_to_0_address_term_bound_18).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_17,B)<->v791(constB17,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_17).
% 93.85/93.25  v791_range_3_to_0_address_association(constB17,v791_range_3_to_0_address_term_bound_17).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_16,B)<->v791(constB16,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_16).
% 93.85/93.25  v791_range_3_to_0_address_association(constB16,v791_range_3_to_0_address_term_bound_16).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_15,B)<->v791(constB15,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_15).
% 93.85/93.25  v791_range_3_to_0_address_association(constB15,v791_range_3_to_0_address_term_bound_15).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_14,B)<->v791(constB14,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_14).
% 93.85/93.25  v791_range_3_to_0_address_association(constB14,v791_range_3_to_0_address_term_bound_14).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_13,B)<->v791(constB13,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_13).
% 93.85/93.25  v791_range_3_to_0_address_association(constB13,v791_range_3_to_0_address_term_bound_13).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_12,B)<->v791(constB12,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_12).
% 93.85/93.25  v791_range_3_to_0_address_association(constB12,v791_range_3_to_0_address_term_bound_12).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_11,B)<->v791(constB11,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_11).
% 93.85/93.25  v791_range_3_to_0_address_association(constB11,v791_range_3_to_0_address_term_bound_11).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_10,B)<->v791(constB10,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_10).
% 93.85/93.25  v791_range_3_to_0_address_association(constB10,v791_range_3_to_0_address_term_bound_10).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_9,B)<->v791(constB9,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_9).
% 93.85/93.25  v791_range_3_to_0_address_association(constB9,v791_range_3_to_0_address_term_bound_9).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_8,B)<->v791(constB8,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_8).
% 93.85/93.25  v791_range_3_to_0_address_association(constB8,v791_range_3_to_0_address_term_bound_8).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_7,B)<->v791(constB7,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_7).
% 93.85/93.25  v791_range_3_to_0_address_association(constB7,v791_range_3_to_0_address_term_bound_7).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_6,B)<->v791(constB6,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_6).
% 93.85/93.25  v791_range_3_to_0_address_association(constB6,v791_range_3_to_0_address_term_bound_6).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_5,B)<->v791(constB5,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_5).
% 93.85/93.25  v791_range_3_to_0_address_association(constB5,v791_range_3_to_0_address_term_bound_5).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_4,B)<->v791(constB4,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_4).
% 93.85/93.25  v791_range_3_to_0_address_association(constB4,v791_range_3_to_0_address_term_bound_4).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_3,B)<->v791(constB3,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_3).
% 93.85/93.25  v791_range_3_to_0_address_association(constB3,v791_range_3_to_0_address_term_bound_3).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_2,B)<->v791(constB2,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_2).
% 93.85/93.25  v791_range_3_to_0_address_association(constB2,v791_range_3_to_0_address_term_bound_2).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_1,B)<->v791(constB1,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_1).
% 93.85/93.25  v791_range_3_to_0_address_association(constB1,v791_range_3_to_0_address_term_bound_1).
% 93.85/93.25  all B (addressVal(v791_range_3_to_0_address_term_bound_0,B)<->v791(constB0,B)).
% 93.85/93.25  address(v791_range_3_to_0_address_term_bound_0).
% 93.85/93.25  v791_range_3_to_0_address_association(constB0,v791_range_3_to_0_address_term_bound_0).
% 93.85/93.25  address(b0101_address_term).
% 93.85/93.25  all B (addressVal(b0101_address_term,B)<->b0101(B)).
% 93.85/93.25  address(b0001_address_term).
% 93.85/93.25  all B (addressVal(b0001_address_term,B)<->b0001(B)).
% 93.85/93.25  address(b0110_address_term).
% 93.85/93.25  all B (addressVal(b0110_address_term,B)<->b0110(B)).
% 93.85/93.25  address(b0000_address_term).
% 93.85/93.25  all B (addressVal(b0000_address_term,B)<->b0000(B)).
% 93.85/93.25  all B A2 A1 (address(A1)&address(A2)&addressDiff(A1,A2,B)->A1=A2| (addressVal(A1,B)<-> -addressVal(A2,B))).
% 93.85/93.25  all A1 A2 (addressDiff(A1,A2,bitIndex0)|addressDiff(A1,A2,bitIndex1)|addressDiff(A1,A2,bitIndex2)|addressDiff(A1,A2,bitIndex3)).
% 93.85/93.25  -(all VarCurr (reachableState(VarCurr)->v4(VarCurr))).
% 93.85/93.25  all VarCurr (-v4(VarCurr)<->v3674(VarCurr)).
% 93.85/93.25  all VarCurr (-v3674(VarCurr)<->v3675(VarCurr)).
% 93.85/93.25  all VarCurr (v3675(VarCurr)<->v3677(VarCurr)&v3693(VarCurr)).
% 93.85/93.25  all VarCurr (v3693(VarCurr)<->v3679(VarCurr,bitIndex0)|v3679(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (-v3677(VarCurr)<->v3678(VarCurr)).
% 93.85/93.25  all VarCurr (v3678(VarCurr)<->v3679(VarCurr,bitIndex0)&v3679(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (v3679(VarCurr,bitIndex0)<->v3680(VarCurr)).
% 93.85/93.25  all VarCurr (v3679(VarCurr,bitIndex1)<->$T).
% 93.85/93.25  all VarCurr (v3680(VarCurr)<->v3682(VarCurr)&v3684(VarCurr,bitIndex5)).
% 93.85/93.25  all VarCurr (v3682(VarCurr)<->v3683(VarCurr)&v3684(VarCurr,bitIndex4)).
% 93.85/93.25  all VarCurr (v3683(VarCurr)<->v3684(VarCurr,bitIndex3)|v3685(VarCurr)).
% 93.85/93.25  all VarCurr (v3685(VarCurr)<->v3686(VarCurr)&v3692(VarCurr)).
% 93.85/93.25  all VarCurr (-v3692(VarCurr)<->v3684(VarCurr,bitIndex3)).
% 93.85/93.25  all VarCurr (v3686(VarCurr)<->v3684(VarCurr,bitIndex2)|v3687(VarCurr)).
% 93.85/93.25  all VarCurr (v3687(VarCurr)<->v3688(VarCurr)&v3691(VarCurr)).
% 93.85/93.25  all VarCurr (-v3691(VarCurr)<->v3684(VarCurr,bitIndex2)).
% 93.85/93.25  all VarCurr (v3688(VarCurr)<->v3684(VarCurr,bitIndex1)|v3689(VarCurr)).
% 93.85/93.25  all VarCurr (v3689(VarCurr)<->v3684(VarCurr,bitIndex0)&v3690(VarCurr)).
% 93.85/93.25  all VarCurr (-v3690(VarCurr)<->v3684(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (-v3684(VarCurr,bitIndex3)).
% 93.85/93.25  all VarCurr (-v3684(VarCurr,bitIndex4)).
% 93.85/93.25  all VarCurr (-v3684(VarCurr,bitIndex5)).
% 93.85/93.25  all VarCurr B (range_2_0(B)-> (v3684(VarCurr,B)<->v8(VarCurr,B))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3660(VarNext)-> (all B (range_2_0(B)-> (v8(VarNext,B)<->v8(VarCurr,B)))))).
% 93.85/93.25  all VarNext (v3660(VarNext)-> (all B (range_2_0(B)-> (v8(VarNext,B)<->v3668(VarNext,B))))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v3668(VarNext,B)<->v3666(VarCurr,B))))).
% 93.85/93.25  all VarCurr (-v3669(VarCurr)-> (all B (range_2_0(B)-> (v3666(VarCurr,B)<->v21(VarCurr,B))))).
% 93.85/93.25  all VarCurr (v3669(VarCurr)-> (all B (range_2_0(B)-> (v3666(VarCurr,B)<->$F)))).
% 93.85/93.25  all VarCurr (-v3669(VarCurr)<->v10(VarCurr)).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3660(VarNext)<->v3661(VarNext))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3661(VarNext)<->v3662(VarNext)&v286(VarNext))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3662(VarNext)<->v295(VarNext))).
% 93.85/93.25  all VarCurr (-v23(VarCurr)-> (all B (range_2_0(B)-> (v21(VarCurr,B)<->v8(VarCurr,B))))).
% 93.85/93.25  all VarCurr (v23(VarCurr)-> (all B (range_2_0(B)-> (v21(VarCurr,B)<->v3643(VarCurr,B))))).
% 93.85/93.25  all VarCurr (-v3644(VarCurr)-> (all B (range_2_0(B)-> (v3643(VarCurr,B)<->v3645(VarCurr,B))))).
% 93.85/93.25  all VarCurr (v3644(VarCurr)-> (all B (range_2_0(B)-> (v3643(VarCurr,B)<->$F)))).
% 93.85/93.25  -b000(bitIndex2).
% 93.85/93.25  -b000(bitIndex1).
% 93.85/93.25  -b000(bitIndex0).
% 93.85/93.25  all VarCurr (v3645(VarCurr,bitIndex0)<->v3655(VarCurr)).
% 93.85/93.25  all VarCurr (v3645(VarCurr,bitIndex1)<->v3653(VarCurr)).
% 93.85/93.25  all VarCurr (v3645(VarCurr,bitIndex2)<->v3647(VarCurr)).
% 93.85/93.25  all VarCurr (v3653(VarCurr)<->v3654(VarCurr)&v3657(VarCurr)).
% 93.85/93.25  all VarCurr (v3657(VarCurr)<->v8(VarCurr,bitIndex0)|v8(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (v3654(VarCurr)<->v3655(VarCurr)|v3656(VarCurr)).
% 93.85/93.25  all VarCurr (-v3656(VarCurr)<->v8(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (-v3655(VarCurr)<->v8(VarCurr,bitIndex0)).
% 93.85/93.25  all VarCurr (v3647(VarCurr)<->v3648(VarCurr)&v3652(VarCurr)).
% 93.85/93.25  all VarCurr (v3652(VarCurr)<->v3650(VarCurr)|v8(VarCurr,bitIndex2)).
% 93.85/93.25  all VarCurr (v3648(VarCurr)<->v3649(VarCurr)|v3651(VarCurr)).
% 93.85/93.25  all VarCurr (-v3651(VarCurr)<->v8(VarCurr,bitIndex2)).
% 93.85/93.25  all VarCurr (-v3649(VarCurr)<->v3650(VarCurr)).
% 93.85/93.25  all VarCurr (v3650(VarCurr)<->v8(VarCurr,bitIndex0)&v8(VarCurr,bitIndex1)).
% 93.85/93.25  all VarCurr (v3644(VarCurr)<-> (v8(VarCurr,bitIndex2)<->$T)& (v8(VarCurr,bitIndex1)<->$F)& (v8(VarCurr,bitIndex0)<->$T)).
% 93.85/93.25  b101(bitIndex2).
% 93.85/93.25  -b101(bitIndex1).
% 93.85/93.25  b101(bitIndex0).
% 93.85/93.25  all VarCurr (v23(VarCurr)<->v25(VarCurr)).
% 93.85/93.25  all VarCurr (v25(VarCurr)<->v27(VarCurr)).
% 93.85/93.25  all VarCurr (v27(VarCurr)<->v29(VarCurr)).
% 93.85/93.25  all VarCurr (v29(VarCurr)<->v31(VarCurr,bitIndex7)).
% 93.85/93.25  all VarNext (v31(VarNext,bitIndex7)<->v3635(VarNext,bitIndex6)).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3636(VarNext)-> (v3635(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v3635(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v3635(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v3635(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v3635(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v3635(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v3635(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v3635(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v3635(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v3635(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v3635(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 93.85/93.25  all VarNext (v3636(VarNext)-> (all B (range_10_0(B)-> (v3635(VarNext,B)<->v1253(VarNext,B))))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3636(VarNext)<->v3637(VarNext))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3637(VarNext)<->v3639(VarNext)&v1240(VarNext))).
% 93.85/93.25  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3639(VarNext)<->v1247(VarNext))).
% 93.85/93.25  all VarCurr (-v3611(VarCurr)-> (v36(VarCurr,bitIndex7)<->$F)).
% 93.85/93.25  all VarCurr (v3611(VarCurr)-> (v36(VarCurr,bitIndex7)<->$T)).
% 93.85/93.25  all VarCurr (v3611(VarCurr)<->v3612(VarCurr)|v3632(VarCurr)).
% 93.85/93.25  all VarCurr (v3632(VarCurr)<->v3633(VarCurr)&v1323(VarCurr)).
% 93.85/93.25  all VarCurr (v3633(VarCurr)<->v3619(VarCurr)).
% 93.85/93.25  all VarCurr (v3612(VarCurr)<->v3613(VarCurr)|v3630(VarCurr)).
% 93.85/93.25  all VarCurr (v3630(VarCurr)<->v3631(VarCurr)&v1300(VarCurr)).
% 93.85/93.25  all VarCurr (v3631(VarCurr)<->v3619(VarCurr)&v1180(VarCurr)).
% 93.85/93.25  all VarCurr (v3613(VarCurr)<->v3614(VarCurr)|v3628(VarCurr)).
% 93.85/93.25  all VarCurr (v3628(VarCurr)<->v3629(VarCurr)&v1360(VarCurr)).
% 93.85/93.25  all VarCurr (v3629(VarCurr)<->v3619(VarCurr)).
% 93.85/93.25  all VarCurr (v3614(VarCurr)<->v3615(VarCurr)|v3626(VarCurr)).
% 93.85/93.25  all VarCurr (v3626(VarCurr)<->v3627(VarCurr)&v1278(VarCurr)).
% 93.85/93.25  all VarCurr (v3627(VarCurr)<->v3619(VarCurr)&v1180(VarCurr)).
% 93.85/93.25  all VarCurr (v3615(VarCurr)<->v3616(VarCurr)|v3624(VarCurr)).
% 93.85/93.25  all VarCurr (v3624(VarCurr)<->v3625(VarCurr)&v1355(VarCurr)).
% 93.85/93.25  all VarCurr (v3625(VarCurr)<->v3619(VarCurr)).
% 93.85/93.25  all VarCurr (v3616(VarCurr)<->v3617(VarCurr)|v3621(VarCurr)).
% 93.85/93.25  all VarCurr (v3621(VarCurr)<->v3622(VarCurr)&v1238(VarCurr)).
% 93.85/93.25  all VarCurr (v3622(VarCurr)<->v3619(VarCurr)&v1180(VarCurr)).
% 93.85/93.25  all VarCurr (v3619(VarCurr)<->v3620(VarCurr)&v1347(VarCurr)).
% 93.85/93.25  all VarCurr (v3617(VarCurr)<->v3618(VarCurr)&v1348(VarCurr)).
% 93.85/93.25  all VarCurr (v3618(VarCurr)<->v3620(VarCurr)&v1347(VarCurr)).
% 93.85/93.25  all VarCurr (v3620(VarCurr)<->v1673(VarCurr)&v903(VarCurr)).
% 93.85/93.25  all VarCurr (v38(VarCurr)<->v40(VarCurr)).
% 93.85/93.25  all VarCurr (v40(VarCurr)<->v42(VarCurr)).
% 93.85/93.25  all VarCurr (v42(VarCurr)<->v44(VarCurr)).
% 93.85/93.25  all VarCurr (v44(VarCurr)<->v46(VarCurr)).
% 93.85/93.26  all VarCurr (v46(VarCurr)<->v48(VarCurr)).
% 93.85/93.26  all VarCurr (v48(VarCurr)<->v50(VarCurr)).
% 93.85/93.26  all VarCurr (v50(VarCurr)<->v52(VarCurr)).
% 93.85/93.26  all VarCurr (v52(VarCurr)<->v54(VarCurr)).
% 93.85/93.26  all VarCurr (v54(VarCurr)<->v56(VarCurr,bitIndex2)).
% 93.85/93.26  all VarNext (v56(VarNext,bitIndex2)<->v3601(VarNext,bitIndex2)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3602(VarNext)-> (all B (range_3_0(B)-> (v3601(VarNext,B)<->v56(VarCurr,B)))))).
% 93.85/93.26  all VarNext (v3602(VarNext)-> (all B (range_3_0(B)-> (v3601(VarNext,B)<->v3588(VarNext,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3602(VarNext)<->v3603(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3603(VarNext)<->v3605(VarNext)&v3573(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3605(VarNext)<->v3582(VarNext))).
% 93.85/93.26  all VarCurr (v67(VarCurr,bitIndex2)<->v3558(VarCurr,bitIndex2)).
% 93.85/93.26  all VarCurr (v3555(VarCurr,bitIndex2)<->v3556(VarCurr,bitIndex2)).
% 93.85/93.26  all VarNext (v56(VarNext,bitIndex1)<->v3593(VarNext,bitIndex1)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3594(VarNext)-> (all B (range_3_0(B)-> (v3593(VarNext,B)<->v56(VarCurr,B)))))).
% 93.85/93.26  all VarNext (v3594(VarNext)-> (all B (range_3_0(B)-> (v3593(VarNext,B)<->v3588(VarNext,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3594(VarNext)<->v3595(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3595(VarNext)<->v3597(VarNext)&v3573(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3597(VarNext)<->v3582(VarNext))).
% 93.85/93.26  all VarCurr (v67(VarCurr,bitIndex1)<->v3558(VarCurr,bitIndex1)).
% 93.85/93.26  all VarCurr (v3555(VarCurr,bitIndex1)<->v3556(VarCurr,bitIndex1)).
% 93.85/93.26  all VarNext (v56(VarNext,bitIndex3)<->v3577(VarNext,bitIndex3)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3578(VarNext)-> (all B (range_3_0(B)-> (v3577(VarNext,B)<->v56(VarCurr,B)))))).
% 93.85/93.26  all VarNext (v3578(VarNext)-> (all B (range_3_0(B)-> (v3577(VarNext,B)<->v3588(VarNext,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v3588(VarNext,B)<->v3586(VarCurr,B))))).
% 93.85/93.26  all VarCurr (-v3589(VarCurr)-> (all B (range_3_0(B)-> (v3586(VarCurr,B)<->v67(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3589(VarCurr)-> (all B (range_3_0(B)-> (v3586(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (-v3589(VarCurr)<->v58(VarCurr)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3578(VarNext)<->v3579(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3579(VarNext)<->v3580(VarNext)&v3573(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3580(VarNext)<->v3582(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3582(VarNext)<->v3573(VarCurr))).
% 93.85/93.26  all VarCurr (v3573(VarCurr)<->v3575(VarCurr)).
% 93.85/93.26  all VarCurr (v3575(VarCurr)<->v3531(VarCurr)).
% 93.85/93.26  all VarCurr (v67(VarCurr,bitIndex3)<->v3558(VarCurr,bitIndex3)).
% 93.85/93.26  all VarCurr (-v3559(VarCurr)-> (all B (range_3_0(B)-> (v3558(VarCurr,B)<->v3560(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3559(VarCurr)-> (all B (range_3_0(B)-> (v3558(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (-v3561(VarCurr)& -v3563(VarCurr)& -v3567(VarCurr)-> (all B (range_3_0(B)-> (v3560(VarCurr,B)<->v56(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3567(VarCurr)-> (all B (range_3_0(B)-> (v3560(VarCurr,B)<->v3569(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3563(VarCurr)-> (all B (range_3_0(B)-> (v3560(VarCurr,B)<->v3565(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3561(VarCurr)-> (all B (range_3_0(B)-> (v3560(VarCurr,B)<->v56(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3570(VarCurr)<-> (v3571(VarCurr,bitIndex1)<->$T)& (v3571(VarCurr,bitIndex0)<->$T)).
% 93.85/93.26  all VarCurr (v3571(VarCurr,bitIndex0)<->v3447(VarCurr)).
% 93.85/93.26  all VarCurr (v3571(VarCurr,bitIndex1)<->v69(VarCurr)).
% 93.85/93.26  all VarCurr (v3569(VarCurr,bitIndex0)<->$T).
% 93.85/93.26  all VarCurr B (range_3_1(B)-> (v3569(VarCurr,B)<->v3555(VarCurr,B))).
% 93.85/93.26  all B (range_3_1(B)<->bitIndex1=B|bitIndex2=B|bitIndex3=B).
% 93.85/93.26  all VarCurr (v3567(VarCurr)<-> (v3568(VarCurr,bitIndex1)<->$T)& (v3568(VarCurr,bitIndex0)<->$F)).
% 93.85/93.26  all VarCurr (v3568(VarCurr,bitIndex0)<->v3447(VarCurr)).
% 93.85/93.26  all VarCurr (v3568(VarCurr,bitIndex1)<->v69(VarCurr)).
% 93.85/93.26  all VarCurr ((v3565(VarCurr,bitIndex2)<->v56(VarCurr,bitIndex3))& (v3565(VarCurr,bitIndex1)<->v56(VarCurr,bitIndex2))& (v3565(VarCurr,bitIndex0)<->v56(VarCurr,bitIndex1))).
% 93.85/93.26  all VarCurr (v3565(VarCurr,bitIndex3)<->$F).
% 93.85/93.26  all VarCurr (v3563(VarCurr)<-> (v3564(VarCurr,bitIndex1)<->$F)& (v3564(VarCurr,bitIndex0)<->$T)).
% 93.85/93.26  all VarCurr (v3564(VarCurr,bitIndex0)<->v3447(VarCurr)).
% 93.85/93.26  all VarCurr (v3564(VarCurr,bitIndex1)<->v69(VarCurr)).
% 93.85/93.26  all VarCurr (v3561(VarCurr)<-> (v3562(VarCurr,bitIndex1)<->$F)& (v3562(VarCurr,bitIndex0)<->$F)).
% 93.85/93.26  all VarCurr (v3562(VarCurr,bitIndex0)<->v3447(VarCurr)).
% 93.85/93.26  all VarCurr (v3562(VarCurr,bitIndex1)<->v69(VarCurr)).
% 93.85/93.26  all VarCurr (-v3559(VarCurr)<->v58(VarCurr)).
% 93.85/93.26  all VarCurr (v3555(VarCurr,bitIndex3)<->v3556(VarCurr,bitIndex3)).
% 93.85/93.26  all VarCurr (v3556(VarCurr,bitIndex0)<->$F).
% 93.85/93.26  all VarCurr ((v3556(VarCurr,bitIndex3)<->v56(VarCurr,bitIndex2))& (v3556(VarCurr,bitIndex2)<->v56(VarCurr,bitIndex1))& (v3556(VarCurr,bitIndex1)<->v56(VarCurr,bitIndex0))).
% 93.85/93.26  all B (range_3_0(B)-> (v56(constB0,B)<->$F)).
% 93.85/93.26  all VarCurr (v3447(VarCurr)<->v3449(VarCurr)).
% 93.85/93.26  all VarCurr (v3449(VarCurr)<->v3451(VarCurr)).
% 93.85/93.26  all VarCurr (-v3551(VarCurr)& -v3552(VarCurr)-> (v3451(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (v3552(VarCurr)-> (v3451(VarCurr)<->$T)).
% 93.85/93.26  all VarCurr (v3551(VarCurr)-> (v3451(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (v3552(VarCurr)<-> (v3453(VarCurr,bitIndex1)<->$F)& (v3453(VarCurr,bitIndex0)<->$T)).
% 93.85/93.26  all VarCurr (v3551(VarCurr)<-> (v3453(VarCurr,bitIndex1)<->$F)& (v3453(VarCurr,bitIndex0)<->$F)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3536(VarNext)-> (all B (range_1_0(B)-> (v3453(VarNext,B)<->v3453(VarCurr,B)))))).
% 93.85/93.26  all VarNext (v3536(VarNext)-> (all B (range_1_0(B)-> (v3453(VarNext,B)<->v3546(VarNext,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_1_0(B)-> (v3546(VarNext,B)<->v3544(VarCurr,B))))).
% 93.85/93.26  all VarCurr (-v3547(VarCurr)-> (all B (range_1_0(B)-> (v3544(VarCurr,B)<->v3455(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3547(VarCurr)-> (all B (range_1_0(B)-> (v3544(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (v3547(VarCurr)<-> (v62(VarCurr)<->$F)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3536(VarNext)<->v3537(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3537(VarNext)<->v3538(VarNext)&v3531(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3538(VarNext)<->v3540(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3540(VarNext)<->v3531(VarCurr))).
% 93.85/93.26  all VarCurr (v3531(VarCurr)<->v3533(VarCurr)).
% 93.85/93.26  all VarCurr (v3533(VarCurr)<->v1(VarCurr)).
% 93.85/93.26  all VarCurr (-v3520(VarCurr)& -v3529(VarCurr)-> (all B (range_1_0(B)-> (v3455(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (v3529(VarCurr)-> (all B (range_1_0(B)-> (v3455(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (v3520(VarCurr)-> (all B (range_1_0(B)-> (v3455(VarCurr,B)<->v3521(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3529(VarCurr)<-> (v3453(VarCurr,bitIndex1)<->$F)& (v3453(VarCurr,bitIndex0)<->$T)).
% 93.85/93.26  all VarCurr (-v3522(VarCurr)-> (all B (range_1_0(B)-> (v3521(VarCurr,B)<->v3524(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3522(VarCurr)-> (all B (range_1_0(B)-> (v3521(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (-v3525(VarCurr)-> (all B (range_1_0(B)-> (v3524(VarCurr,B)<->b01(B))))).
% 93.85/93.26  all VarCurr (v3525(VarCurr)-> (all B (range_1_0(B)-> (v3524(VarCurr,B)<->$F)))).
% 93.85/93.26  all VarCurr (v3527(VarCurr)<-> (v3528(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (v3528(VarCurr)<->v3500(VarCurr)|v3502(VarCurr)).
% 93.85/93.26  all VarCurr (v3525(VarCurr)<-> (v3526(VarCurr)<->$T)).
% 93.85/93.26  all VarCurr (v3526(VarCurr)<->v3500(VarCurr)|v3502(VarCurr)).
% 93.85/93.26  v3500(constB0)<->$F.
% 93.85/93.26  all VarCurr (v3523(VarCurr)<-> (v3457(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (v3522(VarCurr)<-> (v3457(VarCurr)<->$T)).
% 93.85/93.26  all VarCurr (v3520(VarCurr)<-> (v3453(VarCurr,bitIndex1)<->$F)& (v3453(VarCurr,bitIndex0)<->$F)).
% 93.85/93.26  all B (range_1_0(B)-> (v3453(constB0,B)<->$F)).
% 93.85/93.26  all VarCurr (v3502(VarCurr)<->v3504(VarCurr)).
% 93.85/93.26  all VarCurr (v3504(VarCurr)<->v3506(VarCurr)).
% 93.85/93.26  all VarCurr (v3506(VarCurr)<->v3508(VarCurr)).
% 93.85/93.26  all VarCurr (v3508(VarCurr)<->v3510(VarCurr)).
% 93.85/93.26  all VarCurr (v3510(VarCurr)<->v3512(VarCurr)).
% 93.85/93.26  all VarCurr (v3512(VarCurr)<->v3514(VarCurr)).
% 93.85/93.26  all VarCurr (v3514(VarCurr)<->v3516(VarCurr,bitIndex6)).
% 93.85/93.26  -v3516(constB0,bitIndex6).
% 93.85/93.26  -bx0xxxxxx(bitIndex6).
% 93.85/93.26  all VarCurr (v3457(VarCurr)<->v3459(VarCurr)).
% 93.85/93.26  all VarCurr (v3459(VarCurr)<->v3493(VarCurr)&v3489(VarCurr)).
% 93.85/93.26  all VarCurr (v3493(VarCurr)<->v3494(VarCurr)&v3485(VarCurr)).
% 93.85/93.26  all VarCurr (v3494(VarCurr)<->v3495(VarCurr)&v3481(VarCurr)).
% 93.85/93.26  all VarCurr (v3495(VarCurr)<->v3496(VarCurr)&v3477(VarCurr)).
% 93.85/93.26  all VarCurr (v3496(VarCurr)<->v3497(VarCurr)&v3473(VarCurr)).
% 93.85/93.26  all VarCurr (v3497(VarCurr)<->v3498(VarCurr)&v3469(VarCurr)).
% 93.85/93.26  all VarCurr (v3498(VarCurr)<->v3461(VarCurr)&v3465(VarCurr)).
% 93.85/93.26  all VarCurr (v3489(VarCurr)<->v3491(VarCurr)).
% 93.85/93.26  v3491(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3485(VarCurr)<->v3487(VarCurr)).
% 93.85/93.26  v3487(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3481(VarCurr)<->v3483(VarCurr)).
% 93.85/93.26  v3483(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3477(VarCurr)<->v3479(VarCurr)).
% 93.85/93.26  v3479(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3473(VarCurr)<->v3475(VarCurr)).
% 93.85/93.26  v3475(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3469(VarCurr)<->v3471(VarCurr)).
% 93.85/93.26  v3471(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3465(VarCurr)<->v3467(VarCurr)).
% 93.85/93.26  v3467(constB0)<->$T.
% 93.85/93.26  all VarCurr (v3461(VarCurr)<->v3463(VarCurr)).
% 93.85/93.26  v3463(constB0)<->$T.
% 93.85/93.26  all VarCurr (v69(VarCurr)<->v71(VarCurr)).
% 93.85/93.26  all VarCurr (v71(VarCurr)<->v73(VarCurr)).
% 93.85/93.26  all VarCurr (v73(VarCurr)<->v75(VarCurr)).
% 93.85/93.26  all VarCurr (v75(VarCurr)<->v77(VarCurr)).
% 93.85/93.26  all VarCurr (v77(VarCurr)<->v79(VarCurr)).
% 93.85/93.26  all VarCurr (v79(VarCurr)<->v81(VarCurr)).
% 93.85/93.26  all VarCurr (v81(VarCurr)<->v83(VarCurr)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3426(VarNext)-> (v83(VarNext)<->v83(VarCurr)))).
% 93.85/93.26  all VarNext (v3426(VarNext)-> (v83(VarNext)<->v3434(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3434(VarNext)<->v3432(VarCurr))).
% 93.85/93.26  all VarCurr (-v3435(VarCurr)-> (v3432(VarCurr)<->v3436(VarCurr))).
% 93.85/93.26  all VarCurr (v3435(VarCurr)-> (v3432(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (-v3437(VarCurr)-> (v3436(VarCurr)<->$F)).
% 93.85/93.26  all VarCurr (v3437(VarCurr)-> (v3436(VarCurr)<->$T)).
% 93.85/93.26  all VarCurr (v3437(VarCurr)<->v3438(VarCurr)|v3442(VarCurr)).
% 93.85/93.26  all VarCurr (v3442(VarCurr)<->v31(VarCurr,bitIndex9)&v3443(VarCurr)).
% 93.85/93.26  all VarCurr (-v3443(VarCurr)<->v36(VarCurr,bitIndex9)).
% 93.85/93.26  all VarCurr (v3438(VarCurr)<->v3439(VarCurr)|v3420(VarCurr)).
% 93.85/93.26  all VarCurr (v3439(VarCurr)<->v3440(VarCurr)|v3415(VarCurr)).
% 93.85/93.26  all VarCurr (v3440(VarCurr)<->v3441(VarCurr)|v879(VarCurr)).
% 93.85/93.26  all VarCurr (v3441(VarCurr)<->v85(VarCurr)|v3410(VarCurr)).
% 93.85/93.26  all VarCurr (-v3435(VarCurr)<->v33(VarCurr)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3426(VarNext)<->v3427(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3427(VarNext)<->v3428(VarNext)&v1240(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3428(VarNext)<->v1247(VarNext))).
% 93.85/93.26  all VarCurr (v3420(VarCurr)<->v31(VarCurr,bitIndex8)&v3422(VarCurr)).
% 93.85/93.26  all VarCurr (-v3422(VarCurr)<->v3423(VarCurr)).
% 93.85/93.26  all VarCurr (v3423(VarCurr)<->v36(VarCurr,bitIndex8)|v36(VarCurr,bitIndex9)).
% 93.85/93.26  all VarCurr (v3415(VarCurr)<->v31(VarCurr,bitIndex5)&v3417(VarCurr)).
% 93.85/93.26  all VarCurr (-v3417(VarCurr)<->v3418(VarCurr)).
% 93.85/93.26  all VarCurr (v3418(VarCurr)<->v36(VarCurr,bitIndex5)|v36(VarCurr,bitIndex9)).
% 93.85/93.26  all VarCurr (v3410(VarCurr)<->v31(VarCurr,bitIndex2)&v3412(VarCurr)).
% 93.85/93.26  all VarCurr (-v3412(VarCurr)<->v3413(VarCurr)).
% 93.85/93.26  all VarCurr (v3413(VarCurr)<->v36(VarCurr,bitIndex2)|v36(VarCurr,bitIndex9)).
% 93.85/93.26  all VarCurr (v85(VarCurr)<->v36(VarCurr,bitIndex3)).
% 93.85/93.26  all VarCurr (-v3398(VarCurr)-> (v36(VarCurr,bitIndex3)<->$F)).
% 93.85/93.26  all VarCurr (v3398(VarCurr)-> (v36(VarCurr,bitIndex3)<->$T)).
% 93.85/93.26  all VarCurr (v3398(VarCurr)<->v3399(VarCurr)|v3407(VarCurr)).
% 93.85/93.26  all VarCurr (v3407(VarCurr)<->v3408(VarCurr)&v3348(VarCurr)).
% 93.85/93.26  all VarCurr (-v3408(VarCurr)<->v38(VarCurr)).
% 93.85/93.26  all VarCurr (v3399(VarCurr)<->v3400(VarCurr)|v3405(VarCurr)).
% 93.85/93.26  all VarCurr (v3405(VarCurr)<->v3406(VarCurr)&v1360(VarCurr)).
% 93.85/93.26  all VarCurr (v3406(VarCurr)<->v3346(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3400(VarCurr)<->v3401(VarCurr)|v3403(VarCurr)).
% 93.85/93.26  all VarCurr (v3403(VarCurr)<->v3404(VarCurr)&v1355(VarCurr)).
% 93.85/93.26  all VarCurr (v3404(VarCurr)<->v3346(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3401(VarCurr)<->v3402(VarCurr)&v1348(VarCurr)).
% 93.85/93.26  all VarCurr (v3402(VarCurr)<->v3346(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v87(VarCurr)<->v89(VarCurr)).
% 93.85/93.26  all VarCurr (v89(VarCurr)<->v91(VarCurr,bitIndex0)).
% 93.85/93.26  all VarCurr (v91(VarCurr,bitIndex0)<->v898(VarCurr,bitIndex0)).
% 93.85/93.26  all VarCurr (v892(VarCurr,bitIndex0)<->v896(VarCurr,bitIndex0)).
% 93.85/93.26  all VarCurr (v885(VarCurr,bitIndex0)<->v889(VarCurr,bitIndex0)).
% 93.85/93.26  all VarCurr (-v93(VarCurr)<->v3396(VarCurr)).
% 93.85/93.26  all VarCurr (v3396(VarCurr)<->v3358(VarCurr)|v95(VarCurr,bitIndex2)).
% 93.85/93.26  all VarCurr B (range_2_0(B)-> (v95(VarCurr,B)<->v97(VarCurr,B)&v3309(VarCurr,B))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3371(VarNext)-> (all B (range_2_0(B)-> (v3309(VarNext,B)<->v3309(VarCurr,B)))))).
% 93.85/93.26  all VarNext (v3371(VarNext)-> (all B (range_2_0(B)-> (v3309(VarNext,B)<->v3390(VarNext,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_2_0(B)-> (v3390(VarNext,B)<->v3388(VarCurr,B))))).
% 93.85/93.26  all VarCurr (-v3382(VarCurr)-> (all B (range_2_0(B)-> (v3388(VarCurr,B)<->v3391(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3382(VarCurr)-> (all B (range_2_0(B)-> (v3388(VarCurr,B)<->$T)))).
% 93.85/93.26  all VarCurr (-v3313(VarCurr)-> (all B (range_2_0(B)-> (v3391(VarCurr,B)<->v887(VarCurr,B))))).
% 93.85/93.26  all VarCurr (v3313(VarCurr)-> (all B (range_2_0(B)-> (v3391(VarCurr,B)<->v894(VarCurr,B))))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3371(VarNext)<->v3372(VarNext)&v3381(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3381(VarNext)<->v3379(VarCurr))).
% 93.85/93.26  all VarCurr (v3379(VarCurr)<->v3382(VarCurr)|v3383(VarCurr)).
% 93.85/93.26  all VarCurr (v3383(VarCurr)<->v3384(VarCurr)&v3387(VarCurr)).
% 93.85/93.26  all VarCurr (-v3387(VarCurr)<->v3382(VarCurr)).
% 93.85/93.26  all VarCurr (v3384(VarCurr)<->v3313(VarCurr)|v3385(VarCurr)).
% 93.85/93.26  all VarCurr (v3385(VarCurr)<->v3361(VarCurr)&v3386(VarCurr)).
% 93.85/93.26  all VarCurr (-v3386(VarCurr)<->v3313(VarCurr)).
% 93.85/93.26  all VarCurr (-v3382(VarCurr)<->v3311(VarCurr)).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3372(VarNext)<->v3373(VarNext)&v3368(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3373(VarNext)<->v3375(VarNext))).
% 93.85/93.26  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3375(VarNext)<->v3368(VarCurr))).
% 93.85/93.26  all B (range_2_0(B)-> (v3309(constB0,B)<->$T)).
% 93.85/93.26  all VarCurr (v3368(VarCurr)<->v288(VarCurr)).
% 93.85/93.26  all VarCurr (v3361(VarCurr)<->v3363(VarCurr)&v3366(VarCurr)).
% 93.85/93.26  all VarCurr (-v3366(VarCurr)<->v3315(VarCurr)).
% 93.85/93.26  all VarCurr (v3363(VarCurr)<->v3365(VarCurr)|v97(VarCurr,bitIndex2)).
% 93.85/93.26  all VarCurr (v3365(VarCurr)<->v97(VarCurr,bitIndex0)|v97(VarCurr,bitIndex1)).
% 93.85/93.26  all VarCurr (v3313(VarCurr)<->v3356(VarCurr)&v3359(VarCurr)).
% 93.85/93.26  all VarCurr (-v3359(VarCurr)<->v3315(VarCurr)).
% 93.85/93.26  all VarCurr (v3356(VarCurr)<->v3358(VarCurr)|v95(VarCurr,bitIndex2)).
% 93.85/93.26  all VarCurr (v3358(VarCurr)<->v95(VarCurr,bitIndex0)|v95(VarCurr,bitIndex1)).
% 93.85/93.26  all VarCurr (v3315(VarCurr)<->v3317(VarCurr)).
% 93.85/93.26  all VarCurr (v3317(VarCurr)<->v3319(VarCurr)).
% 93.85/93.26  all VarCurr (v3319(VarCurr)<->v3350(VarCurr)|v38(VarCurr)).
% 93.85/93.26  all VarCurr (v3350(VarCurr)<->v3351(VarCurr)|v36(VarCurr,bitIndex11)).
% 93.85/93.26  all VarCurr (v3351(VarCurr)<->v3352(VarCurr)|v36(VarCurr,bitIndex10)).
% 93.85/93.26  all VarCurr (v3352(VarCurr)<->v3353(VarCurr)|v36(VarCurr,bitIndex9)).
% 93.85/93.26  all VarCurr (v3353(VarCurr)<->v3354(VarCurr)|v36(VarCurr,bitIndex8)).
% 93.85/93.26  all VarCurr (v3354(VarCurr)<->v36(VarCurr,bitIndex2)|v36(VarCurr,bitIndex5)).
% 93.85/93.26  all VarCurr (-v3331(VarCurr)-> (v36(VarCurr,bitIndex10)<->$F)).
% 93.85/93.26  all VarCurr (v3331(VarCurr)-> (v36(VarCurr,bitIndex10)<->$T)).
% 93.85/93.26  all VarCurr (v3331(VarCurr)<->v3332(VarCurr)|v3347(VarCurr)).
% 93.85/93.26  all VarCurr (v3347(VarCurr)<->v38(VarCurr)&v3348(VarCurr)).
% 93.85/93.26  all VarCurr (v3348(VarCurr)<-> ($T<->v31(VarCurr,bitIndex10))).
% 93.85/93.26  all VarCurr (v3332(VarCurr)<->v3333(VarCurr)|v3343(VarCurr)).
% 93.85/93.26  all VarCurr (v3343(VarCurr)<->v3344(VarCurr)&v1323(VarCurr)).
% 93.85/93.26  all VarCurr (v3344(VarCurr)<->v3346(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3346(VarCurr)<->v1678(VarCurr)&v1162(VarCurr)).
% 93.85/93.26  all VarCurr (v3333(VarCurr)<->v3334(VarCurr)|v3341(VarCurr)).
% 93.85/93.26  all VarCurr (v3341(VarCurr)<->v3342(VarCurr)&v1300(VarCurr)).
% 93.85/93.26  all VarCurr (v3342(VarCurr)<->v3338(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3334(VarCurr)<->v3335(VarCurr)|v3339(VarCurr)).
% 93.85/93.26  all VarCurr (v3339(VarCurr)<->v3340(VarCurr)&v1278(VarCurr)).
% 93.85/93.26  all VarCurr (v3340(VarCurr)<->v3338(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3335(VarCurr)<->v3336(VarCurr)&v1238(VarCurr)).
% 93.85/93.26  all VarCurr (v3336(VarCurr)<->v3338(VarCurr)&v1682(VarCurr)).
% 93.85/93.26  all VarCurr (v3338(VarCurr)<->v1690(VarCurr)&v1162(VarCurr)).
% 93.85/93.26  all VarNext (v31(VarNext,bitIndex10)<->v3323(VarNext,bitIndex9)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3324(VarNext)-> (v3323(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v3323(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v3323(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v3323(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v3323(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v3323(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v3323(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v3323(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v3323(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v3323(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v3323(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 93.85/93.27  all VarNext (v3324(VarNext)-> (all B (range_10_0(B)-> (v3323(VarNext,B)<->v1253(VarNext,B))))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3324(VarNext)<->v3325(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3325(VarNext)<->v3327(VarNext)&v1240(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3327(VarNext)<->v1247(VarNext))).
% 93.85/93.27  all VarCurr (v3311(VarCurr)<->v12(VarCurr)).
% 93.85/93.27  all VarCurr (v97(VarCurr,bitIndex0)<->v3301(VarCurr)).
% 93.85/93.27  all VarCurr (v97(VarCurr,bitIndex1)<->v308(VarCurr)).
% 93.85/93.27  all VarCurr (v97(VarCurr,bitIndex2)<->v99(VarCurr)).
% 93.85/93.27  all VarCurr (v3301(VarCurr)<->v3303(VarCurr)).
% 93.85/93.27  all VarCurr (v3303(VarCurr)<->v3305(VarCurr)&v3306(VarCurr)).
% 93.85/93.27  all VarCurr (v3306(VarCurr)<->v1162(VarCurr)|v907(VarCurr)).
% 93.85/93.27  all VarCurr (-v3305(VarCurr)<->v1031(VarCurr)).
% 93.85/93.27  all VarCurr (v308(VarCurr)<->v310(VarCurr)).
% 93.85/93.27  all VarCurr (-v310(VarCurr)<->v312(VarCurr)).
% 93.85/93.27  all VarCurr (v312(VarCurr)<->v314(VarCurr)).
% 93.85/93.27  all VarCurr (v314(VarCurr)<->v316(VarCurr)|v3201(VarCurr)).
% 93.85/93.27  all VarCurr (v3201(VarCurr)<->v3203(VarCurr)).
% 93.85/93.27  all VarCurr (v3203(VarCurr)<-> (v3205(VarCurr,bitIndex4)<->$F)& (v3205(VarCurr,bitIndex3)<->$F)& (v3205(VarCurr,bitIndex2)<->$F)& (v3205(VarCurr,bitIndex1)<->$F)& (v3205(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3285(VarNext)-> (all B (range_4_0(B)-> (v3205(VarNext,B)<->v3205(VarCurr,B)))))).
% 93.85/93.27  all VarNext (v3285(VarNext)-> (all B (range_4_0(B)-> (v3205(VarNext,B)<->v3293(VarNext,B))))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v3293(VarNext,B)<->v3291(VarCurr,B))))).
% 93.85/93.27  all VarCurr (-v3294(VarCurr)-> (all B (range_4_0(B)-> (v3291(VarCurr,B)<->v3207(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3294(VarCurr)-> (all B (range_4_0(B)-> (v3291(VarCurr,B)<->$F)))).
% 93.85/93.27  all VarCurr (-v3294(VarCurr)<->v754(VarCurr)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3285(VarNext)<->v3286(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3286(VarNext)<->v3287(VarNext)&v751(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3287(VarNext)<->v823(VarNext))).
% 93.85/93.27  all VarCurr (-v3209(VarCurr)& -v3211(VarCurr)& -v3252(VarCurr)-> (all B (range_4_0(B)-> (v3207(VarCurr,B)<->v3205(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3252(VarCurr)-> (all B (range_4_0(B)-> (v3207(VarCurr,B)<->v3254(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3211(VarCurr)-> (all B (range_4_0(B)-> (v3207(VarCurr,B)<->v3213(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3209(VarCurr)-> (all B (range_4_0(B)-> (v3207(VarCurr,B)<->v3205(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3281(VarCurr)<-> (v3282(VarCurr,bitIndex1)<->$T)& (v3282(VarCurr,bitIndex0)<->$T)).
% 93.85/93.27  all VarCurr (v3282(VarCurr,bitIndex0)<->v873(VarCurr)).
% 93.85/93.27  all VarCurr (v3282(VarCurr,bitIndex1)<->v783(VarCurr)).
% 93.85/93.27  all VarCurr (-v3255(VarCurr)-> (all B (range_4_0(B)-> (v3254(VarCurr,B)<->v3256(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3255(VarCurr)-> (all B (range_4_0(B)-> (v3254(VarCurr,B)<->b10000(B))))).
% 93.85/93.27  all VarCurr (v3256(VarCurr,bitIndex0)<->v3278(VarCurr)).
% 93.85/93.27  all VarCurr (v3256(VarCurr,bitIndex1)<->v3276(VarCurr)).
% 93.85/93.27  all VarCurr (v3256(VarCurr,bitIndex2)<->v3271(VarCurr)).
% 93.85/93.27  all VarCurr (v3256(VarCurr,bitIndex3)<->v3266(VarCurr)).
% 93.85/93.27  all VarCurr (v3256(VarCurr,bitIndex4)<->v3258(VarCurr)).
% 93.85/93.27  all VarCurr (v3276(VarCurr)<->v3277(VarCurr)&v3280(VarCurr)).
% 93.85/93.27  all VarCurr (v3280(VarCurr)<->v3205(VarCurr,bitIndex0)|v3205(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v3277(VarCurr)<->v3278(VarCurr)|v3279(VarCurr)).
% 93.85/93.27  all VarCurr (-v3279(VarCurr)<->v3205(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (-v3278(VarCurr)<->v3205(VarCurr,bitIndex0)).
% 93.85/93.27  all VarCurr (v3271(VarCurr)<->v3272(VarCurr)&v3275(VarCurr)).
% 93.85/93.27  all VarCurr (v3275(VarCurr)<->v3263(VarCurr)|v3205(VarCurr,bitIndex2)).
% 93.85/93.27  all VarCurr (v3272(VarCurr)<->v3273(VarCurr)|v3274(VarCurr)).
% 93.85/93.27  all VarCurr (-v3274(VarCurr)<->v3205(VarCurr,bitIndex2)).
% 93.85/93.27  all VarCurr (-v3273(VarCurr)<->v3263(VarCurr)).
% 93.85/93.27  all VarCurr (v3266(VarCurr)<->v3267(VarCurr)&v3270(VarCurr)).
% 93.85/93.27  all VarCurr (v3270(VarCurr)<->v3262(VarCurr)|v3205(VarCurr,bitIndex3)).
% 93.85/93.27  all VarCurr (v3267(VarCurr)<->v3268(VarCurr)|v3269(VarCurr)).
% 93.85/93.27  all VarCurr (-v3269(VarCurr)<->v3205(VarCurr,bitIndex3)).
% 93.85/93.27  all VarCurr (-v3268(VarCurr)<->v3262(VarCurr)).
% 93.85/93.27  all VarCurr (v3258(VarCurr)<->v3259(VarCurr)&v3265(VarCurr)).
% 93.85/93.27  all VarCurr (v3265(VarCurr)<->v3261(VarCurr)|v3205(VarCurr,bitIndex4)).
% 93.85/93.27  all VarCurr (v3259(VarCurr)<->v3260(VarCurr)|v3264(VarCurr)).
% 93.85/93.27  all VarCurr (-v3264(VarCurr)<->v3205(VarCurr,bitIndex4)).
% 93.85/93.27  all VarCurr (-v3260(VarCurr)<->v3261(VarCurr)).
% 93.85/93.27  all VarCurr (v3261(VarCurr)<->v3262(VarCurr)&v3205(VarCurr,bitIndex3)).
% 93.85/93.27  all VarCurr (v3262(VarCurr)<->v3263(VarCurr)&v3205(VarCurr,bitIndex2)).
% 93.85/93.27  all VarCurr (v3263(VarCurr)<->v3205(VarCurr,bitIndex0)&v3205(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v3255(VarCurr)<-> (v3205(VarCurr,bitIndex4)<->$T)& (v3205(VarCurr,bitIndex3)<->$F)& (v3205(VarCurr,bitIndex2)<->$F)& (v3205(VarCurr,bitIndex1)<->$F)& (v3205(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3252(VarCurr)<-> (v3253(VarCurr,bitIndex1)<->$T)& (v3253(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3253(VarCurr,bitIndex0)<->v873(VarCurr)).
% 93.85/93.27  all VarCurr (v3253(VarCurr,bitIndex1)<->v783(VarCurr)).
% 93.85/93.27  all VarCurr (-v3214(VarCurr)-> (all B (range_31_0(B)-> (v3213(VarCurr,B)<->v3215(VarCurr,B))))).
% 93.85/93.27  all VarCurr (v3214(VarCurr)-> (all B (range_31_0(B)-> (v3213(VarCurr,B)<->$F)))).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex6)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex7)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex8)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex9)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex10)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex11)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex12)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex13)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex14)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex15)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex16)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex17)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex18)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex19)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex20)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex21)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex22)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex23)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex24)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex25)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex26)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex27)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex28)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex29)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex30)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3215(VarCurr,bitIndex31)<->v3216(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr B (range_5_0(B)-> (v3215(VarCurr,B)<->v3216(VarCurr,B))).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex0)<->v3250(VarCurr)).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex1)<->v3248(VarCurr)).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex2)<->v3244(VarCurr)).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex3)<->v3240(VarCurr)).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex4)<->v3236(VarCurr)).
% 93.85/93.27  all VarCurr (v3216(VarCurr,bitIndex5)<->v3218(VarCurr)).
% 93.85/93.27  all VarCurr (v3248(VarCurr)<->v3249(VarCurr)&v3251(VarCurr)).
% 93.85/93.27  all VarCurr (v3251(VarCurr)<->v3222(VarCurr,bitIndex0)|v3230(VarCurr)).
% 93.85/93.27  all VarCurr (v3249(VarCurr)<->v3250(VarCurr)|v3222(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (-v3250(VarCurr)<->v3222(VarCurr,bitIndex0)).
% 93.85/93.27  all VarCurr (v3244(VarCurr)<->v3245(VarCurr)&v3247(VarCurr)).
% 93.85/93.27  all VarCurr (v3247(VarCurr)<->v3228(VarCurr)|v3231(VarCurr)).
% 93.85/93.27  all VarCurr (v3245(VarCurr)<->v3246(VarCurr)|v3222(VarCurr,bitIndex2)).
% 93.85/93.27  all VarCurr (-v3246(VarCurr)<->v3228(VarCurr)).
% 93.85/93.27  all VarCurr (v3240(VarCurr)<->v3241(VarCurr)&v3243(VarCurr)).
% 93.85/93.27  all VarCurr (v3243(VarCurr)<->v3226(VarCurr)|v3232(VarCurr)).
% 93.85/93.27  all VarCurr (v3241(VarCurr)<->v3242(VarCurr)|v3222(VarCurr,bitIndex3)).
% 93.85/93.27  all VarCurr (-v3242(VarCurr)<->v3226(VarCurr)).
% 93.85/93.27  all VarCurr (v3236(VarCurr)<->v3237(VarCurr)&v3239(VarCurr)).
% 93.85/93.27  all VarCurr (v3239(VarCurr)<->v3224(VarCurr)|v3233(VarCurr)).
% 93.85/93.27  all VarCurr (v3237(VarCurr)<->v3238(VarCurr)|v3222(VarCurr,bitIndex4)).
% 93.85/93.27  all VarCurr (-v3238(VarCurr)<->v3224(VarCurr)).
% 93.85/93.27  all VarCurr (v3218(VarCurr)<->v3219(VarCurr)&v3234(VarCurr)).
% 93.85/93.27  all VarCurr (v3234(VarCurr)<->v3221(VarCurr)|v3235(VarCurr)).
% 93.85/93.27  all VarCurr (-v3235(VarCurr)<->v3222(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (v3219(VarCurr)<->v3220(VarCurr)|v3222(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr (-v3220(VarCurr)<->v3221(VarCurr)).
% 93.85/93.27  all VarCurr (v3221(VarCurr)<->v3222(VarCurr,bitIndex4)|v3223(VarCurr)).
% 93.85/93.27  all VarCurr (v3223(VarCurr)<->v3224(VarCurr)&v3233(VarCurr)).
% 93.85/93.27  all VarCurr (-v3233(VarCurr)<->v3222(VarCurr,bitIndex4)).
% 93.85/93.27  all VarCurr (v3224(VarCurr)<->v3222(VarCurr,bitIndex3)|v3225(VarCurr)).
% 93.85/93.27  all VarCurr (v3225(VarCurr)<->v3226(VarCurr)&v3232(VarCurr)).
% 93.85/93.27  all VarCurr (-v3232(VarCurr)<->v3222(VarCurr,bitIndex3)).
% 93.85/93.27  all VarCurr (v3226(VarCurr)<->v3222(VarCurr,bitIndex2)|v3227(VarCurr)).
% 93.85/93.27  all VarCurr (v3227(VarCurr)<->v3228(VarCurr)&v3231(VarCurr)).
% 93.85/93.27  all VarCurr (-v3231(VarCurr)<->v3222(VarCurr,bitIndex2)).
% 93.85/93.27  all VarCurr (v3228(VarCurr)<->v3222(VarCurr,bitIndex1)|v3229(VarCurr)).
% 93.85/93.27  all VarCurr (v3229(VarCurr)<->v3222(VarCurr,bitIndex0)&v3230(VarCurr)).
% 93.85/93.27  all VarCurr (-v3230(VarCurr)<->v3222(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (-v3222(VarCurr,bitIndex5)).
% 93.85/93.27  all VarCurr B (range_4_0(B)-> (v3222(VarCurr,B)<->v3205(VarCurr,B))).
% 93.85/93.27  all VarCurr (v3214(VarCurr)<-> (v3205(VarCurr,bitIndex4)<->$F)& (v3205(VarCurr,bitIndex3)<->$F)& (v3205(VarCurr,bitIndex2)<->$F)& (v3205(VarCurr,bitIndex1)<->$F)& (v3205(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3211(VarCurr)<-> (v3212(VarCurr,bitIndex1)<->$F)& (v3212(VarCurr,bitIndex0)<->$T)).
% 93.85/93.27  all VarCurr (v3212(VarCurr,bitIndex0)<->v873(VarCurr)).
% 93.85/93.27  all VarCurr (v3212(VarCurr,bitIndex1)<->v783(VarCurr)).
% 93.85/93.27  -v3205(constB0,bitIndex4).
% 93.85/93.27  -v3205(constB0,bitIndex3).
% 93.85/93.27  -v3205(constB0,bitIndex2).
% 93.85/93.27  -v3205(constB0,bitIndex1).
% 93.85/93.27  v3205(constB0,bitIndex0).
% 93.85/93.27  all VarCurr (v3209(VarCurr)<-> (v3210(VarCurr,bitIndex1)<->$F)& (v3210(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3210(VarCurr,bitIndex0)<->v873(VarCurr)).
% 93.85/93.27  all VarCurr (v3210(VarCurr,bitIndex1)<->v783(VarCurr)).
% 93.85/93.27  all VarCurr (v316(VarCurr)<->v3195(VarCurr)|v3199(VarCurr)).
% 93.85/93.27  all VarCurr (v3199(VarCurr)<->v3103(VarCurr)&v3109(VarCurr)).
% 93.85/93.27  all VarCurr (v3195(VarCurr)<->v3196(VarCurr)|v2259(VarCurr)).
% 93.85/93.27  all VarCurr (v3196(VarCurr)<->v3197(VarCurr)&v3198(VarCurr)).
% 93.85/93.27  all VarCurr (-v3198(VarCurr)<->v1908(VarCurr)).
% 93.85/93.27  all VarCurr (v3197(VarCurr)<->v318(VarCurr)&v664(VarCurr)).
% 93.85/93.27  all VarCurr (v3109(VarCurr)<->v3111(VarCurr)).
% 93.85/93.27  all VarCurr (v3111(VarCurr)<->v3113(VarCurr)).
% 93.85/93.27  all VarCurr (v3113(VarCurr)<->v3115(VarCurr)).
% 93.85/93.27  all VarCurr (v3115(VarCurr)<->v3117(VarCurr)).
% 93.85/93.27  all VarCurr (v3117(VarCurr)<->v1918(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1918(VarCurr,bitIndex1)<->v1920(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1920(VarCurr,bitIndex1)<->v1922(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1922(VarCurr,bitIndex1)<->v1924(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1924(VarCurr,bitIndex1)<->v1926(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1926(VarCurr,bitIndex1)<->v1928(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1928(VarCurr,bitIndex1)<->v3119(VarCurr)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3150(VarNext)-> (v3119(VarNext)<->v3119(VarCurr)))).
% 93.85/93.27  all VarNext (v3150(VarNext)-> (v3119(VarNext)<->v3185(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3185(VarNext)<->v3183(VarCurr))).
% 93.85/93.27  all VarCurr (-v3121(VarCurr)-> (v3183(VarCurr)<->v3186(VarCurr))).
% 93.85/93.27  all VarCurr (v3121(VarCurr)-> (v3183(VarCurr)<->v3123(VarCurr))).
% 93.85/93.27  all VarCurr (-v3163(VarCurr)-> (v3186(VarCurr)<->v3145(VarCurr))).
% 93.85/93.27  all VarCurr (v3163(VarCurr)-> (v3186(VarCurr)<->v3187(VarCurr))).
% 93.85/93.27  all VarCurr (-v3166(VarCurr)& -v3168(VarCurr)-> (v3187(VarCurr)<->v3191(VarCurr))).
% 93.85/93.27  all VarCurr (v3168(VarCurr)-> (v3187(VarCurr)<->v3190(VarCurr))).
% 93.85/93.27  all VarCurr (v3166(VarCurr)-> (v3187(VarCurr)<->v3188(VarCurr))).
% 93.85/93.27  all VarCurr (-v3176(VarCurr)-> (v3191(VarCurr)<->v3145(VarCurr))).
% 93.85/93.27  all VarCurr (v3176(VarCurr)-> (v3191(VarCurr)<->$T)).
% 93.85/93.27  all VarCurr (-v3170(VarCurr)-> (v3190(VarCurr)<->v3145(VarCurr))).
% 93.85/93.27  all VarCurr (v3170(VarCurr)-> (v3190(VarCurr)<->$F)).
% 93.85/93.27  all VarCurr (-v3189(VarCurr)-> (v3188(VarCurr)<->$F)).
% 93.85/93.27  all VarCurr (v3189(VarCurr)-> (v3188(VarCurr)<->$T)).
% 93.85/93.27  all VarCurr (v3189(VarCurr)<-> (v3131(VarCurr)<->$T)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3150(VarNext)<->v3151(VarNext)&v3160(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3160(VarNext)<->v3158(VarCurr))).
% 93.85/93.27  all VarCurr (v3158(VarCurr)<->v3121(VarCurr)|v3161(VarCurr)).
% 93.85/93.27  all VarCurr (v3161(VarCurr)<->v3162(VarCurr)&v3182(VarCurr)).
% 93.85/93.27  all VarCurr (-v3182(VarCurr)<->v3121(VarCurr)).
% 93.85/93.27  all VarCurr (v3162(VarCurr)<->v3163(VarCurr)|v3180(VarCurr)).
% 93.85/93.27  all VarCurr (v3180(VarCurr)<->v3141(VarCurr)&v3181(VarCurr)).
% 93.85/93.27  all VarCurr (-v3181(VarCurr)<->v3143(VarCurr)).
% 93.85/93.27  all VarCurr (v3163(VarCurr)<->v3164(VarCurr)&v3143(VarCurr)).
% 93.85/93.27  all VarCurr (v3164(VarCurr)<->v3165(VarCurr)|v3174(VarCurr)).
% 93.85/93.27  all VarCurr (v3174(VarCurr)<->v3175(VarCurr)&v3179(VarCurr)).
% 93.85/93.27  all VarCurr (v3179(VarCurr)<-> (v3167(VarCurr,bitIndex2)<->$F)& (v3167(VarCurr,bitIndex1)<->$F)& (v3167(VarCurr,bitIndex0)<->$T)).
% 93.85/93.27  all VarCurr (v3175(VarCurr)<->v3176(VarCurr)|v3177(VarCurr)).
% 93.85/93.27  all VarCurr (v3177(VarCurr)<->v3141(VarCurr)&v3178(VarCurr)).
% 93.85/93.27  all VarCurr (-v3178(VarCurr)<->v3176(VarCurr)).
% 93.85/93.27  all VarCurr (v3176(VarCurr)<-> (v3131(VarCurr)<->$T)).
% 93.85/93.27  all VarCurr (v3165(VarCurr)<->v3166(VarCurr)|v3168(VarCurr)).
% 93.85/93.27  all VarCurr (v3168(VarCurr)<->v3169(VarCurr)&v3173(VarCurr)).
% 93.85/93.27  all VarCurr (v3173(VarCurr)<-> (v3167(VarCurr,bitIndex2)<->$F)& (v3167(VarCurr,bitIndex1)<->$T)& (v3167(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3169(VarCurr)<->v3170(VarCurr)|v3171(VarCurr)).
% 93.85/93.27  all VarCurr (v3171(VarCurr)<->v3141(VarCurr)&v3172(VarCurr)).
% 93.85/93.27  all VarCurr (-v3172(VarCurr)<->v3170(VarCurr)).
% 93.85/93.27  all VarCurr (v3170(VarCurr)<-> (v3131(VarCurr)<->$T)).
% 93.85/93.27  all VarCurr (v3166(VarCurr)<-> (v3167(VarCurr,bitIndex2)<->$T)& (v3167(VarCurr,bitIndex1)<->$F)& (v3167(VarCurr,bitIndex0)<->$F)).
% 93.85/93.27  all VarCurr (v3167(VarCurr,bitIndex0)<->v3129(VarCurr)).
% 93.85/93.27  all VarCurr (v3167(VarCurr,bitIndex1)<->v3127(VarCurr)).
% 93.85/93.27  all VarCurr (v3167(VarCurr,bitIndex2)<->v3125(VarCurr)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3151(VarNext)<->v3152(VarNext)&v3147(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3152(VarNext)<->v3154(VarNext))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3154(VarNext)<->v3147(VarCurr))).
% 93.85/93.27  all VarCurr (v3147(VarCurr)<->v2207(VarCurr)).
% 93.85/93.27  all VarCurr (v3145(VarCurr)<->$F).
% 93.85/93.27  all VarCurr (v3143(VarCurr)<->v2045(VarCurr)).
% 93.85/93.27  all VarCurr (v3141(VarCurr)<->$F).
% 93.85/93.27  all VarCurr (v3131(VarCurr)<->v1966(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1966(VarCurr,bitIndex1)<->v1968(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1968(VarCurr,bitIndex1)<->v1970(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1970(VarCurr,bitIndex1)<->v1972(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1972(VarCurr,bitIndex1)<->v1974(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1974(VarCurr,bitIndex1)<->v1976(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1976(VarCurr,bitIndex1)<->v1978(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1978(VarCurr,bitIndex1)<->v1980(VarCurr,bitIndex1)).
% 93.85/93.27  all VarCurr (v1980(VarCurr,bitIndex1)<->v1982(VarCurr,bitIndex1)).
% 93.85/93.27  all VarNext (v1982(VarNext,bitIndex1)<->v3133(VarNext,bitIndex1)).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3134(VarNext)-> (all B (range_63_0(B)-> (v3133(VarNext,B)<->v1982(VarCurr,B)))))).
% 93.85/93.27  all VarNext (v3134(VarNext)-> (all B (range_63_0(B)-> (v3133(VarNext,B)<->v2034(VarNext,B))))).
% 93.85/93.27  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3134(VarNext)<->v3135(VarNext))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3135(VarNext)<->v3137(VarNext)&v2013(VarNext))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3137(VarNext)<->v2028(VarNext))).
% 93.85/93.28  all VarCurr (v1987(VarCurr,bitIndex1)<->v1989(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1989(VarCurr,bitIndex1)<->v1991(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1991(VarCurr,bitIndex1)<->v1993(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1993(VarCurr,bitIndex1)<->v1995(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1995(VarCurr,bitIndex1)<->v1997(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1997(VarCurr,bitIndex1)<->v1999(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v1999(VarCurr,bitIndex1)<->v2001(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v2001(VarCurr,bitIndex1)<->v2003(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v2003(VarCurr,bitIndex1)<->v2005(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v2005(VarCurr,bitIndex1)<->v2007(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v2007(VarCurr,bitIndex1)<->v2009(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v3129(VarCurr)<->$F).
% 93.85/93.28  all VarCurr (v3127(VarCurr)<->$F).
% 93.85/93.28  all VarCurr (v3125(VarCurr)<->$T).
% 93.85/93.28  all VarCurr (v3123(VarCurr)<->$F).
% 93.85/93.28  all VarCurr (v3121(VarCurr)<->v1934(VarCurr)).
% 93.85/93.28  all VarCurr (v3103(VarCurr)<->v3105(VarCurr)).
% 93.85/93.28  all VarCurr (v3105(VarCurr)<->v3107(VarCurr)).
% 93.85/93.28  all VarCurr (v2259(VarCurr)<->v3094(VarCurr)&v1908(VarCurr)).
% 93.85/93.28  all VarCurr (v3094(VarCurr)<->v3095(VarCurr)|v3098(VarCurr)).
% 93.85/93.28  all VarCurr (v3098(VarCurr)<->v3099(VarCurr)&v3100(VarCurr)).
% 93.85/93.28  all VarCurr (v3100(VarCurr)<-> (v3101(VarCurr,bitIndex4)<->$T)& (v3101(VarCurr,bitIndex3)<->$T)& (v3101(VarCurr,bitIndex2)<->$T)& (v3101(VarCurr,bitIndex1)<->$T)& (v3101(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  all VarCurr (v3101(VarCurr,bitIndex0)<->v3054(VarCurr)).
% 93.85/93.28  all VarCurr (v3101(VarCurr,bitIndex1)<->v3049(VarCurr)).
% 93.85/93.28  all VarCurr (v3101(VarCurr,bitIndex2)<->v3044(VarCurr)).
% 93.85/93.28  all VarCurr (v3101(VarCurr,bitIndex3)<->v3039(VarCurr)).
% 93.85/93.28  all VarCurr (v3101(VarCurr,bitIndex4)<->v3012(VarCurr)).
% 93.85/93.28  all VarCurr (v3099(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarCurr (v3095(VarCurr)<->v3096(VarCurr)|v3097(VarCurr)).
% 93.85/93.28  all VarCurr (v3097(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  all VarCurr (v3096(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$F)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3079(VarNext)-> (all B (range_1_0(B)-> (v2261(VarNext,B)<->v2261(VarCurr,B)))))).
% 93.85/93.28  all VarNext (v3079(VarNext)-> (all B (range_1_0(B)-> (v2261(VarNext,B)<->v3087(VarNext,B))))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_1_0(B)-> (v3087(VarNext,B)<->v3085(VarCurr,B))))).
% 93.85/93.28  all VarCurr (-v3088(VarCurr)-> (all B (range_1_0(B)-> (v3085(VarCurr,B)<->v2263(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v3088(VarCurr)-> (all B (range_1_0(B)-> (v3085(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (v3088(VarCurr)<->v3089(VarCurr)|v3090(VarCurr)).
% 93.85/93.28  all VarCurr (-v3090(VarCurr)<->v1908(VarCurr)).
% 93.85/93.28  all VarCurr (-v3089(VarCurr)<->v12(VarCurr)).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3079(VarNext)<->v3080(VarNext))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v3080(VarNext)<->v3081(VarNext)&v288(VarNext))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v3081(VarNext)<->v1891(VarNext))).
% 93.85/93.28  all VarCurr (-v2988(VarCurr)& -v2992(VarCurr)& -v3004(VarCurr)-> (all B (range_1_0(B)-> (v2263(VarCurr,B)<->v3058(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v3004(VarCurr)-> (all B (range_1_0(B)-> (v2263(VarCurr,B)<->v3005(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2992(VarCurr)-> (all B (range_1_0(B)-> (v2263(VarCurr,B)<->v2993(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2988(VarCurr)-> (all B (range_1_0(B)-> (v2263(VarCurr,B)<->v2989(VarCurr,B))))).
% 93.85/93.28  all VarCurr (-v741(VarCurr)-> (all B (range_1_0(B)-> (v3058(VarCurr,B)<->v3059(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v741(VarCurr)-> (all B (range_1_0(B)-> (v3058(VarCurr,B)<->b01(B))))).
% 93.85/93.28  all VarCurr (-v3060(VarCurr)-> (all B (range_1_0(B)-> (v3059(VarCurr,B)<->v3061(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v3060(VarCurr)-> (all B (range_1_0(B)-> (v3059(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (-v3062(VarCurr)-> (all B (range_1_0(B)-> (v3061(VarCurr,B)<->$T)))).
% 93.85/93.28  all VarCurr (v3062(VarCurr)-> (all B (range_1_0(B)-> (v3061(VarCurr,B)<->b10(B))))).
% 93.85/93.28  all VarCurr (v3062(VarCurr)<->v3064(VarCurr)|v3066(VarCurr)).
% 93.85/93.28  all VarCurr (v3066(VarCurr)<->v3067(VarCurr)&v3065(VarCurr,bitIndex4)).
% 93.85/93.28  all VarCurr (v3067(VarCurr)<->v3068(VarCurr)|v3069(VarCurr)).
% 93.85/93.28  all VarCurr (v3069(VarCurr)<->v3070(VarCurr)&v3065(VarCurr,bitIndex3)).
% 93.85/93.28  all VarCurr (v3070(VarCurr)<->v3071(VarCurr)|v3072(VarCurr)).
% 93.85/93.28  all VarCurr (v3072(VarCurr)<->v3073(VarCurr)&v3065(VarCurr,bitIndex2)).
% 93.85/93.28  all VarCurr (v3073(VarCurr)<->v3074(VarCurr)|v3075(VarCurr)).
% 93.85/93.28  all VarCurr (v3075(VarCurr)<->v3076(VarCurr)&v3065(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (-v3076(VarCurr)<->v3065(VarCurr,bitIndex0)).
% 93.85/93.28  all VarCurr (-v3074(VarCurr)<->v3065(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (-v3071(VarCurr)<->v3065(VarCurr,bitIndex2)).
% 93.85/93.28  all VarCurr (-v3068(VarCurr)<->v3065(VarCurr,bitIndex3)).
% 93.85/93.28  all VarCurr (-v3064(VarCurr)<->v3065(VarCurr,bitIndex4)).
% 93.85/93.28  all VarCurr (v3065(VarCurr,bitIndex0)<->v3054(VarCurr)).
% 93.85/93.28  all VarCurr (v3065(VarCurr,bitIndex1)<->v3049(VarCurr)).
% 93.85/93.28  all VarCurr (v3065(VarCurr,bitIndex2)<->v3044(VarCurr)).
% 93.85/93.28  all VarCurr (v3065(VarCurr,bitIndex3)<->v3039(VarCurr)).
% 93.85/93.28  all VarCurr (v3065(VarCurr,bitIndex4)<->v3012(VarCurr)).
% 93.85/93.28  all VarCurr (v3060(VarCurr)<-> (v2290(VarCurr,bitIndex4)<->$F)& (v2290(VarCurr,bitIndex3)<->$F)& (v2290(VarCurr,bitIndex2)<->$F)& (v2290(VarCurr,bitIndex1)<->$F)& (v2290(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarCurr (v3057(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  all VarCurr (-v741(VarCurr)-> (all B (range_1_0(B)-> (v3005(VarCurr,B)<->v3006(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v741(VarCurr)-> (all B (range_1_0(B)-> (v3005(VarCurr,B)<->b01(B))))).
% 93.85/93.28  all VarCurr (-v3007(VarCurr)-> (all B (range_1_0(B)-> (v3006(VarCurr,B)<->v3008(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v3007(VarCurr)-> (all B (range_1_0(B)-> (v3006(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (-v3009(VarCurr)-> (all B (range_1_0(B)-> (v3008(VarCurr,B)<->b10(B))))).
% 93.85/93.28  all VarCurr (v3009(VarCurr)-> (all B (range_1_0(B)-> (v3008(VarCurr,B)<->$T)))).
% 93.85/93.28  all VarCurr (v3009(VarCurr)<-> (v3010(VarCurr,bitIndex4)<->$T)& (v3010(VarCurr,bitIndex3)<->$T)& (v3010(VarCurr,bitIndex2)<->$T)& (v3010(VarCurr,bitIndex1)<->$T)& (v3010(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  b11111(bitIndex4).
% 93.85/93.28  b11111(bitIndex3).
% 93.85/93.28  b11111(bitIndex2).
% 93.85/93.28  b11111(bitIndex1).
% 93.85/93.28  b11111(bitIndex0).
% 93.85/93.28  all VarCurr (v3010(VarCurr,bitIndex0)<->v3054(VarCurr)).
% 93.85/93.28  all VarCurr (v3010(VarCurr,bitIndex1)<->v3049(VarCurr)).
% 93.85/93.28  all VarCurr (v3010(VarCurr,bitIndex2)<->v3044(VarCurr)).
% 93.85/93.28  all VarCurr (v3010(VarCurr,bitIndex3)<->v3039(VarCurr)).
% 93.85/93.28  all VarCurr (v3010(VarCurr,bitIndex4)<->v3012(VarCurr)).
% 93.85/93.28  all VarCurr (v3054(VarCurr)<->v3055(VarCurr)&v3056(VarCurr)).
% 93.85/93.28  all VarCurr (v3056(VarCurr)<->v2290(VarCurr,bitIndex0)|v2927(VarCurr,bitIndex0)).
% 93.85/93.28  all VarCurr (v3055(VarCurr)<->v2898(VarCurr)|v2981(VarCurr)).
% 93.85/93.28  all VarCurr (v3049(VarCurr)<->v3050(VarCurr)&v3053(VarCurr)).
% 93.85/93.28  all VarCurr (v3053(VarCurr)<->v3021(VarCurr)|v3022(VarCurr)).
% 93.85/93.28  all VarCurr (v3050(VarCurr)<->v3051(VarCurr)|v3052(VarCurr)).
% 93.85/93.28  all VarCurr (-v3052(VarCurr)<->v3022(VarCurr)).
% 93.85/93.28  all VarCurr (-v3051(VarCurr)<->v3021(VarCurr)).
% 93.85/93.28  all VarCurr (v3044(VarCurr)<->v3045(VarCurr)&v3048(VarCurr)).
% 93.85/93.28  all VarCurr (v3048(VarCurr)<->v3019(VarCurr)|v3026(VarCurr)).
% 93.85/93.28  all VarCurr (v3045(VarCurr)<->v3046(VarCurr)|v3047(VarCurr)).
% 93.85/93.28  all VarCurr (-v3047(VarCurr)<->v3026(VarCurr)).
% 93.85/93.28  all VarCurr (-v3046(VarCurr)<->v3019(VarCurr)).
% 93.85/93.28  all VarCurr (v3039(VarCurr)<->v3040(VarCurr)&v3043(VarCurr)).
% 93.85/93.28  all VarCurr (v3043(VarCurr)<->v3017(VarCurr)|v3030(VarCurr)).
% 93.85/93.28  all VarCurr (v3040(VarCurr)<->v3041(VarCurr)|v3042(VarCurr)).
% 93.85/93.28  all VarCurr (-v3042(VarCurr)<->v3030(VarCurr)).
% 93.85/93.28  all VarCurr (-v3041(VarCurr)<->v3017(VarCurr)).
% 93.85/93.28  all VarCurr (v3012(VarCurr)<->v3013(VarCurr)&v3038(VarCurr)).
% 93.85/93.28  all VarCurr (v3038(VarCurr)<->v3015(VarCurr)|v3035(VarCurr)).
% 93.85/93.28  all VarCurr (v3013(VarCurr)<->v3014(VarCurr)|v3034(VarCurr)).
% 93.85/93.28  all VarCurr (-v3034(VarCurr)<->v3035(VarCurr)).
% 93.85/93.28  all VarCurr (v3035(VarCurr)<->v3036(VarCurr)&v3037(VarCurr)).
% 93.85/93.28  all VarCurr (v3037(VarCurr)<->v2290(VarCurr,bitIndex4)|v2927(VarCurr,bitIndex4)).
% 93.85/93.28  all VarCurr (v3036(VarCurr)<->v2884(VarCurr)|v2967(VarCurr)).
% 93.85/93.28  all VarCurr (-v3014(VarCurr)<->v3015(VarCurr)).
% 93.85/93.28  all VarCurr (v3015(VarCurr)<->v3016(VarCurr)|v3033(VarCurr)).
% 93.85/93.28  all VarCurr (v3033(VarCurr)<->v2290(VarCurr,bitIndex3)&v2927(VarCurr,bitIndex3)).
% 93.85/93.28  all VarCurr (v3016(VarCurr)<->v3017(VarCurr)&v3030(VarCurr)).
% 93.85/93.28  all VarCurr (v3030(VarCurr)<->v3031(VarCurr)&v3032(VarCurr)).
% 93.85/93.28  all VarCurr (v3032(VarCurr)<->v2290(VarCurr,bitIndex3)|v2927(VarCurr,bitIndex3)).
% 93.85/93.28  all VarCurr (v3031(VarCurr)<->v2889(VarCurr)|v2972(VarCurr)).
% 93.85/93.28  all VarCurr (v3017(VarCurr)<->v3018(VarCurr)|v3029(VarCurr)).
% 93.85/93.28  all VarCurr (v3029(VarCurr)<->v2290(VarCurr,bitIndex2)&v2927(VarCurr,bitIndex2)).
% 93.85/93.28  all VarCurr (v3018(VarCurr)<->v3019(VarCurr)&v3026(VarCurr)).
% 93.85/93.28  all VarCurr (v3026(VarCurr)<->v3027(VarCurr)&v3028(VarCurr)).
% 93.85/93.28  all VarCurr (v3028(VarCurr)<->v2290(VarCurr,bitIndex2)|v2927(VarCurr,bitIndex2)).
% 93.85/93.28  all VarCurr (v3027(VarCurr)<->v2894(VarCurr)|v2977(VarCurr)).
% 93.85/93.28  all VarCurr (v3019(VarCurr)<->v3020(VarCurr)|v3025(VarCurr)).
% 93.85/93.28  all VarCurr (v3025(VarCurr)<->v2290(VarCurr,bitIndex1)&v2927(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v3020(VarCurr)<->v3021(VarCurr)&v3022(VarCurr)).
% 93.85/93.28  all VarCurr (v3022(VarCurr)<->v3023(VarCurr)&v3024(VarCurr)).
% 93.85/93.28  all VarCurr (v3024(VarCurr)<->v2290(VarCurr,bitIndex1)|v2927(VarCurr,bitIndex1)).
% 93.85/93.28  all VarCurr (v3023(VarCurr)<->v2899(VarCurr)|v2982(VarCurr)).
% 93.85/93.28  all VarCurr (v3021(VarCurr)<->v2290(VarCurr,bitIndex0)&v2927(VarCurr,bitIndex0)).
% 93.85/93.28  all VarCurr (v3007(VarCurr)<-> (v2290(VarCurr,bitIndex4)<->$F)& (v2290(VarCurr,bitIndex3)<->$F)& (v2290(VarCurr,bitIndex2)<->$F)& (v2290(VarCurr,bitIndex1)<->$F)& (v2290(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarCurr (v3004(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarCurr (-v2994(VarCurr)-> (all B (range_1_0(B)-> (v2993(VarCurr,B)<->v2996(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2994(VarCurr)-> (all B (range_1_0(B)-> (v2993(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (-v2997(VarCurr)-> (all B (range_1_0(B)-> (v2996(VarCurr,B)<->b01(B))))).
% 93.85/93.28  all VarCurr (v2997(VarCurr)-> (all B (range_1_0(B)-> (v2996(VarCurr,B)<->b10(B))))).
% 93.85/93.28  all VarCurr (v2997(VarCurr)<->v320(VarCurr)&v2998(VarCurr)).
% 93.85/93.28  all VarCurr (-v2998(VarCurr)<->v3000(VarCurr)).
% 93.85/93.28  all VarCurr (v3000(VarCurr)<->v3001(VarCurr)&v2884(VarCurr)).
% 93.85/93.28  all VarCurr (v3001(VarCurr)<->v3002(VarCurr)&v2889(VarCurr)).
% 93.85/93.28  all VarCurr (v3002(VarCurr)<->v3003(VarCurr)&v2894(VarCurr)).
% 93.85/93.28  all VarCurr (v3003(VarCurr)<->v2898(VarCurr)&v2899(VarCurr)).
% 93.85/93.28  all VarCurr (v2994(VarCurr)<->v320(VarCurr)&v2995(VarCurr)).
% 93.85/93.28  all VarCurr (v2995(VarCurr)<-> (v2290(VarCurr,bitIndex4)<->$F)& (v2290(VarCurr,bitIndex3)<->$F)& (v2290(VarCurr,bitIndex2)<->$F)& (v2290(VarCurr,bitIndex1)<->$F)& (v2290(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarCurr (v2992(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$F)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.85/93.28  all VarCurr (-v2265(VarCurr)-> (all B (range_1_0(B)-> (v2989(VarCurr,B)<->v2990(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2265(VarCurr)-> (all B (range_1_0(B)-> (v2989(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (-v741(VarCurr)-> (all B (range_1_0(B)-> (v2990(VarCurr,B)<->v2991(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v741(VarCurr)-> (all B (range_1_0(B)-> (v2990(VarCurr,B)<->b01(B))))).
% 93.85/93.28  all VarCurr (-v2275(VarCurr)-> (all B (range_1_0(B)-> (v2991(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (v2275(VarCurr)-> (all B (range_1_0(B)-> (v2991(VarCurr,B)<->b10(B))))).
% 93.85/93.28  all VarCurr (v2988(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$F)& (v2261(VarCurr,bitIndex0)<->$F)).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2940(VarNext)-> (all B (range_4_0(B)-> (v2927(VarNext,B)<->v2927(VarCurr,B)))))).
% 93.85/93.28  all VarNext (v2940(VarNext)-> (all B (range_4_0(B)-> (v2927(VarNext,B)<->v2957(VarNext,B))))).
% 93.85/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v2957(VarNext,B)<->v2955(VarCurr,B))))).
% 93.85/93.28  all VarCurr (-v2952(VarCurr)-> (all B (range_4_0(B)-> (v2955(VarCurr,B)<->v2958(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2952(VarCurr)-> (all B (range_4_0(B)-> (v2955(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (-v2929(VarCurr)-> (all B (range_4_0(B)-> (v2958(VarCurr,B)<->v2959(VarCurr,B))))).
% 93.85/93.28  all VarCurr (v2929(VarCurr)-> (all B (range_4_0(B)-> (v2958(VarCurr,B)<->$F)))).
% 93.85/93.28  all VarCurr (v2959(VarCurr,bitIndex0)<->v2981(VarCurr)).
% 93.85/93.28  all VarCurr (v2959(VarCurr,bitIndex1)<->v2979(VarCurr)).
% 93.91/93.28  all VarCurr (v2959(VarCurr,bitIndex2)<->v2974(VarCurr)).
% 93.91/93.28  all VarCurr (v2959(VarCurr,bitIndex3)<->v2969(VarCurr)).
% 93.91/93.28  all VarCurr (v2959(VarCurr,bitIndex4)<->v2961(VarCurr)).
% 93.91/93.28  all VarCurr (v2979(VarCurr)<->v2980(VarCurr)&v2983(VarCurr)).
% 93.91/93.28  all VarCurr (v2983(VarCurr)<->v2927(VarCurr,bitIndex0)|v2927(VarCurr,bitIndex1)).
% 93.91/93.28  all VarCurr (v2980(VarCurr)<->v2981(VarCurr)|v2982(VarCurr)).
% 93.91/93.28  all VarCurr (-v2982(VarCurr)<->v2927(VarCurr,bitIndex1)).
% 93.91/93.28  all VarCurr (-v2981(VarCurr)<->v2927(VarCurr,bitIndex0)).
% 93.91/93.28  all VarCurr (v2974(VarCurr)<->v2975(VarCurr)&v2978(VarCurr)).
% 93.91/93.28  all VarCurr (v2978(VarCurr)<->v2966(VarCurr)|v2927(VarCurr,bitIndex2)).
% 93.91/93.28  all VarCurr (v2975(VarCurr)<->v2976(VarCurr)|v2977(VarCurr)).
% 93.91/93.28  all VarCurr (-v2977(VarCurr)<->v2927(VarCurr,bitIndex2)).
% 93.91/93.28  all VarCurr (-v2976(VarCurr)<->v2966(VarCurr)).
% 93.91/93.28  all VarCurr (v2969(VarCurr)<->v2970(VarCurr)&v2973(VarCurr)).
% 93.91/93.28  all VarCurr (v2973(VarCurr)<->v2965(VarCurr)|v2927(VarCurr,bitIndex3)).
% 93.91/93.28  all VarCurr (v2970(VarCurr)<->v2971(VarCurr)|v2972(VarCurr)).
% 93.91/93.28  all VarCurr (-v2972(VarCurr)<->v2927(VarCurr,bitIndex3)).
% 93.91/93.28  all VarCurr (-v2971(VarCurr)<->v2965(VarCurr)).
% 93.91/93.28  all VarCurr (v2961(VarCurr)<->v2962(VarCurr)&v2968(VarCurr)).
% 93.91/93.28  all VarCurr (v2968(VarCurr)<->v2964(VarCurr)|v2927(VarCurr,bitIndex4)).
% 93.91/93.28  all VarCurr (v2962(VarCurr)<->v2963(VarCurr)|v2967(VarCurr)).
% 93.91/93.28  all VarCurr (-v2967(VarCurr)<->v2927(VarCurr,bitIndex4)).
% 93.91/93.28  all VarCurr (-v2963(VarCurr)<->v2964(VarCurr)).
% 93.91/93.28  all VarCurr (v2964(VarCurr)<->v2965(VarCurr)&v2927(VarCurr,bitIndex3)).
% 93.91/93.28  all VarCurr (v2965(VarCurr)<->v2966(VarCurr)&v2927(VarCurr,bitIndex2)).
% 93.91/93.28  all VarCurr (v2966(VarCurr)<->v2927(VarCurr,bitIndex0)&v2927(VarCurr,bitIndex1)).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2940(VarNext)<->v2941(VarNext)&v2948(VarNext))).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2948(VarNext)<->v2946(VarCurr))).
% 93.91/93.28  all VarCurr (v2946(VarCurr)<->v2949(VarCurr)|v2952(VarCurr)).
% 93.91/93.28  all VarCurr (v2952(VarCurr)<->v2953(VarCurr)|v2954(VarCurr)).
% 93.91/93.28  all VarCurr (-v2954(VarCurr)<->v1908(VarCurr)).
% 93.91/93.28  all VarCurr (-v2953(VarCurr)<->v12(VarCurr)).
% 93.91/93.28  all VarCurr (v2949(VarCurr)<->v2950(VarCurr)|v2929(VarCurr)).
% 93.91/93.28  all VarCurr (v2950(VarCurr)<->v2265(VarCurr)&v2951(VarCurr)).
% 93.91/93.28  all VarCurr (v2951(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$F)).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2941(VarNext)<->v2942(VarNext)&v288(VarNext))).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2942(VarNext)<->v1891(VarNext))).
% 93.91/93.28  all B (range_4_0(B)-> (v2927(constB0,B)<->$F)).
% 93.91/93.28  all VarCurr (v2929(VarCurr)<->v2931(VarCurr)|v2933(VarCurr)).
% 93.91/93.28  all VarCurr (v2933(VarCurr)<->v2934(VarCurr)&v2937(VarCurr)).
% 93.91/93.28  all VarCurr (v2937(VarCurr)<-> (v2290(VarCurr,bitIndex4)<->$F)& (v2290(VarCurr,bitIndex3)<->$F)& (v2290(VarCurr,bitIndex2)<->$F)& (v2290(VarCurr,bitIndex1)<->$F)& (v2290(VarCurr,bitIndex0)<->$F)).
% 93.91/93.28  all VarCurr (v2934(VarCurr)<->v2935(VarCurr)|v2936(VarCurr)).
% 93.91/93.28  all VarCurr (v2936(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.91/93.28  all VarCurr (v2935(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$T)& (v2261(VarCurr,bitIndex0)<->$F)).
% 93.91/93.28  all VarCurr (v2931(VarCurr)<->v2932(VarCurr)&v320(VarCurr)).
% 93.91/93.28  all VarCurr (v2932(VarCurr)<-> (v2261(VarCurr,bitIndex1)<->$F)& (v2261(VarCurr,bitIndex0)<->$T)).
% 93.91/93.28  all B (range_1_0(B)-> (v2261(constB0,B)<->$F)).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2855(VarNext)-> (all B (range_4_0(B)-> (v2290(VarNext,B)<->v2290(VarCurr,B)))))).
% 93.91/93.28  all VarNext (v2855(VarNext)-> (all B (range_4_0(B)-> (v2290(VarNext,B)<->v2874(VarNext,B))))).
% 93.91/93.28  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v2874(VarNext,B)<->v2872(VarCurr,B))))).
% 93.91/93.28  all VarCurr (-v2869(VarCurr)-> (all B (range_4_0(B)-> (v2872(VarCurr,B)<->v2875(VarCurr,B))))).
% 93.91/93.28  all VarCurr (v2869(VarCurr)-> (all B (range_4_0(B)-> (v2872(VarCurr,B)<->$F)))).
% 93.91/93.28  all VarCurr (-v2867(VarCurr)-> (all B (range_4_0(B)-> (v2875(VarCurr,B)<->v2901(VarCurr,B))))).
% 93.91/93.28  all VarCurr (v2867(VarCurr)-> (all B (range_4_0(B)-> (v2875(VarCurr,B)<->v2876(VarCurr,B))))).
% 93.91/93.28  all VarCurr (v2901(VarCurr,bitIndex0)<->v2898(VarCurr)).
% 93.91/93.29  all VarCurr (v2901(VarCurr,bitIndex1)<->v2921(VarCurr)).
% 93.91/93.29  all VarCurr (v2901(VarCurr,bitIndex2)<->v2917(VarCurr)).
% 93.91/93.29  all VarCurr (v2901(VarCurr,bitIndex3)<->v2913(VarCurr)).
% 93.91/93.29  all VarCurr (v2901(VarCurr,bitIndex4)<->v2903(VarCurr)).
% 93.91/93.29  all VarCurr (v2921(VarCurr)<->v2922(VarCurr)&v2923(VarCurr)).
% 93.91/93.29  all VarCurr (v2923(VarCurr)<->v2290(VarCurr,bitIndex0)|v2899(VarCurr)).
% 93.91/93.29  all VarCurr (v2922(VarCurr)<->v2898(VarCurr)|v2290(VarCurr,bitIndex1)).
% 93.91/93.29  all VarCurr (v2917(VarCurr)<->v2918(VarCurr)&v2920(VarCurr)).
% 93.91/93.29  all VarCurr (v2920(VarCurr)<->v2894(VarCurr)|v2910(VarCurr)).
% 93.91/93.29  all VarCurr (v2918(VarCurr)<->v2290(VarCurr,bitIndex2)|v2919(VarCurr)).
% 93.91/93.29  all VarCurr (-v2919(VarCurr)<->v2910(VarCurr)).
% 93.91/93.29  all VarCurr (v2913(VarCurr)<->v2914(VarCurr)&v2916(VarCurr)).
% 93.91/93.29  all VarCurr (v2916(VarCurr)<->v2889(VarCurr)|v2908(VarCurr)).
% 93.91/93.29  all VarCurr (v2914(VarCurr)<->v2290(VarCurr,bitIndex3)|v2915(VarCurr)).
% 93.91/93.29  all VarCurr (-v2915(VarCurr)<->v2908(VarCurr)).
% 93.91/93.29  all VarCurr (v2903(VarCurr)<->v2904(VarCurr)&v2912(VarCurr)).
% 93.91/93.29  all VarCurr (v2912(VarCurr)<->v2884(VarCurr)|v2906(VarCurr)).
% 93.91/93.29  all VarCurr (v2904(VarCurr)<->v2290(VarCurr,bitIndex4)|v2905(VarCurr)).
% 93.91/93.29  all VarCurr (-v2905(VarCurr)<->v2906(VarCurr)).
% 93.91/93.29  all VarCurr (v2906(VarCurr)<->v2290(VarCurr,bitIndex3)|v2907(VarCurr)).
% 93.91/93.29  all VarCurr (v2907(VarCurr)<->v2889(VarCurr)&v2908(VarCurr)).
% 93.91/93.29  all VarCurr (v2908(VarCurr)<->v2290(VarCurr,bitIndex2)|v2909(VarCurr)).
% 93.91/93.29  all VarCurr (v2909(VarCurr)<->v2894(VarCurr)&v2910(VarCurr)).
% 93.91/93.29  all VarCurr (v2910(VarCurr)<->v2290(VarCurr,bitIndex1)|v2911(VarCurr)).
% 93.91/93.29  all VarCurr (v2911(VarCurr)<->v2290(VarCurr,bitIndex0)&v2899(VarCurr)).
% 93.91/93.29  all VarCurr (v2876(VarCurr,bitIndex0)<->v2898(VarCurr)).
% 93.91/93.29  all VarCurr (v2876(VarCurr,bitIndex1)<->v2896(VarCurr)).
% 93.91/93.29  all VarCurr (v2876(VarCurr,bitIndex2)<->v2891(VarCurr)).
% 93.91/93.29  all VarCurr (v2876(VarCurr,bitIndex3)<->v2886(VarCurr)).
% 93.91/93.29  all VarCurr (v2876(VarCurr,bitIndex4)<->v2878(VarCurr)).
% 93.91/93.29  all VarCurr (v2896(VarCurr)<->v2897(VarCurr)&v2900(VarCurr)).
% 93.91/93.29  all VarCurr (v2900(VarCurr)<->v2290(VarCurr,bitIndex0)|v2290(VarCurr,bitIndex1)).
% 93.91/93.29  all VarCurr (v2897(VarCurr)<->v2898(VarCurr)|v2899(VarCurr)).
% 93.91/93.29  all VarCurr (-v2899(VarCurr)<->v2290(VarCurr,bitIndex1)).
% 93.91/93.29  all VarCurr (-v2898(VarCurr)<->v2290(VarCurr,bitIndex0)).
% 93.91/93.29  all VarCurr (v2891(VarCurr)<->v2892(VarCurr)&v2895(VarCurr)).
% 93.91/93.29  all VarCurr (v2895(VarCurr)<->v2883(VarCurr)|v2290(VarCurr,bitIndex2)).
% 93.91/93.29  all VarCurr (v2892(VarCurr)<->v2893(VarCurr)|v2894(VarCurr)).
% 93.91/93.29  all VarCurr (-v2894(VarCurr)<->v2290(VarCurr,bitIndex2)).
% 93.91/93.29  all VarCurr (-v2893(VarCurr)<->v2883(VarCurr)).
% 93.91/93.29  all VarCurr (v2886(VarCurr)<->v2887(VarCurr)&v2890(VarCurr)).
% 93.91/93.29  all VarCurr (v2890(VarCurr)<->v2882(VarCurr)|v2290(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (v2887(VarCurr)<->v2888(VarCurr)|v2889(VarCurr)).
% 93.91/93.29  all VarCurr (-v2889(VarCurr)<->v2290(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (-v2888(VarCurr)<->v2882(VarCurr)).
% 93.91/93.29  all VarCurr (v2878(VarCurr)<->v2879(VarCurr)&v2885(VarCurr)).
% 93.91/93.29  all VarCurr (v2885(VarCurr)<->v2881(VarCurr)|v2290(VarCurr,bitIndex4)).
% 93.91/93.29  all VarCurr (v2879(VarCurr)<->v2880(VarCurr)|v2884(VarCurr)).
% 93.91/93.29  all VarCurr (-v2884(VarCurr)<->v2290(VarCurr,bitIndex4)).
% 93.91/93.29  all VarCurr (-v2880(VarCurr)<->v2881(VarCurr)).
% 93.91/93.29  all VarCurr (v2881(VarCurr)<->v2882(VarCurr)&v2290(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (v2882(VarCurr)<->v2883(VarCurr)&v2290(VarCurr,bitIndex2)).
% 93.91/93.29  all VarCurr (v2883(VarCurr)<->v2290(VarCurr,bitIndex0)&v2290(VarCurr,bitIndex1)).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2855(VarNext)<->v2856(VarNext)&v2863(VarNext))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2863(VarNext)<->v2861(VarCurr))).
% 93.91/93.29  all VarCurr (v2861(VarCurr)<->v2864(VarCurr)|v2869(VarCurr)).
% 93.91/93.29  all VarCurr (v2869(VarCurr)<->v2870(VarCurr)|v2871(VarCurr)).
% 93.91/93.29  all VarCurr (-v2871(VarCurr)<->v1908(VarCurr)).
% 93.91/93.29  all VarCurr (-v2870(VarCurr)<->v12(VarCurr)).
% 93.91/93.29  all VarCurr (v2864(VarCurr)<->v2865(VarCurr)|v2867(VarCurr)).
% 93.91/93.29  all VarCurr (v2867(VarCurr)<->v2275(VarCurr)&v2868(VarCurr)).
% 93.91/93.29  all VarCurr (-v2868(VarCurr)<->v2292(VarCurr)).
% 93.91/93.29  all VarCurr (v2865(VarCurr)<->v2866(VarCurr)&v2292(VarCurr)).
% 93.91/93.29  all VarCurr (-v2866(VarCurr)<->v2275(VarCurr)).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2856(VarNext)<->v2857(VarNext)&v288(VarNext))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2857(VarNext)<->v1891(VarNext))).
% 93.91/93.29  all B (range_4_0(B)-> (v2290(constB0,B)<->$F)).
% 93.91/93.29  all VarCurr (v2292(VarCurr)<->v2294(VarCurr)&v2852(VarCurr)).
% 93.91/93.29  all VarCurr (v2852(VarCurr)<-> (v2775(VarCurr)<->$T)).
% 93.91/93.29  all VarCurr (v2775(VarCurr)<->v2777(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (v2777(VarCurr,bitIndex3)<->v2779(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (v2779(VarCurr,bitIndex3)<->v2781(VarCurr,bitIndex3)).
% 93.91/93.29  all VarNext (v2781(VarNext,bitIndex3)<->v2836(VarNext,bitIndex3)).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2837(VarNext)-> (all B (range_3_0(B)-> (v2836(VarNext,B)<->v2781(VarCurr,B)))))).
% 93.91/93.29  all VarNext (v2837(VarNext)-> (all B (range_3_0(B)-> (v2836(VarNext,B)<->v2847(VarNext,B))))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v2847(VarNext,B)<->v2845(VarCurr,B))))).
% 93.91/93.29  all VarCurr (-v2848(VarCurr)-> (all B (range_3_0(B)-> (v2845(VarCurr,B)<->v2785(VarCurr,B))))).
% 93.91/93.29  all VarCurr (v2848(VarCurr)-> (all B (range_3_0(B)-> (v2845(VarCurr,B)<->$F)))).
% 93.91/93.29  all VarCurr (-v2848(VarCurr)<->v2783(VarCurr)).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2837(VarNext)<->v2838(VarNext))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2838(VarNext)<->v2839(VarNext)&v2834(VarNext))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2839(VarNext)<->v2841(VarNext))).
% 93.91/93.29  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2841(VarNext)<->v2834(VarCurr))).
% 93.91/93.29  all VarCurr (v2834(VarCurr)<->v195(VarCurr)).
% 93.91/93.29  all VarCurr (v2785(VarCurr,bitIndex3)<->v2832(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (-v2787(VarCurr)-> (all B (range_3_0(B)-> (v2832(VarCurr,B)<->v2793(VarCurr,B))))).
% 93.91/93.29  all VarCurr (v2787(VarCurr)-> (all B (range_3_0(B)-> (v2832(VarCurr,B)<->b0011(B))))).
% 93.91/93.29  all VarCurr (v2793(VarCurr,bitIndex3)<->v2804(VarCurr,bitIndex3)).
% 93.91/93.29  all VarCurr (-v2805(VarCurr)-> (all B (range_3_0(B)-> (v2804(VarCurr,B)<->$F)))).
% 93.91/93.29  all VarCurr (v2805(VarCurr)-> (all B (range_3_0(B)-> (v2804(VarCurr,B)<->v2828(VarCurr,B))))).
% 93.91/93.29  all VarCurr (-v2810(VarCurr)& -v2812(VarCurr)& -v2815(VarCurr)& -v2822(VarCurr)& -v2823(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->v2831(VarCurr,B))))).
% 93.91/93.29  all VarCurr (v2823(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->v2830(VarCurr,B))))).
% 93.91/93.29  all VarCurr (v2822(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->b0100(B))))).
% 93.91/93.29  all VarCurr (v2815(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->$F)))).
% 93.91/93.29  all VarCurr (v2812(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->v2829(VarCurr,B))))).
% 93.91/93.29  all VarCurr (v2810(VarCurr)-> (all B (range_3_0(B)-> (v2828(VarCurr,B)<->b0010(B))))).
% 93.91/93.29  all VarCurr (-v2803(VarCurr)-> (all B (range_3_0(B)-> (v2831(VarCurr,B)<->b1001(B))))).
% 93.91/93.29  all VarCurr (v2803(VarCurr)-> (all B (range_3_0(B)-> (v2831(VarCurr,B)<->b1000(B))))).
% 93.91/93.29  all VarCurr (-v2825(VarCurr)-> (all B (range_3_0(B)-> (v2830(VarCurr,B)<->b1010(B))))).
% 93.91/93.29  all VarCurr (v2825(VarCurr)-> (all B (range_3_0(B)-> (v2830(VarCurr,B)<->b1011(B))))).
% 93.91/93.29  all VarCurr (-v2803(VarCurr)-> (all B (range_3_0(B)-> (v2829(VarCurr,B)<->$F)))).
% 93.91/93.29  all VarCurr (v2803(VarCurr)-> (all B (range_3_0(B)-> (v2829(VarCurr,B)<->b0001(B))))).
% 93.91/93.29  all VarCurr (v2805(VarCurr)<->v2806(VarCurr)|v2827(VarCurr)).
% 93.91/93.29  all VarCurr (v2827(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$T)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1001010(bitIndex6).
% 93.91/93.29  -b1001010(bitIndex5).
% 93.91/93.29  -b1001010(bitIndex4).
% 93.91/93.29  b1001010(bitIndex3).
% 93.91/93.29  -b1001010(bitIndex2).
% 93.91/93.29  b1001010(bitIndex1).
% 93.91/93.29  -b1001010(bitIndex0).
% 93.91/93.29  all VarCurr (v2806(VarCurr)<->v2807(VarCurr)|v2823(VarCurr)).
% 93.91/93.29  all VarCurr (v2823(VarCurr)<->v2824(VarCurr)&v2750(VarCurr)).
% 93.91/93.29  all VarCurr (v2824(VarCurr)<->v2825(VarCurr)|v2826(VarCurr)).
% 93.91/93.29  all VarCurr (v2826(VarCurr)<-> (v145(VarCurr,bitIndex2)<->$T)& (v145(VarCurr,bitIndex1)<->$T)& (v145(VarCurr,bitIndex0)<->$T)).
% 93.91/93.29  all VarCurr (v2825(VarCurr)<-> (v145(VarCurr,bitIndex2)<->$F)& (v145(VarCurr,bitIndex1)<->$F)& (v145(VarCurr,bitIndex0)<->$T)).
% 93.91/93.29  all VarCurr (v2807(VarCurr)<->v2808(VarCurr)|v2822(VarCurr)).
% 93.91/93.29  all VarCurr (v2822(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$T)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$T)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1111010(bitIndex6).
% 93.91/93.29  b1111010(bitIndex5).
% 93.91/93.29  b1111010(bitIndex4).
% 93.91/93.29  b1111010(bitIndex3).
% 93.91/93.29  -b1111010(bitIndex2).
% 93.91/93.29  b1111010(bitIndex1).
% 93.91/93.29  -b1111010(bitIndex0).
% 93.91/93.29  all VarCurr (v2808(VarCurr)<->v2809(VarCurr)|v2815(VarCurr)).
% 93.91/93.29  all VarCurr (v2815(VarCurr)<->v2816(VarCurr)|v2821(VarCurr)).
% 93.91/93.29  all VarCurr (v2821(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$T)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1110000(bitIndex6).
% 93.91/93.29  b1110000(bitIndex5).
% 93.91/93.29  b1110000(bitIndex4).
% 93.91/93.29  -b1110000(bitIndex3).
% 93.91/93.29  -b1110000(bitIndex2).
% 93.91/93.29  -b1110000(bitIndex1).
% 93.91/93.29  -b1110000(bitIndex0).
% 93.91/93.29  all VarCurr (v2816(VarCurr)<->v2817(VarCurr)|v2820(VarCurr)).
% 93.91/93.29  all VarCurr (v2820(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$T)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1010000(bitIndex6).
% 93.91/93.29  -b1010000(bitIndex5).
% 93.91/93.29  b1010000(bitIndex4).
% 93.91/93.29  -b1010000(bitIndex3).
% 93.91/93.29  -b1010000(bitIndex2).
% 93.91/93.29  -b1010000(bitIndex1).
% 93.91/93.29  -b1010000(bitIndex0).
% 93.91/93.29  all VarCurr (v2817(VarCurr)<->v2818(VarCurr)|v2819(VarCurr)).
% 93.91/93.29  all VarCurr (v2819(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$T)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1111000(bitIndex6).
% 93.91/93.29  b1111000(bitIndex5).
% 93.91/93.29  b1111000(bitIndex4).
% 93.91/93.29  b1111000(bitIndex3).
% 93.91/93.29  -b1111000(bitIndex2).
% 93.91/93.29  -b1111000(bitIndex1).
% 93.91/93.29  -b1111000(bitIndex0).
% 93.91/93.29  all VarCurr (v2818(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$T)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1011000(bitIndex6).
% 93.91/93.29  -b1011000(bitIndex5).
% 93.91/93.29  b1011000(bitIndex4).
% 93.91/93.29  b1011000(bitIndex3).
% 93.91/93.29  -b1011000(bitIndex2).
% 93.91/93.29  -b1011000(bitIndex1).
% 93.91/93.29  -b1011000(bitIndex0).
% 93.91/93.29  all VarCurr (v2809(VarCurr)<->v2810(VarCurr)|v2812(VarCurr)).
% 93.91/93.29  all VarCurr (v2812(VarCurr)<->v2813(VarCurr)|v2814(VarCurr)).
% 93.91/93.29  all VarCurr (v2814(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1100000(bitIndex6).
% 93.91/93.29  b1100000(bitIndex5).
% 93.91/93.29  -b1100000(bitIndex4).
% 93.91/93.29  -b1100000(bitIndex3).
% 93.91/93.29  -b1100000(bitIndex2).
% 93.91/93.29  -b1100000(bitIndex1).
% 93.91/93.29  -b1100000(bitIndex0).
% 93.91/93.29  all VarCurr (v2813(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$T)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.29  b1000000(bitIndex6).
% 93.91/93.29  -b1000000(bitIndex5).
% 93.91/93.29  -b1000000(bitIndex4).
% 93.91/93.29  -b1000000(bitIndex3).
% 93.91/93.29  -b1000000(bitIndex2).
% 93.91/93.29  -b1000000(bitIndex1).
% 93.91/93.29  -b1000000(bitIndex0).
% 93.91/93.29  all VarCurr (v2810(VarCurr)<->v2811(VarCurr)&v170(VarCurr)).
% 93.91/93.29  all VarCurr (-v2811(VarCurr)<->v145(VarCurr,bitIndex0)).
% 93.91/93.29  all VarCurr (v2803(VarCurr)<->v2424(VarCurr)).
% 93.91/93.29  all VarCurr B (range_2_1(B)-> (v145(VarCurr,B)<->v147(VarCurr,B))).
% 93.91/93.29  all B (range_2_1(B)<->bitIndex1=B|bitIndex2=B).
% 93.91/93.29  all VarCurr ((v147(VarCurr,bitIndex2)<->v149(VarCurr,bitIndex14))& (v147(VarCurr,bitIndex1)<->v149(VarCurr,bitIndex13))).
% 93.91/93.29  all VarCurr B (range_14_13(B)-> (v149(VarCurr,B)<->v151(VarCurr,B))).
% 93.91/93.29  all VarCurr B (range_14_13(B)-> (v151(VarCurr,B)<->v156(VarCurr,B))).
% 93.91/93.29  all B (range_14_13(B)<->bitIndex13=B|bitIndex14=B).
% 93.91/93.29  all VarCurr (v2787(VarCurr)<->v2789(VarCurr)).
% 93.91/93.29  all VarCurr (v2789(VarCurr)<->v2791(VarCurr)).
% 93.91/93.29  all VarCurr (v2791(VarCurr)<->v2412(VarCurr)).
% 93.91/93.29  all VarCurr (v2783(VarCurr)<->v125(VarCurr)).
% 93.91/93.29  all VarCurr (v2294(VarCurr)<->v2296(VarCurr)).
% 93.91/93.29  all VarCurr (v2296(VarCurr)<->v2298(VarCurr)).
% 93.91/93.29  all VarCurr (v2298(VarCurr)<->v2300(VarCurr)).
% 93.91/93.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2756(VarNext)-> (v2300(VarNext)<->v2300(VarCurr)))).
% 93.91/93.30  all VarNext (v2756(VarNext)-> (v2300(VarNext)<->v2764(VarNext))).
% 93.91/93.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2764(VarNext)<->v2762(VarCurr))).
% 93.91/93.30  all VarCurr (-v2765(VarCurr)-> (v2762(VarCurr)<->v2766(VarCurr))).
% 93.91/93.30  all VarCurr (v2765(VarCurr)-> (v2762(VarCurr)<->$F)).
% 93.91/93.30  all VarCurr (v2766(VarCurr)<->v2767(VarCurr)|v2741(VarCurr)).
% 93.91/93.30  all VarCurr (v2767(VarCurr)<->v2768(VarCurr)|v2302(VarCurr,bitIndex12)).
% 93.91/93.30  all VarCurr (v2768(VarCurr)<->v2769(VarCurr)|v2418(VarCurr)).
% 93.91/93.30  all VarCurr (v2769(VarCurr)<->v2770(VarCurr)|v2412(VarCurr)).
% 93.91/93.30  all VarCurr (v2770(VarCurr)<->v2771(VarCurr)|v2302(VarCurr,bitIndex9)).
% 93.91/93.30  all VarCurr (v2771(VarCurr)<->v2302(VarCurr,bitIndex3)|v2302(VarCurr,bitIndex6)).
% 93.91/93.30  all VarCurr (-v2765(VarCurr)<->v123(VarCurr)).
% 93.91/93.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2756(VarNext)<->v2757(VarNext))).
% 93.91/93.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2757(VarNext)<->v2758(VarNext)&v193(VarNext))).
% 93.91/93.30  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2758(VarNext)<->v204(VarNext))).
% 93.91/93.30  all VarCurr (v2741(VarCurr)<->v2752(VarCurr)&v2753(VarCurr)).
% 93.91/93.30  all VarCurr (-v2753(VarCurr)<->v2322(VarCurr)).
% 93.91/93.30  all VarCurr (v2752(VarCurr)<->v2304(VarCurr)&v2743(VarCurr)).
% 93.91/93.30  all VarCurr (v2743(VarCurr)<->v2745(VarCurr)).
% 93.91/93.30  all VarCurr (v2745(VarCurr)<->v2747(VarCurr)).
% 93.91/93.30  all VarCurr (-v2750(VarCurr)-> (v2747(VarCurr)<->$F)).
% 93.91/93.30  all VarCurr (v2750(VarCurr)-> (v2747(VarCurr)<->$T)).
% 93.91/93.30  all VarCurr (v2750(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$T)& (v158(VarCurr,bitIndex0)<->$F)).
% 93.91/93.30  -b0001010(bitIndex6).
% 93.91/93.30  -b0001010(bitIndex5).
% 93.91/93.30  -b0001010(bitIndex4).
% 93.91/93.30  b0001010(bitIndex3).
% 93.91/93.30  -b0001010(bitIndex2).
% 93.91/93.30  b0001010(bitIndex1).
% 93.91/93.30  -b0001010(bitIndex0).
% 93.91/93.30  all VarCurr (-v2737(VarCurr)-> (v2302(VarCurr,bitIndex12)<->$F)).
% 93.91/93.30  all VarCurr (v2737(VarCurr)-> (v2302(VarCurr,bitIndex12)<->$T)).
% 93.91/93.30  all VarCurr (v2737(VarCurr)<->v2738(VarCurr)&v2739(VarCurr)).
% 93.91/93.30  all VarCurr (v2739(VarCurr)<-> ($T<->v2397(VarCurr,bitIndex11))).
% 93.91/93.30  all VarCurr (v2738(VarCurr)<->v2365(VarCurr)&v2304(VarCurr)).
% 93.91/93.30  all VarCurr (v2418(VarCurr)<->v2420(VarCurr)|v2732(VarCurr)).
% 93.91/93.30  all VarCurr (v2732(VarCurr)<->v2734(VarCurr)&v2426(VarCurr)).
% 93.91/93.30  all VarCurr (-v2734(VarCurr)<->v2422(VarCurr)).
% 93.91/93.30  all VarCurr (v2420(VarCurr)<->v2730(VarCurr)&v2441(VarCurr)).
% 93.91/93.30  all VarCurr (v2730(VarCurr)<->v2422(VarCurr)&v2426(VarCurr)).
% 93.91/93.30  all VarCurr (v2441(VarCurr)<->v2443(VarCurr)).
% 93.91/93.30  all VarCurr (v2443(VarCurr)<->v2445(VarCurr)).
% 93.91/93.30  all VarCurr (v2445(VarCurr)<->v2722(VarCurr)&v2447(VarCurr,bitIndex8)).
% 93.91/93.30  all VarCurr (v2722(VarCurr)<->v2723(VarCurr)&v2447(VarCurr,bitIndex7)).
% 93.91/93.30  all VarCurr (v2723(VarCurr)<->v2724(VarCurr)&v2447(VarCurr,bitIndex6)).
% 93.91/93.30  all VarCurr (v2724(VarCurr)<->v2725(VarCurr)&v2447(VarCurr,bitIndex5)).
% 93.91/93.30  all VarCurr (v2725(VarCurr)<->v2726(VarCurr)&v2447(VarCurr,bitIndex4)).
% 93.91/93.30  all VarCurr (v2726(VarCurr)<->v2727(VarCurr)&v2447(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2727(VarCurr)<->v2728(VarCurr)&v2447(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2728(VarCurr)<->v2447(VarCurr,bitIndex0)&v2447(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr ((v2447(VarCurr,bitIndex8)<->v2655(VarCurr,bitIndex16))& (v2447(VarCurr,bitIndex7)<->v2655(VarCurr,bitIndex15))& (v2447(VarCurr,bitIndex6)<->v2655(VarCurr,bitIndex14))& (v2447(VarCurr,bitIndex5)<->v2655(VarCurr,bitIndex13))& (v2447(VarCurr,bitIndex4)<->v2655(VarCurr,bitIndex12))& (v2447(VarCurr,bitIndex3)<->v2655(VarCurr,bitIndex11))& (v2447(VarCurr,bitIndex2)<->v2655(VarCurr,bitIndex10))& (v2447(VarCurr,bitIndex1)<->v2655(VarCurr,bitIndex9))& (v2447(VarCurr,bitIndex0)<->v2655(VarCurr,bitIndex8))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2655(VarCurr,B)<->v2657(VarCurr,B)|v2717(VarCurr,B))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2717(VarCurr,B)<->v2718(VarCurr,B)&v2719(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex0)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex1)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex2)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex3)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex4)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex5)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex6)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex7)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex8)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex9)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex10)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex11)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex12)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex13)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex14)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex15)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2719(VarCurr,bitIndex16)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr B (range_7_0(B)-> (v2718(VarCurr,B)<->$F)).
% 93.91/93.30  all VarCurr ((v2718(VarCurr,bitIndex16)<->v2658(VarCurr,bitIndex8))& (v2718(VarCurr,bitIndex15)<->v2658(VarCurr,bitIndex7))& (v2718(VarCurr,bitIndex14)<->v2658(VarCurr,bitIndex6))& (v2718(VarCurr,bitIndex13)<->v2658(VarCurr,bitIndex5))& (v2718(VarCurr,bitIndex12)<->v2658(VarCurr,bitIndex4))& (v2718(VarCurr,bitIndex11)<->v2658(VarCurr,bitIndex3))& (v2718(VarCurr,bitIndex10)<->v2658(VarCurr,bitIndex2))& (v2718(VarCurr,bitIndex9)<->v2658(VarCurr,bitIndex1))& (v2718(VarCurr,bitIndex8)<->v2658(VarCurr,bitIndex0))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2657(VarCurr,B)<->v2658(VarCurr,B)&v2715(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex0)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex1)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex2)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex3)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex4)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex5)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex6)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex7)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex8)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex9)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex10)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex11)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex12)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex13)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex14)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex15)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (v2715(VarCurr,bitIndex16)<->v2716(VarCurr)).
% 93.91/93.30  all VarCurr (-v2716(VarCurr)<->v2667(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2658(VarCurr,B)<->v2659(VarCurr,B)|v2712(VarCurr,B))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2712(VarCurr,B)<->v2713(VarCurr,B)&v2714(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex0)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex1)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex2)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex3)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex4)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex5)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex6)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex7)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex8)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex9)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex10)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex11)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex12)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex13)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex14)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex15)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2714(VarCurr,bitIndex16)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr B (range_3_0(B)-> (v2713(VarCurr,B)<->$F)).
% 93.91/93.30  all VarCurr ((v2713(VarCurr,bitIndex16)<->v2660(VarCurr,bitIndex12))& (v2713(VarCurr,bitIndex15)<->v2660(VarCurr,bitIndex11))& (v2713(VarCurr,bitIndex14)<->v2660(VarCurr,bitIndex10))& (v2713(VarCurr,bitIndex13)<->v2660(VarCurr,bitIndex9))& (v2713(VarCurr,bitIndex12)<->v2660(VarCurr,bitIndex8))& (v2713(VarCurr,bitIndex11)<->v2660(VarCurr,bitIndex7))& (v2713(VarCurr,bitIndex10)<->v2660(VarCurr,bitIndex6))& (v2713(VarCurr,bitIndex9)<->v2660(VarCurr,bitIndex5))& (v2713(VarCurr,bitIndex8)<->v2660(VarCurr,bitIndex4))& (v2713(VarCurr,bitIndex7)<->v2660(VarCurr,bitIndex3))& (v2713(VarCurr,bitIndex6)<->v2660(VarCurr,bitIndex2))& (v2713(VarCurr,bitIndex5)<->v2660(VarCurr,bitIndex1))& (v2713(VarCurr,bitIndex4)<->v2660(VarCurr,bitIndex0))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2659(VarCurr,B)<->v2660(VarCurr,B)&v2710(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex0)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex1)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex2)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex3)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex4)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex5)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex6)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex7)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex8)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex9)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex10)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex11)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex12)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex13)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex14)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex15)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (v2710(VarCurr,bitIndex16)<->v2711(VarCurr)).
% 93.91/93.30  all VarCurr (-v2711(VarCurr)<->v2667(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2660(VarCurr,B)<->v2661(VarCurr,B)|v2707(VarCurr,B))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2707(VarCurr,B)<->v2708(VarCurr,B)&v2709(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex0)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex1)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex2)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex3)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex4)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex5)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex6)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex7)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex8)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex9)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex10)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex11)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex12)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex13)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex14)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex15)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2709(VarCurr,bitIndex16)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr B (range_1_0(B)-> (v2708(VarCurr,B)<->$F)).
% 93.91/93.30  all VarCurr ((v2708(VarCurr,bitIndex16)<->v2662(VarCurr,bitIndex14))& (v2708(VarCurr,bitIndex15)<->v2662(VarCurr,bitIndex13))& (v2708(VarCurr,bitIndex14)<->v2662(VarCurr,bitIndex12))& (v2708(VarCurr,bitIndex13)<->v2662(VarCurr,bitIndex11))& (v2708(VarCurr,bitIndex12)<->v2662(VarCurr,bitIndex10))& (v2708(VarCurr,bitIndex11)<->v2662(VarCurr,bitIndex9))& (v2708(VarCurr,bitIndex10)<->v2662(VarCurr,bitIndex8))& (v2708(VarCurr,bitIndex9)<->v2662(VarCurr,bitIndex7))& (v2708(VarCurr,bitIndex8)<->v2662(VarCurr,bitIndex6))& (v2708(VarCurr,bitIndex7)<->v2662(VarCurr,bitIndex5))& (v2708(VarCurr,bitIndex6)<->v2662(VarCurr,bitIndex4))& (v2708(VarCurr,bitIndex5)<->v2662(VarCurr,bitIndex3))& (v2708(VarCurr,bitIndex4)<->v2662(VarCurr,bitIndex2))& (v2708(VarCurr,bitIndex3)<->v2662(VarCurr,bitIndex1))& (v2708(VarCurr,bitIndex2)<->v2662(VarCurr,bitIndex0))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2661(VarCurr,B)<->v2662(VarCurr,B)&v2705(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex0)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex1)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex2)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex3)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex4)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex5)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex6)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex7)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex8)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex9)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex10)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex11)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex12)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex13)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex14)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex15)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (v2705(VarCurr,bitIndex16)<->v2706(VarCurr)).
% 93.91/93.30  all VarCurr (-v2706(VarCurr)<->v2667(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2662(VarCurr,B)<->v2663(VarCurr,B)|v2702(VarCurr,B))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2702(VarCurr,B)<->v2703(VarCurr,B)&v2704(VarCurr,B))).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex0)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex1)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex2)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex3)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex4)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex5)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex6)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex7)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex8)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex9)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex10)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex11)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex12)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex13)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex14)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex15)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2704(VarCurr,bitIndex16)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2703(VarCurr,bitIndex0)<->$F).
% 93.91/93.30  all VarCurr ((v2703(VarCurr,bitIndex16)<->v2664(VarCurr,bitIndex15))& (v2703(VarCurr,bitIndex15)<->v2664(VarCurr,bitIndex14))& (v2703(VarCurr,bitIndex14)<->v2664(VarCurr,bitIndex13))& (v2703(VarCurr,bitIndex13)<->v2664(VarCurr,bitIndex12))& (v2703(VarCurr,bitIndex12)<->v2664(VarCurr,bitIndex11))& (v2703(VarCurr,bitIndex11)<->v2664(VarCurr,bitIndex10))& (v2703(VarCurr,bitIndex10)<->v2664(VarCurr,bitIndex9))& (v2703(VarCurr,bitIndex9)<->v2664(VarCurr,bitIndex8))& (v2703(VarCurr,bitIndex8)<->v2664(VarCurr,bitIndex7))& (v2703(VarCurr,bitIndex7)<->v2664(VarCurr,bitIndex6))& (v2703(VarCurr,bitIndex6)<->v2664(VarCurr,bitIndex5))& (v2703(VarCurr,bitIndex5)<->v2664(VarCurr,bitIndex4))& (v2703(VarCurr,bitIndex4)<->v2664(VarCurr,bitIndex3))& (v2703(VarCurr,bitIndex3)<->v2664(VarCurr,bitIndex2))& (v2703(VarCurr,bitIndex2)<->v2664(VarCurr,bitIndex1))& (v2703(VarCurr,bitIndex1)<->v2664(VarCurr,bitIndex0))).
% 93.91/93.30  all VarCurr B (range_16_0(B)-> (v2663(VarCurr,B)<->v2664(VarCurr,B)&v2665(VarCurr,B))).
% 93.91/93.30  all B (range_16_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex0)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex1)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex2)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex3)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex4)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex5)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex6)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex7)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex8)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex9)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex10)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex11)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex12)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex13)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex14)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex15)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (v2665(VarCurr,bitIndex16)<->v2666(VarCurr)).
% 93.91/93.30  all VarCurr (-v2666(VarCurr)<->v2667(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr (v2667(VarCurr,bitIndex0)<->v2676(VarCurr)).
% 93.91/93.30  all VarCurr (v2667(VarCurr,bitIndex1)<->v2698(VarCurr)).
% 93.91/93.30  all VarCurr (v2667(VarCurr,bitIndex2)<->v2693(VarCurr)).
% 93.91/93.30  all VarCurr (v2667(VarCurr,bitIndex3)<->v2669(VarCurr)).
% 93.91/93.30  all VarCurr (v2698(VarCurr)<->v2699(VarCurr)&v2701(VarCurr)).
% 93.91/93.30  all VarCurr (v2701(VarCurr)<->v2652(VarCurr,bitIndex0)|v2688(VarCurr)).
% 93.91/93.30  all VarCurr (v2699(VarCurr)<->v2676(VarCurr)|v2700(VarCurr)).
% 93.91/93.30  all VarCurr (-v2700(VarCurr)<->v2688(VarCurr)).
% 93.91/93.30  all VarCurr (v2693(VarCurr)<->v2694(VarCurr)&v2697(VarCurr)).
% 93.91/93.30  all VarCurr (v2697(VarCurr)<->v2683(VarCurr)|v2687(VarCurr)).
% 93.91/93.30  all VarCurr (v2694(VarCurr)<->v2695(VarCurr)|v2696(VarCurr)).
% 93.91/93.30  all VarCurr (-v2696(VarCurr)<->v2687(VarCurr)).
% 93.91/93.30  all VarCurr (-v2695(VarCurr)<->v2683(VarCurr)).
% 93.91/93.30  all VarCurr (v2669(VarCurr)<->v2670(VarCurr)&v2691(VarCurr)).
% 93.91/93.30  all VarCurr (v2691(VarCurr)<->v2692(VarCurr)|v2682(VarCurr)).
% 93.91/93.30  all VarCurr (-v2692(VarCurr)<->v2671(VarCurr)).
% 93.91/93.30  all VarCurr (v2670(VarCurr)<->v2671(VarCurr)|v2681(VarCurr)).
% 93.91/93.30  all VarCurr (-v2681(VarCurr)<->v2682(VarCurr)).
% 93.91/93.30  all VarCurr (v2682(VarCurr)<->v2683(VarCurr)&v2687(VarCurr)).
% 93.91/93.30  all VarCurr (v2687(VarCurr)<->v2652(VarCurr,bitIndex0)&v2688(VarCurr)).
% 93.91/93.30  all VarCurr (v2688(VarCurr)<->v2689(VarCurr)&v2690(VarCurr)).
% 93.91/93.30  all VarCurr (v2690(VarCurr)<->v2676(VarCurr)|v2677(VarCurr)).
% 93.91/93.30  all VarCurr (v2689(VarCurr)<->v2652(VarCurr,bitIndex0)|v2652(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (v2683(VarCurr)<->v2684(VarCurr)&v2686(VarCurr)).
% 93.91/93.30  all VarCurr (v2686(VarCurr)<->v2675(VarCurr)|v2678(VarCurr)).
% 93.91/93.30  all VarCurr (v2684(VarCurr)<->v2685(VarCurr)|v2652(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (-v2685(VarCurr)<->v2675(VarCurr)).
% 93.91/93.30  all VarCurr (v2671(VarCurr)<->v2672(VarCurr)&v2679(VarCurr)).
% 93.91/93.30  all VarCurr (v2679(VarCurr)<->v2674(VarCurr)|v2680(VarCurr)).
% 93.91/93.30  all VarCurr (-v2680(VarCurr)<->v2652(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (v2672(VarCurr)<->v2673(VarCurr)|v2652(VarCurr,bitIndex3)).
% 93.91/93.30  all VarCurr (-v2673(VarCurr)<->v2674(VarCurr)).
% 93.91/93.30  all VarCurr (v2674(VarCurr)<->v2675(VarCurr)&v2678(VarCurr)).
% 93.91/93.30  all VarCurr (-v2678(VarCurr)<->v2652(VarCurr,bitIndex2)).
% 93.91/93.30  all VarCurr (v2675(VarCurr)<->v2676(VarCurr)&v2677(VarCurr)).
% 93.91/93.30  all VarCurr (-v2677(VarCurr)<->v2652(VarCurr,bitIndex1)).
% 93.91/93.30  all VarCurr (-v2676(VarCurr)<->v2652(VarCurr,bitIndex0)).
% 93.91/93.30  all VarCurr B (range_7_0(B)-> (v2664(VarCurr,B)<->$T)).
% 93.91/93.30  all B (range_7_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B).
% 93.91/93.30  b11111111(bitIndex7).
% 93.91/93.30  b11111111(bitIndex6).
% 93.91/93.30  b11111111(bitIndex5).
% 93.91/93.30  b11111111(bitIndex4).
% 93.91/93.30  b11111111(bitIndex3).
% 93.91/93.30  b11111111(bitIndex2).
% 93.91/93.30  b11111111(bitIndex1).
% 93.91/93.30  b11111111(bitIndex0).
% 93.91/93.30  all VarCurr ((v2664(VarCurr,bitIndex16)<->v2449(VarCurr,bitIndex8))& (v2664(VarCurr,bitIndex15)<->v2449(VarCurr,bitIndex7))& (v2664(VarCurr,bitIndex14)<->v2449(VarCurr,bitIndex6))& (v2664(VarCurr,bitIndex13)<->v2449(VarCurr,bitIndex5))& (v2664(VarCurr,bitIndex12)<->v2449(VarCurr,bitIndex4))& (v2664(VarCurr,bitIndex11)<->v2449(VarCurr,bitIndex3))& (v2664(VarCurr,bitIndex10)<->v2449(VarCurr,bitIndex2))& (v2664(VarCurr,bitIndex9)<->v2449(VarCurr,bitIndex1))& (v2664(VarCurr,bitIndex8)<->v2449(VarCurr,bitIndex0))).
% 93.91/93.30  all VarCurr B (range_3_0(B)-> (v2652(VarCurr,B)<->v2654(VarCurr,B))).
% 93.91/93.30  all VarCurr ((v2654(VarCurr,bitIndex3)<->v149(VarCurr,bitIndex8))& (v2654(VarCurr,bitIndex2)<->v149(VarCurr,bitIndex7))& (v2654(VarCurr,bitIndex1)<->v149(VarCurr,bitIndex6))& (v2654(VarCurr,bitIndex0)<->v149(VarCurr,bitIndex5))).
% 93.91/93.30  all VarCurr B (range_8_5(B)-> (v149(VarCurr,B)<->v151(VarCurr,B))).
% 93.91/93.30  all VarCurr B (range_8_5(B)-> (v151(VarCurr,B)<->v156(VarCurr,B))).
% 93.91/93.31  all B (range_8_5(B)<->bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex8)<->v2454(VarCurr,bitIndex8)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex7)<->v2454(VarCurr,bitIndex7)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex6)<->v2454(VarCurr,bitIndex6)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex5)<->v2454(VarCurr,bitIndex5)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex4)<->v2454(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex3)<->v2454(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex2)<->v2454(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex1)<->v2454(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2449(VarCurr,bitIndex0)<->v2454(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr B (range_39_0(B)-> (v2454(VarCurr,B)<->v2456(VarCurr,B)|v2556(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_39_0(B)-> (v2556(VarCurr,B)<->v2557(VarCurr,B)&v2649(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex38)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2649(VarCurr,bitIndex39)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr B (range_23_0(B)-> (v2557(VarCurr,B)<->v2559(VarCurr,B))).
% 93.91/93.31  all VarCurr ((v2557(VarCurr,bitIndex39)<->$F)& (v2557(VarCurr,bitIndex38)<->$F)& (v2557(VarCurr,bitIndex37)<->$F)& (v2557(VarCurr,bitIndex36)<->$F)& (v2557(VarCurr,bitIndex35)<->$F)& (v2557(VarCurr,bitIndex34)<->$F)& (v2557(VarCurr,bitIndex33)<->$F)& (v2557(VarCurr,bitIndex32)<->$F)& (v2557(VarCurr,bitIndex31)<->$F)& (v2557(VarCurr,bitIndex30)<->$F)& (v2557(VarCurr,bitIndex29)<->$F)& (v2557(VarCurr,bitIndex28)<->$F)& (v2557(VarCurr,bitIndex27)<->$F)& (v2557(VarCurr,bitIndex26)<->$F)& (v2557(VarCurr,bitIndex25)<->$F)& (v2557(VarCurr,bitIndex24)<->$F)).
% 93.91/93.31  -b0000000000000000(bitIndex15).
% 93.91/93.31  -b0000000000000000(bitIndex14).
% 93.91/93.31  -b0000000000000000(bitIndex13).
% 93.91/93.31  -b0000000000000000(bitIndex12).
% 93.91/93.31  -b0000000000000000(bitIndex11).
% 93.91/93.31  -b0000000000000000(bitIndex10).
% 93.91/93.31  -b0000000000000000(bitIndex9).
% 93.91/93.31  -b0000000000000000(bitIndex8).
% 93.91/93.31  -b0000000000000000(bitIndex7).
% 93.91/93.31  -b0000000000000000(bitIndex6).
% 93.91/93.31  -b0000000000000000(bitIndex5).
% 93.91/93.31  -b0000000000000000(bitIndex4).
% 93.91/93.31  -b0000000000000000(bitIndex3).
% 93.91/93.31  -b0000000000000000(bitIndex2).
% 93.91/93.31  -b0000000000000000(bitIndex1).
% 93.91/93.31  -b0000000000000000(bitIndex0).
% 93.91/93.31  all VarCurr B (range_23_0(B)-> (v2559(VarCurr,B)<->v2560(VarCurr,B)|v2604(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_23_0(B)-> (v2604(VarCurr,B)<->v2605(VarCurr,B)&v2648(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2648(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2605(VarCurr,B)<->v2606(VarCurr,B))).
% 93.91/93.31  all VarCurr ((v2605(VarCurr,bitIndex23)<->$F)& (v2605(VarCurr,bitIndex22)<->$F)& (v2605(VarCurr,bitIndex21)<->$F)& (v2605(VarCurr,bitIndex20)<->$F)& (v2605(VarCurr,bitIndex19)<->$F)& (v2605(VarCurr,bitIndex18)<->$F)& (v2605(VarCurr,bitIndex17)<->$F)& (v2605(VarCurr,bitIndex16)<->$F)).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2606(VarCurr,B)<->v2607(VarCurr,B)|v2627(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2627(VarCurr,B)<->v2628(VarCurr,B)&v2647(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2647(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2628(VarCurr,B)<->v2629(VarCurr,B))).
% 93.91/93.31  all VarCurr ((v2628(VarCurr,bitIndex15)<->$F)& (v2628(VarCurr,bitIndex14)<->$F)& (v2628(VarCurr,bitIndex13)<->$F)& (v2628(VarCurr,bitIndex12)<->$F)).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2629(VarCurr,B)<->v2630(VarCurr,B)|v2638(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2638(VarCurr,B)<->v2639(VarCurr,B)&v2646(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2646(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr B (range_9_0(B)-> (v2639(VarCurr,B)<->v2640(VarCurr,B))).
% 93.91/93.31  all VarCurr ((v2639(VarCurr,bitIndex11)<->$F)& (v2639(VarCurr,bitIndex10)<->$F)).
% 93.91/93.31  all VarCurr B (range_9_0(B)-> (v2640(VarCurr,B)<->v2641(VarCurr,B)|v2643(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_9_0(B)-> (v2643(VarCurr,B)<->v2644(VarCurr,B)&v2645(VarCurr,B))).
% 93.91/93.31  all B (range_9_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2645(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr ((v2644(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex39))& (v2644(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex38))& (v2644(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex37))& (v2644(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex36))& (v2644(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex35))& (v2644(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex34))& (v2644(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex33))& (v2644(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex32))& (v2644(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex31))).
% 93.91/93.31  all VarCurr (v2644(VarCurr,bitIndex9)<->$F).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex30)&v2642(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex31)&v2642(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex32)&v2642(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex33)&v2642(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex34)&v2642(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex35)&v2642(VarCurr,bitIndex5)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex36)&v2642(VarCurr,bitIndex6)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex37)&v2642(VarCurr,bitIndex7)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex38)&v2642(VarCurr,bitIndex8)).
% 93.91/93.31  all VarCurr (v2641(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex39)&v2642(VarCurr,bitIndex9)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2642(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2630(VarCurr,B)<->v2631(VarCurr,B)&v2637(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr (v2637(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2631(VarCurr,B)<->v2632(VarCurr,B)|v2634(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_11_0(B)-> (v2634(VarCurr,B)<->v2635(VarCurr,B)&v2636(VarCurr,B))).
% 93.91/93.31  all B (range_11_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2636(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr ((v2635(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex39))& (v2635(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex38))& (v2635(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex37))& (v2635(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex36))& (v2635(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex35))& (v2635(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex34))& (v2635(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex33))& (v2635(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex32))& (v2635(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex31))& (v2635(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex30))& (v2635(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex29))).
% 93.91/93.31  all VarCurr (v2635(VarCurr,bitIndex11)<->$F).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex28)&v2633(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex29)&v2633(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex30)&v2633(VarCurr,bitIndex2)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex31)&v2633(VarCurr,bitIndex3)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex32)&v2633(VarCurr,bitIndex4)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex33)&v2633(VarCurr,bitIndex5)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex34)&v2633(VarCurr,bitIndex6)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex35)&v2633(VarCurr,bitIndex7)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex36)&v2633(VarCurr,bitIndex8)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex37)&v2633(VarCurr,bitIndex9)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex38)&v2633(VarCurr,bitIndex10)).
% 93.91/93.31  all VarCurr (v2632(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex39)&v2633(VarCurr,bitIndex11)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr (v2633(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2607(VarCurr,B)<->v2608(VarCurr,B)&v2626(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex0)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex1)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex2)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex3)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex4)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex5)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex6)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex7)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex8)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex9)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex10)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex11)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex12)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex13)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex14)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr (v2626(VarCurr,bitIndex15)<->v2484(VarCurr)).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2608(VarCurr,B)<->v2609(VarCurr,B)|v2617(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_15_0(B)-> (v2617(VarCurr,B)<->v2618(VarCurr,B)&v2625(VarCurr,B))).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr (v2625(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.31  all VarCurr B (range_13_0(B)-> (v2618(VarCurr,B)<->v2619(VarCurr,B))).
% 93.91/93.31  all VarCurr ((v2618(VarCurr,bitIndex15)<->$F)& (v2618(VarCurr,bitIndex14)<->$F)).
% 93.91/93.31  all VarCurr B (range_13_0(B)-> (v2619(VarCurr,B)<->v2620(VarCurr,B)|v2622(VarCurr,B))).
% 93.91/93.31  all VarCurr B (range_13_0(B)-> (v2622(VarCurr,B)<->v2623(VarCurr,B)&v2624(VarCurr,B))).
% 93.91/93.31  all B (range_13_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr (v2624(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.31  all VarCurr ((v2623(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex39))& (v2623(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex38))& (v2623(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex37))& (v2623(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex36))& (v2623(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex35))& (v2623(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex34))& (v2623(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex33))& (v2623(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex32))& (v2623(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex31))& (v2623(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex30))& (v2623(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex29))& (v2623(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex28))& (v2623(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex27))).
% 93.91/93.31  all VarCurr (v2623(VarCurr,bitIndex13)<->$F).
% 93.91/93.31  all VarCurr (v2620(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex26)&v2621(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex27)&v2621(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex28)&v2621(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex29)&v2621(VarCurr,bitIndex3)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex30)&v2621(VarCurr,bitIndex4)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex31)&v2621(VarCurr,bitIndex5)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex32)&v2621(VarCurr,bitIndex6)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex33)&v2621(VarCurr,bitIndex7)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex34)&v2621(VarCurr,bitIndex8)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex35)&v2621(VarCurr,bitIndex9)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex36)&v2621(VarCurr,bitIndex10)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex37)&v2621(VarCurr,bitIndex11)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex38)&v2621(VarCurr,bitIndex12)).
% 93.91/93.32  all VarCurr (v2620(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex39)&v2621(VarCurr,bitIndex13)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2621(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr B (range_15_0(B)-> (v2609(VarCurr,B)<->v2610(VarCurr,B)&v2616(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2616(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr B (range_15_0(B)-> (v2610(VarCurr,B)<->v2611(VarCurr,B)|v2613(VarCurr,B))).
% 93.91/93.32  all VarCurr B (range_15_0(B)-> (v2613(VarCurr,B)<->v2614(VarCurr,B)&v2615(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2615(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr ((v2614(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex39))& (v2614(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex38))& (v2614(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex37))& (v2614(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex36))& (v2614(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex35))& (v2614(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex34))& (v2614(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex33))& (v2614(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex32))& (v2614(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex31))& (v2614(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex30))& (v2614(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex29))& (v2614(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex28))& (v2614(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex27))& (v2614(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex26))& (v2614(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex25))).
% 93.91/93.32  all VarCurr (v2614(VarCurr,bitIndex15)<->$F).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex24)&v2612(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex25)&v2612(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex26)&v2612(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex27)&v2612(VarCurr,bitIndex3)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex28)&v2612(VarCurr,bitIndex4)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex29)&v2612(VarCurr,bitIndex5)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex30)&v2612(VarCurr,bitIndex6)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex31)&v2612(VarCurr,bitIndex7)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex32)&v2612(VarCurr,bitIndex8)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex33)&v2612(VarCurr,bitIndex9)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex34)&v2612(VarCurr,bitIndex10)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex35)&v2612(VarCurr,bitIndex11)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex36)&v2612(VarCurr,bitIndex12)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex37)&v2612(VarCurr,bitIndex13)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex38)&v2612(VarCurr,bitIndex14)).
% 93.91/93.32  all VarCurr (v2611(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex39)&v2612(VarCurr,bitIndex15)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2612(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr B (range_23_0(B)-> (v2560(VarCurr,B)<->v2561(VarCurr,B)&v2603(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex0)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex1)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex2)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex3)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex4)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex5)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex6)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex7)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex8)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex9)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex10)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex11)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex12)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex13)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex14)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex15)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex16)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex17)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex18)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex19)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex20)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex21)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex22)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr (v2603(VarCurr,bitIndex23)<->v2507(VarCurr)).
% 93.91/93.32  all VarCurr B (range_23_0(B)-> (v2561(VarCurr,B)<->v2562(VarCurr,B)|v2582(VarCurr,B))).
% 93.91/93.32  all VarCurr B (range_23_0(B)-> (v2582(VarCurr,B)<->v2583(VarCurr,B)&v2602(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2602(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2583(VarCurr,B)<->v2584(VarCurr,B))).
% 93.91/93.32  all VarCurr ((v2583(VarCurr,bitIndex23)<->$F)& (v2583(VarCurr,bitIndex22)<->$F)& (v2583(VarCurr,bitIndex21)<->$F)& (v2583(VarCurr,bitIndex20)<->$F)).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2584(VarCurr,B)<->v2585(VarCurr,B)|v2593(VarCurr,B))).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2593(VarCurr,B)<->v2594(VarCurr,B)&v2601(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2601(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr B (range_17_0(B)-> (v2594(VarCurr,B)<->v2595(VarCurr,B))).
% 93.91/93.32  all VarCurr ((v2594(VarCurr,bitIndex19)<->$F)& (v2594(VarCurr,bitIndex18)<->$F)).
% 93.91/93.32  all VarCurr B (range_17_0(B)-> (v2595(VarCurr,B)<->v2596(VarCurr,B)|v2598(VarCurr,B))).
% 93.91/93.32  all VarCurr B (range_17_0(B)-> (v2598(VarCurr,B)<->v2599(VarCurr,B)&v2600(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2600(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr ((v2599(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex39))& (v2599(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex38))& (v2599(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex37))& (v2599(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex36))& (v2599(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex35))& (v2599(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex34))& (v2599(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex33))& (v2599(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex32))& (v2599(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex31))& (v2599(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex30))& (v2599(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex29))& (v2599(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex28))& (v2599(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex27))& (v2599(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex26))& (v2599(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex25))& (v2599(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex24))& (v2599(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex23))).
% 93.91/93.32  all VarCurr (v2599(VarCurr,bitIndex17)<->$F).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex22)&v2597(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex23)&v2597(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex24)&v2597(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex25)&v2597(VarCurr,bitIndex3)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex26)&v2597(VarCurr,bitIndex4)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex27)&v2597(VarCurr,bitIndex5)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex28)&v2597(VarCurr,bitIndex6)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex29)&v2597(VarCurr,bitIndex7)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex30)&v2597(VarCurr,bitIndex8)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex31)&v2597(VarCurr,bitIndex9)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex32)&v2597(VarCurr,bitIndex10)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex33)&v2597(VarCurr,bitIndex11)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex34)&v2597(VarCurr,bitIndex12)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex35)&v2597(VarCurr,bitIndex13)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex36)&v2597(VarCurr,bitIndex14)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex37)&v2597(VarCurr,bitIndex15)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex38)&v2597(VarCurr,bitIndex16)).
% 93.91/93.32  all VarCurr (v2596(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex39)&v2597(VarCurr,bitIndex17)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr (v2597(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2585(VarCurr,B)<->v2586(VarCurr,B)&v2592(VarCurr,B))).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr (v2592(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2586(VarCurr,B)<->v2587(VarCurr,B)|v2589(VarCurr,B))).
% 93.91/93.32  all VarCurr B (range_19_0(B)-> (v2589(VarCurr,B)<->v2590(VarCurr,B)&v2591(VarCurr,B))).
% 93.91/93.32  all B (range_19_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2591(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr ((v2590(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex39))& (v2590(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex38))& (v2590(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex37))& (v2590(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex36))& (v2590(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex35))& (v2590(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex34))& (v2590(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex33))& (v2590(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex32))& (v2590(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex31))& (v2590(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex30))& (v2590(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex29))& (v2590(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex28))& (v2590(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex27))& (v2590(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex26))& (v2590(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex25))& (v2590(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex24))& (v2590(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex23))& (v2590(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex22))& (v2590(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex21))).
% 93.91/93.32  all VarCurr (v2590(VarCurr,bitIndex19)<->$F).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex20)&v2588(VarCurr,bitIndex0)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex21)&v2588(VarCurr,bitIndex1)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex22)&v2588(VarCurr,bitIndex2)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex23)&v2588(VarCurr,bitIndex3)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex24)&v2588(VarCurr,bitIndex4)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex25)&v2588(VarCurr,bitIndex5)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex26)&v2588(VarCurr,bitIndex6)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex27)&v2588(VarCurr,bitIndex7)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex28)&v2588(VarCurr,bitIndex8)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex29)&v2588(VarCurr,bitIndex9)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex30)&v2588(VarCurr,bitIndex10)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex31)&v2588(VarCurr,bitIndex11)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex32)&v2588(VarCurr,bitIndex12)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex33)&v2588(VarCurr,bitIndex13)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex34)&v2588(VarCurr,bitIndex14)).
% 93.91/93.32  all VarCurr (v2587(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex35)&v2588(VarCurr,bitIndex15)).
% 93.91/93.33  all VarCurr (v2587(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex36)&v2588(VarCurr,bitIndex16)).
% 93.91/93.33  all VarCurr (v2587(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex37)&v2588(VarCurr,bitIndex17)).
% 93.91/93.33  all VarCurr (v2587(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex38)&v2588(VarCurr,bitIndex18)).
% 93.91/93.33  all VarCurr (v2587(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex39)&v2588(VarCurr,bitIndex19)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2588(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2562(VarCurr,B)<->v2563(VarCurr,B)&v2581(VarCurr,B))).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex0)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex1)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex2)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex3)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex4)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex5)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex6)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex7)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex8)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex9)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex10)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex11)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex12)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex13)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex14)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex15)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex16)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex17)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex18)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex19)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex20)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex21)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex22)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr (v2581(VarCurr,bitIndex23)<->v2484(VarCurr)).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2563(VarCurr,B)<->v2564(VarCurr,B)|v2572(VarCurr,B))).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2572(VarCurr,B)<->v2573(VarCurr,B)&v2580(VarCurr,B))).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2580(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr B (range_21_0(B)-> (v2573(VarCurr,B)<->v2574(VarCurr,B))).
% 93.91/93.33  all VarCurr ((v2573(VarCurr,bitIndex23)<->$F)& (v2573(VarCurr,bitIndex22)<->$F)).
% 93.91/93.33  all VarCurr B (range_21_0(B)-> (v2574(VarCurr,B)<->v2575(VarCurr,B)|v2577(VarCurr,B))).
% 93.91/93.33  all VarCurr B (range_21_0(B)-> (v2577(VarCurr,B)<->v2578(VarCurr,B)&v2579(VarCurr,B))).
% 93.91/93.33  all B (range_21_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2579(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr ((v2578(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex39))& (v2578(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex38))& (v2578(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex37))& (v2578(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex36))& (v2578(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex35))& (v2578(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex34))& (v2578(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex33))& (v2578(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex32))& (v2578(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex31))& (v2578(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex30))& (v2578(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex29))& (v2578(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex28))& (v2578(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex27))& (v2578(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex26))& (v2578(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex25))& (v2578(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex24))& (v2578(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex23))& (v2578(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex22))& (v2578(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex21))& (v2578(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex20))& (v2578(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex19))).
% 93.91/93.33  all VarCurr (v2578(VarCurr,bitIndex21)<->$F).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex18)&v2576(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex19)&v2576(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex20)&v2576(VarCurr,bitIndex2)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex21)&v2576(VarCurr,bitIndex3)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex22)&v2576(VarCurr,bitIndex4)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex23)&v2576(VarCurr,bitIndex5)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex24)&v2576(VarCurr,bitIndex6)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex25)&v2576(VarCurr,bitIndex7)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex26)&v2576(VarCurr,bitIndex8)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex27)&v2576(VarCurr,bitIndex9)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex28)&v2576(VarCurr,bitIndex10)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex29)&v2576(VarCurr,bitIndex11)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex30)&v2576(VarCurr,bitIndex12)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex31)&v2576(VarCurr,bitIndex13)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex32)&v2576(VarCurr,bitIndex14)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex33)&v2576(VarCurr,bitIndex15)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex34)&v2576(VarCurr,bitIndex16)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex35)&v2576(VarCurr,bitIndex17)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex36)&v2576(VarCurr,bitIndex18)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex37)&v2576(VarCurr,bitIndex19)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex38)&v2576(VarCurr,bitIndex20)).
% 93.91/93.33  all VarCurr (v2575(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex39)&v2576(VarCurr,bitIndex21)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2576(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2564(VarCurr,B)<->v2565(VarCurr,B)&v2571(VarCurr,B))).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex20)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex21)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex22)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr (v2571(VarCurr,bitIndex23)<->v2473(VarCurr)).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2565(VarCurr,B)<->v2566(VarCurr,B)|v2568(VarCurr,B))).
% 93.91/93.33  all VarCurr B (range_23_0(B)-> (v2568(VarCurr,B)<->v2569(VarCurr,B)&v2570(VarCurr,B))).
% 93.91/93.33  all B (range_23_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2570(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr ((v2569(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex39))& (v2569(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex38))& (v2569(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex37))& (v2569(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex36))& (v2569(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex35))& (v2569(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex34))& (v2569(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex33))& (v2569(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex32))& (v2569(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex31))& (v2569(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex30))& (v2569(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex29))& (v2569(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex28))& (v2569(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex27))& (v2569(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex26))& (v2569(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex25))& (v2569(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex24))& (v2569(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex23))& (v2569(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex22))& (v2569(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex21))& (v2569(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex20))& (v2569(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex19))& (v2569(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex18))& (v2569(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex17))).
% 93.91/93.33  all VarCurr (v2569(VarCurr,bitIndex23)<->$F).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex16)&v2567(VarCurr,bitIndex0)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex17)&v2567(VarCurr,bitIndex1)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex18)&v2567(VarCurr,bitIndex2)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex19)&v2567(VarCurr,bitIndex3)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex20)&v2567(VarCurr,bitIndex4)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex21)&v2567(VarCurr,bitIndex5)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex22)&v2567(VarCurr,bitIndex6)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex23)&v2567(VarCurr,bitIndex7)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex24)&v2567(VarCurr,bitIndex8)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex25)&v2567(VarCurr,bitIndex9)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex26)&v2567(VarCurr,bitIndex10)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex27)&v2567(VarCurr,bitIndex11)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex28)&v2567(VarCurr,bitIndex12)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex29)&v2567(VarCurr,bitIndex13)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex30)&v2567(VarCurr,bitIndex14)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex31)&v2567(VarCurr,bitIndex15)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex32)&v2567(VarCurr,bitIndex16)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex33)&v2567(VarCurr,bitIndex17)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex34)&v2567(VarCurr,bitIndex18)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex35)&v2567(VarCurr,bitIndex19)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex36)&v2567(VarCurr,bitIndex20)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex37)&v2567(VarCurr,bitIndex21)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex38)&v2567(VarCurr,bitIndex22)).
% 93.91/93.33  all VarCurr (v2566(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex39)&v2567(VarCurr,bitIndex23)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr (v2567(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 93.91/93.33  all VarCurr B (range_39_0(B)-> (v2456(VarCurr,B)<->v2457(VarCurr,B)&v2554(VarCurr,B))).
% 93.91/93.33  all VarCurr (v2554(VarCurr,bitIndex0)<->v2555(VarCurr)).
% 93.91/93.33  all VarCurr (v2554(VarCurr,bitIndex1)<->v2555(VarCurr)).
% 93.91/93.33  all VarCurr (v2554(VarCurr,bitIndex2)<->v2555(VarCurr)).
% 93.91/93.33  all VarCurr (v2554(VarCurr,bitIndex3)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex4)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex5)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex6)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex7)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex8)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex9)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex10)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex11)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex12)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex13)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex14)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex15)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex16)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex17)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex18)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex19)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex20)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex21)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex22)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex23)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex24)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex25)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex26)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex27)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex28)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex29)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex30)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex31)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex32)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex33)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex34)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex35)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex36)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex37)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex38)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (v2554(VarCurr,bitIndex39)<->v2555(VarCurr)).
% 93.91/93.34  all VarCurr (-v2555(VarCurr)<->v2453(VarCurr,bitIndex4)).
% 93.91/93.34  all VarCurr B (range_39_0(B)-> (v2457(VarCurr,B)<->v2458(VarCurr,B)|v2508(VarCurr,B))).
% 93.91/93.34  all VarCurr B (range_39_0(B)-> (v2508(VarCurr,B)<->v2509(VarCurr,B)&v2553(VarCurr,B))).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex38)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2553(VarCurr,bitIndex39)<->v2453(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr B (range_31_0(B)-> (v2509(VarCurr,B)<->v2511(VarCurr,B))).
% 93.91/93.34  all VarCurr ((v2509(VarCurr,bitIndex39)<->$F)& (v2509(VarCurr,bitIndex38)<->$F)& (v2509(VarCurr,bitIndex37)<->$F)& (v2509(VarCurr,bitIndex36)<->$F)& (v2509(VarCurr,bitIndex35)<->$F)& (v2509(VarCurr,bitIndex34)<->$F)& (v2509(VarCurr,bitIndex33)<->$F)& (v2509(VarCurr,bitIndex32)<->$F)).
% 93.91/93.34  -b00000000(bitIndex7).
% 93.91/93.34  -b00000000(bitIndex6).
% 93.91/93.34  -b00000000(bitIndex5).
% 93.91/93.34  -b00000000(bitIndex4).
% 93.91/93.34  -b00000000(bitIndex3).
% 93.91/93.34  -b00000000(bitIndex2).
% 93.91/93.34  -b00000000(bitIndex1).
% 93.91/93.34  -b00000000(bitIndex0).
% 93.91/93.34  all VarCurr B (range_31_0(B)-> (v2511(VarCurr,B)<->v2512(VarCurr,B)|v2532(VarCurr,B))).
% 93.91/93.34  all VarCurr B (range_31_0(B)-> (v2532(VarCurr,B)<->v2533(VarCurr,B)&v2552(VarCurr,B))).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2552(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr B (range_27_0(B)-> (v2533(VarCurr,B)<->v2534(VarCurr,B))).
% 93.91/93.34  all VarCurr ((v2533(VarCurr,bitIndex31)<->$F)& (v2533(VarCurr,bitIndex30)<->$F)& (v2533(VarCurr,bitIndex29)<->$F)& (v2533(VarCurr,bitIndex28)<->$F)).
% 93.91/93.34  all VarCurr B (range_27_0(B)-> (v2534(VarCurr,B)<->v2535(VarCurr,B)|v2543(VarCurr,B))).
% 93.91/93.34  all VarCurr B (range_27_0(B)-> (v2543(VarCurr,B)<->v2544(VarCurr,B)&v2551(VarCurr,B))).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2551(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr B (range_25_0(B)-> (v2544(VarCurr,B)<->v2545(VarCurr,B))).
% 93.91/93.34  all VarCurr ((v2544(VarCurr,bitIndex27)<->$F)& (v2544(VarCurr,bitIndex26)<->$F)).
% 93.91/93.34  all VarCurr B (range_25_0(B)-> (v2545(VarCurr,B)<->v2546(VarCurr,B)|v2548(VarCurr,B))).
% 93.91/93.34  all VarCurr B (range_25_0(B)-> (v2548(VarCurr,B)<->v2549(VarCurr,B)&v2550(VarCurr,B))).
% 93.91/93.34  all B (range_25_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2550(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr ((v2549(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex39))& (v2549(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex38))& (v2549(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex37))& (v2549(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex36))& (v2549(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex35))& (v2549(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex34))& (v2549(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex33))& (v2549(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex32))& (v2549(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex31))& (v2549(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex30))& (v2549(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex29))& (v2549(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex28))& (v2549(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex27))& (v2549(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex26))& (v2549(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex25))& (v2549(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex24))& (v2549(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex23))& (v2549(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex22))& (v2549(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex21))& (v2549(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex20))& (v2549(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex19))& (v2549(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex18))& (v2549(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex17))& (v2549(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex16))& (v2549(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex15))).
% 93.91/93.34  all VarCurr (v2549(VarCurr,bitIndex25)<->$F).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex14)&v2547(VarCurr,bitIndex0)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex15)&v2547(VarCurr,bitIndex1)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex16)&v2547(VarCurr,bitIndex2)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex17)&v2547(VarCurr,bitIndex3)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex18)&v2547(VarCurr,bitIndex4)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex19)&v2547(VarCurr,bitIndex5)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex20)&v2547(VarCurr,bitIndex6)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex21)&v2547(VarCurr,bitIndex7)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex22)&v2547(VarCurr,bitIndex8)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex23)&v2547(VarCurr,bitIndex9)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex24)&v2547(VarCurr,bitIndex10)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex25)&v2547(VarCurr,bitIndex11)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex26)&v2547(VarCurr,bitIndex12)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex27)&v2547(VarCurr,bitIndex13)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex28)&v2547(VarCurr,bitIndex14)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex29)&v2547(VarCurr,bitIndex15)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex30)&v2547(VarCurr,bitIndex16)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex31)&v2547(VarCurr,bitIndex17)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex32)&v2547(VarCurr,bitIndex18)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex33)&v2547(VarCurr,bitIndex19)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex34)&v2547(VarCurr,bitIndex20)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex35)&v2547(VarCurr,bitIndex21)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex36)&v2547(VarCurr,bitIndex22)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex37)&v2547(VarCurr,bitIndex23)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex38)&v2547(VarCurr,bitIndex24)).
% 93.91/93.34  all VarCurr (v2546(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex39)&v2547(VarCurr,bitIndex25)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr (v2547(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 93.91/93.34  all VarCurr B (range_27_0(B)-> (v2535(VarCurr,B)<->v2536(VarCurr,B)&v2542(VarCurr,B))).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex20)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex21)<->v2473(VarCurr)).
% 93.91/93.34  all VarCurr (v2542(VarCurr,bitIndex22)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr (v2542(VarCurr,bitIndex23)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr (v2542(VarCurr,bitIndex24)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr (v2542(VarCurr,bitIndex25)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr (v2542(VarCurr,bitIndex26)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr (v2542(VarCurr,bitIndex27)<->v2473(VarCurr)).
% 93.91/93.35  all VarCurr B (range_27_0(B)-> (v2536(VarCurr,B)<->v2537(VarCurr,B)|v2539(VarCurr,B))).
% 93.91/93.35  all VarCurr B (range_27_0(B)-> (v2539(VarCurr,B)<->v2540(VarCurr,B)&v2541(VarCurr,B))).
% 93.91/93.35  all B (range_27_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2541(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr ((v2540(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex39))& (v2540(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex38))& (v2540(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex37))& (v2540(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex36))& (v2540(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex35))& (v2540(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex34))& (v2540(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex33))& (v2540(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex32))& (v2540(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex31))& (v2540(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex30))& (v2540(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex29))& (v2540(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex28))& (v2540(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex27))& (v2540(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex26))& (v2540(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex25))& (v2540(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex24))& (v2540(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex23))& (v2540(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex22))& (v2540(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex21))& (v2540(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex20))& (v2540(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex19))& (v2540(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex18))& (v2540(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex17))& (v2540(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex16))& (v2540(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex15))& (v2540(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex14))& (v2540(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex13))).
% 93.91/93.35  all VarCurr (v2540(VarCurr,bitIndex27)<->$F).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex12)&v2538(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex13)&v2538(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex14)&v2538(VarCurr,bitIndex2)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex15)&v2538(VarCurr,bitIndex3)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex16)&v2538(VarCurr,bitIndex4)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex17)&v2538(VarCurr,bitIndex5)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex18)&v2538(VarCurr,bitIndex6)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex19)&v2538(VarCurr,bitIndex7)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex20)&v2538(VarCurr,bitIndex8)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex21)&v2538(VarCurr,bitIndex9)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex22)&v2538(VarCurr,bitIndex10)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex23)&v2538(VarCurr,bitIndex11)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex24)&v2538(VarCurr,bitIndex12)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex25)&v2538(VarCurr,bitIndex13)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex26)&v2538(VarCurr,bitIndex14)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex27)&v2538(VarCurr,bitIndex15)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex28)&v2538(VarCurr,bitIndex16)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex29)&v2538(VarCurr,bitIndex17)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex30)&v2538(VarCurr,bitIndex18)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex31)&v2538(VarCurr,bitIndex19)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex32)&v2538(VarCurr,bitIndex20)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex33)&v2538(VarCurr,bitIndex21)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex34)&v2538(VarCurr,bitIndex22)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex35)&v2538(VarCurr,bitIndex23)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex36)&v2538(VarCurr,bitIndex24)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex37)&v2538(VarCurr,bitIndex25)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex38)&v2538(VarCurr,bitIndex26)).
% 93.91/93.35  all VarCurr (v2537(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex39)&v2538(VarCurr,bitIndex27)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2538(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr B (range_31_0(B)-> (v2512(VarCurr,B)<->v2513(VarCurr,B)&v2531(VarCurr,B))).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex0)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex1)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex2)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex3)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex4)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex5)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex6)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex7)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex8)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex9)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex10)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex11)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex12)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex13)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex14)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex15)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex16)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex17)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex18)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex19)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex20)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex21)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex22)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex23)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex24)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex25)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex26)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex27)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex28)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex29)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex30)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr (v2531(VarCurr,bitIndex31)<->v2484(VarCurr)).
% 93.91/93.35  all VarCurr B (range_31_0(B)-> (v2513(VarCurr,B)<->v2514(VarCurr,B)|v2522(VarCurr,B))).
% 93.91/93.35  all VarCurr B (range_31_0(B)-> (v2522(VarCurr,B)<->v2523(VarCurr,B)&v2530(VarCurr,B))).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2530(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr B (range_29_0(B)-> (v2523(VarCurr,B)<->v2524(VarCurr,B))).
% 93.91/93.35  all VarCurr ((v2523(VarCurr,bitIndex31)<->$F)& (v2523(VarCurr,bitIndex30)<->$F)).
% 93.91/93.35  all VarCurr B (range_29_0(B)-> (v2524(VarCurr,B)<->v2525(VarCurr,B)|v2527(VarCurr,B))).
% 93.91/93.35  all VarCurr B (range_29_0(B)-> (v2527(VarCurr,B)<->v2528(VarCurr,B)&v2529(VarCurr,B))).
% 93.91/93.35  all B (range_29_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2529(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr ((v2528(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex39))& (v2528(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex38))& (v2528(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex37))& (v2528(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex36))& (v2528(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex35))& (v2528(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex34))& (v2528(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex33))& (v2528(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex32))& (v2528(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex31))& (v2528(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex30))& (v2528(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex29))& (v2528(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex28))& (v2528(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex27))& (v2528(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex26))& (v2528(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex25))& (v2528(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex24))& (v2528(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex23))& (v2528(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex22))& (v2528(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex21))& (v2528(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex20))& (v2528(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex19))& (v2528(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex18))& (v2528(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex17))& (v2528(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex16))& (v2528(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex15))& (v2528(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex14))& (v2528(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex13))& (v2528(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex12))& (v2528(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex11))).
% 93.91/93.35  all VarCurr (v2528(VarCurr,bitIndex29)<->$F).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex10)&v2526(VarCurr,bitIndex0)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex11)&v2526(VarCurr,bitIndex1)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex12)&v2526(VarCurr,bitIndex2)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex13)&v2526(VarCurr,bitIndex3)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex14)&v2526(VarCurr,bitIndex4)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex15)&v2526(VarCurr,bitIndex5)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex16)&v2526(VarCurr,bitIndex6)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex17)&v2526(VarCurr,bitIndex7)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex18)&v2526(VarCurr,bitIndex8)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex19)&v2526(VarCurr,bitIndex9)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex20)&v2526(VarCurr,bitIndex10)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex21)&v2526(VarCurr,bitIndex11)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex22)&v2526(VarCurr,bitIndex12)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex23)&v2526(VarCurr,bitIndex13)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex24)&v2526(VarCurr,bitIndex14)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex25)&v2526(VarCurr,bitIndex15)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex26)&v2526(VarCurr,bitIndex16)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex27)&v2526(VarCurr,bitIndex17)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex28)&v2526(VarCurr,bitIndex18)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex29)&v2526(VarCurr,bitIndex19)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex30)&v2526(VarCurr,bitIndex20)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex31)&v2526(VarCurr,bitIndex21)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex32)&v2526(VarCurr,bitIndex22)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex33)&v2526(VarCurr,bitIndex23)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex34)&v2526(VarCurr,bitIndex24)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex35)&v2526(VarCurr,bitIndex25)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex36)&v2526(VarCurr,bitIndex26)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex37)&v2526(VarCurr,bitIndex27)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex38)&v2526(VarCurr,bitIndex28)).
% 93.91/93.35  all VarCurr (v2525(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex39)&v2526(VarCurr,bitIndex29)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.91/93.35  all VarCurr (v2526(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr (v2526(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 93.91/93.36  all VarCurr B (range_31_0(B)-> (v2514(VarCurr,B)<->v2515(VarCurr,B)&v2521(VarCurr,B))).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex20)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex21)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex22)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex23)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex24)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex25)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex26)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex27)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex28)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex29)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex30)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr (v2521(VarCurr,bitIndex31)<->v2473(VarCurr)).
% 93.91/93.36  all VarCurr B (range_31_0(B)-> (v2515(VarCurr,B)<->v2516(VarCurr,B)|v2518(VarCurr,B))).
% 93.91/93.36  all VarCurr B (range_31_0(B)-> (v2518(VarCurr,B)<->v2519(VarCurr,B)&v2520(VarCurr,B))).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.91/93.36  all VarCurr (v2520(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2520(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr ((v2519(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex39))& (v2519(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex38))& (v2519(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex37))& (v2519(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex36))& (v2519(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex35))& (v2519(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex34))& (v2519(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex33))& (v2519(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex32))& (v2519(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex31))& (v2519(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex30))& (v2519(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex29))& (v2519(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex28))& (v2519(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex27))& (v2519(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex26))& (v2519(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex25))& (v2519(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex24))& (v2519(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex23))& (v2519(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex22))& (v2519(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex21))& (v2519(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex20))& (v2519(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex19))& (v2519(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex18))& (v2519(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex17))& (v2519(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex16))& (v2519(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex15))& (v2519(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex14))& (v2519(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex13))& (v2519(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex12))& (v2519(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex11))& (v2519(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex10))& (v2519(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex9))).
% 93.99/93.36  all VarCurr (v2519(VarCurr,bitIndex31)<->$F).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex8)&v2517(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex9)&v2517(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex10)&v2517(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex11)&v2517(VarCurr,bitIndex3)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex12)&v2517(VarCurr,bitIndex4)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex13)&v2517(VarCurr,bitIndex5)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex14)&v2517(VarCurr,bitIndex6)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex15)&v2517(VarCurr,bitIndex7)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex16)&v2517(VarCurr,bitIndex8)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex17)&v2517(VarCurr,bitIndex9)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex18)&v2517(VarCurr,bitIndex10)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex19)&v2517(VarCurr,bitIndex11)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex20)&v2517(VarCurr,bitIndex12)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex21)&v2517(VarCurr,bitIndex13)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex22)&v2517(VarCurr,bitIndex14)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex23)&v2517(VarCurr,bitIndex15)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex24)&v2517(VarCurr,bitIndex16)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex25)&v2517(VarCurr,bitIndex17)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex26)&v2517(VarCurr,bitIndex18)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex27)&v2517(VarCurr,bitIndex19)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex28)&v2517(VarCurr,bitIndex20)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex29)&v2517(VarCurr,bitIndex21)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex30)&v2517(VarCurr,bitIndex22)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex31)&v2517(VarCurr,bitIndex23)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex32)&v2517(VarCurr,bitIndex24)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex33)&v2517(VarCurr,bitIndex25)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex34)&v2517(VarCurr,bitIndex26)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex35)&v2517(VarCurr,bitIndex27)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex36)&v2517(VarCurr,bitIndex28)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex37)&v2517(VarCurr,bitIndex29)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex38)&v2517(VarCurr,bitIndex30)).
% 93.99/93.36  all VarCurr (v2516(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex39)&v2517(VarCurr,bitIndex31)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex30)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr (v2517(VarCurr,bitIndex31)<->v2468(VarCurr)).
% 93.99/93.36  all VarCurr B (range_39_0(B)-> (v2458(VarCurr,B)<->v2459(VarCurr,B)&v2506(VarCurr,B))).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex0)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex1)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex2)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex3)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex4)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex5)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex6)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex7)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex8)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex9)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex10)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex11)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex12)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex13)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex14)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex15)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex16)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex17)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex18)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex19)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex20)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex21)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex22)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex23)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex24)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex25)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex26)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex27)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex28)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex29)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex30)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex31)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex32)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex33)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex34)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex35)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex36)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex37)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex38)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (v2506(VarCurr,bitIndex39)<->v2507(VarCurr)).
% 93.99/93.36  all VarCurr (-v2507(VarCurr)<->v2453(VarCurr,bitIndex3)).
% 93.99/93.36  all VarCurr B (range_39_0(B)-> (v2459(VarCurr,B)<->v2460(VarCurr,B)|v2485(VarCurr,B))).
% 93.99/93.36  all VarCurr B (range_39_0(B)-> (v2485(VarCurr,B)<->v2486(VarCurr,B)&v2505(VarCurr,B))).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex38)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr (v2505(VarCurr,bitIndex39)<->v2453(VarCurr,bitIndex2)).
% 93.99/93.36  all VarCurr B (range_35_0(B)-> (v2486(VarCurr,B)<->v2487(VarCurr,B))).
% 93.99/93.36  all VarCurr ((v2486(VarCurr,bitIndex39)<->$F)& (v2486(VarCurr,bitIndex38)<->$F)& (v2486(VarCurr,bitIndex37)<->$F)& (v2486(VarCurr,bitIndex36)<->$F)).
% 93.99/93.36  all VarCurr B (range_35_0(B)-> (v2487(VarCurr,B)<->v2488(VarCurr,B)|v2496(VarCurr,B))).
% 93.99/93.36  all VarCurr B (range_35_0(B)-> (v2496(VarCurr,B)<->v2497(VarCurr,B)&v2504(VarCurr,B))).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr (v2504(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex1)).
% 93.99/93.36  all VarCurr B (range_33_0(B)-> (v2497(VarCurr,B)<->v2498(VarCurr,B))).
% 93.99/93.36  all VarCurr ((v2497(VarCurr,bitIndex35)<->$F)& (v2497(VarCurr,bitIndex34)<->$F)).
% 93.99/93.36  all VarCurr B (range_33_0(B)-> (v2498(VarCurr,B)<->v2499(VarCurr,B)|v2501(VarCurr,B))).
% 93.99/93.36  all VarCurr B (range_33_0(B)-> (v2501(VarCurr,B)<->v2502(VarCurr,B)&v2503(VarCurr,B))).
% 93.99/93.36  all B (range_33_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B).
% 93.99/93.36  all VarCurr (v2503(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2503(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.36  all VarCurr (v2503(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2503(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr ((v2502(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex39))& (v2502(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex38))& (v2502(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex37))& (v2502(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex36))& (v2502(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex35))& (v2502(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex34))& (v2502(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex33))& (v2502(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex32))& (v2502(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex31))& (v2502(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex30))& (v2502(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex29))& (v2502(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex28))& (v2502(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex27))& (v2502(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex26))& (v2502(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex25))& (v2502(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex24))& (v2502(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex23))& (v2502(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex22))& (v2502(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex21))& (v2502(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex20))& (v2502(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex19))& (v2502(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex18))& (v2502(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex17))& (v2502(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex16))& (v2502(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex15))& (v2502(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex14))& (v2502(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex13))& (v2502(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex12))& (v2502(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex11))& (v2502(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex10))& (v2502(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex9))& (v2502(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex8))& (v2502(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex7))).
% 93.99/93.37  all VarCurr (v2502(VarCurr,bitIndex33)<->$F).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex6)&v2500(VarCurr,bitIndex0)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex7)&v2500(VarCurr,bitIndex1)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex8)&v2500(VarCurr,bitIndex2)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex9)&v2500(VarCurr,bitIndex3)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex10)&v2500(VarCurr,bitIndex4)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex11)&v2500(VarCurr,bitIndex5)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex12)&v2500(VarCurr,bitIndex6)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex13)&v2500(VarCurr,bitIndex7)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex14)&v2500(VarCurr,bitIndex8)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex15)&v2500(VarCurr,bitIndex9)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex16)&v2500(VarCurr,bitIndex10)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex17)&v2500(VarCurr,bitIndex11)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex18)&v2500(VarCurr,bitIndex12)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex19)&v2500(VarCurr,bitIndex13)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex20)&v2500(VarCurr,bitIndex14)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex21)&v2500(VarCurr,bitIndex15)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex22)&v2500(VarCurr,bitIndex16)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex23)&v2500(VarCurr,bitIndex17)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex24)&v2500(VarCurr,bitIndex18)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex25)&v2500(VarCurr,bitIndex19)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex26)&v2500(VarCurr,bitIndex20)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex27)&v2500(VarCurr,bitIndex21)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex28)&v2500(VarCurr,bitIndex22)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex29)&v2500(VarCurr,bitIndex23)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex30)&v2500(VarCurr,bitIndex24)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex31)&v2500(VarCurr,bitIndex25)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex32)&v2500(VarCurr,bitIndex26)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex33)&v2500(VarCurr,bitIndex27)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex34)&v2500(VarCurr,bitIndex28)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex35)&v2500(VarCurr,bitIndex29)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex36)&v2500(VarCurr,bitIndex30)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex37)&v2500(VarCurr,bitIndex31)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex38)&v2500(VarCurr,bitIndex32)).
% 93.99/93.37  all VarCurr (v2499(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex39)&v2500(VarCurr,bitIndex33)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 93.99/93.37  all VarCurr (v2500(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex30)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex31)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex32)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2500(VarCurr,bitIndex33)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr B (range_35_0(B)-> (v2488(VarCurr,B)<->v2489(VarCurr,B)&v2495(VarCurr,B))).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex20)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex21)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex22)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex23)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex24)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex25)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex26)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex27)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex28)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex29)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex30)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex31)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex32)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex33)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex34)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr (v2495(VarCurr,bitIndex35)<->v2473(VarCurr)).
% 94.00/93.37  all VarCurr B (range_35_0(B)-> (v2489(VarCurr,B)<->v2490(VarCurr,B)|v2492(VarCurr,B))).
% 94.00/93.37  all VarCurr B (range_35_0(B)-> (v2492(VarCurr,B)<->v2493(VarCurr,B)&v2494(VarCurr,B))).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2494(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr ((v2493(VarCurr,bitIndex34)<->v2465(VarCurr,bitIndex39))& (v2493(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex38))& (v2493(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex37))& (v2493(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex36))& (v2493(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex35))& (v2493(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex34))& (v2493(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex33))& (v2493(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex32))& (v2493(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex31))& (v2493(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex30))& (v2493(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex29))& (v2493(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex28))& (v2493(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex27))& (v2493(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex26))& (v2493(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex25))& (v2493(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex24))& (v2493(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex23))& (v2493(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex22))& (v2493(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex21))& (v2493(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex20))& (v2493(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex19))& (v2493(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex18))& (v2493(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex17))& (v2493(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex16))& (v2493(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex15))& (v2493(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex14))& (v2493(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex13))& (v2493(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex12))& (v2493(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex11))& (v2493(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex10))& (v2493(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex9))& (v2493(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex8))& (v2493(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex7))& (v2493(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex6))& (v2493(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex5))).
% 94.00/93.37  all VarCurr (v2493(VarCurr,bitIndex35)<->$F).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex4)&v2491(VarCurr,bitIndex0)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex5)&v2491(VarCurr,bitIndex1)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex6)&v2491(VarCurr,bitIndex2)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex7)&v2491(VarCurr,bitIndex3)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex8)&v2491(VarCurr,bitIndex4)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex9)&v2491(VarCurr,bitIndex5)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex10)&v2491(VarCurr,bitIndex6)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex11)&v2491(VarCurr,bitIndex7)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex12)&v2491(VarCurr,bitIndex8)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex13)&v2491(VarCurr,bitIndex9)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex14)&v2491(VarCurr,bitIndex10)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex15)&v2491(VarCurr,bitIndex11)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex16)&v2491(VarCurr,bitIndex12)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex17)&v2491(VarCurr,bitIndex13)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex18)&v2491(VarCurr,bitIndex14)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex19)&v2491(VarCurr,bitIndex15)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex20)&v2491(VarCurr,bitIndex16)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex21)&v2491(VarCurr,bitIndex17)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex22)&v2491(VarCurr,bitIndex18)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex23)&v2491(VarCurr,bitIndex19)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex24)&v2491(VarCurr,bitIndex20)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex25)&v2491(VarCurr,bitIndex21)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex26)&v2491(VarCurr,bitIndex22)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex27)&v2491(VarCurr,bitIndex23)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex28)&v2491(VarCurr,bitIndex24)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex29)&v2491(VarCurr,bitIndex25)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex30)&v2491(VarCurr,bitIndex26)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex31)&v2491(VarCurr,bitIndex27)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex32)&v2491(VarCurr,bitIndex28)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex33)&v2491(VarCurr,bitIndex29)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex34)&v2491(VarCurr,bitIndex30)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex35)&v2491(VarCurr,bitIndex31)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex36)&v2491(VarCurr,bitIndex32)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex37)&v2491(VarCurr,bitIndex33)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex34)<->v2465(VarCurr,bitIndex38)&v2491(VarCurr,bitIndex34)).
% 94.00/93.37  all VarCurr (v2490(VarCurr,bitIndex35)<->v2465(VarCurr,bitIndex39)&v2491(VarCurr,bitIndex35)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 94.00/93.37  all VarCurr (v2491(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex30)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex31)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex32)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex33)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex34)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2491(VarCurr,bitIndex35)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2460(VarCurr,B)<->v2461(VarCurr,B)&v2483(VarCurr,B))).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex0)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex1)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex2)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex3)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex4)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex5)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex6)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex7)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex8)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex9)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex10)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex11)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex12)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex13)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex14)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex15)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex16)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex17)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex18)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex19)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex20)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex21)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex22)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex23)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex24)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex25)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex26)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex27)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex28)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex29)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex30)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex31)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex32)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex33)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex34)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex35)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex36)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex37)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex38)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (v2483(VarCurr,bitIndex39)<->v2484(VarCurr)).
% 94.00/93.38  all VarCurr (-v2484(VarCurr)<->v2453(VarCurr,bitIndex2)).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2461(VarCurr,B)<->v2462(VarCurr,B)|v2474(VarCurr,B))).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2474(VarCurr,B)<->v2475(VarCurr,B)&v2482(VarCurr,B))).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex38)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2482(VarCurr,bitIndex39)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr B (range_37_0(B)-> (v2475(VarCurr,B)<->v2476(VarCurr,B))).
% 94.00/93.38  all VarCurr ((v2475(VarCurr,bitIndex39)<->$F)& (v2475(VarCurr,bitIndex38)<->$F)).
% 94.00/93.38  all VarCurr B (range_37_0(B)-> (v2476(VarCurr,B)<->v2477(VarCurr,B)|v2479(VarCurr,B))).
% 94.00/93.38  all VarCurr B (range_37_0(B)-> (v2479(VarCurr,B)<->v2480(VarCurr,B)&v2481(VarCurr,B))).
% 94.00/93.38  all B (range_37_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2481(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr ((v2480(VarCurr,bitIndex36)<->v2465(VarCurr,bitIndex39))& (v2480(VarCurr,bitIndex35)<->v2465(VarCurr,bitIndex38))& (v2480(VarCurr,bitIndex34)<->v2465(VarCurr,bitIndex37))& (v2480(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex36))& (v2480(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex35))& (v2480(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex34))& (v2480(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex33))& (v2480(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex32))& (v2480(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex31))& (v2480(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex30))& (v2480(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex29))& (v2480(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex28))& (v2480(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex27))& (v2480(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex26))& (v2480(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex25))& (v2480(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex24))& (v2480(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex23))& (v2480(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex22))& (v2480(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex21))& (v2480(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex20))& (v2480(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex19))& (v2480(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex18))& (v2480(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex17))& (v2480(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex16))& (v2480(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex15))& (v2480(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex14))& (v2480(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex13))& (v2480(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex12))& (v2480(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex11))& (v2480(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex10))& (v2480(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex9))& (v2480(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex8))& (v2480(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex7))& (v2480(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex6))& 
(v2480(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex5))& (v2480(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex4))& (v2480(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex3))).
% 94.00/93.38  all VarCurr (v2480(VarCurr,bitIndex37)<->$F).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex2)&v2478(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex3)&v2478(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex4)&v2478(VarCurr,bitIndex2)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex5)&v2478(VarCurr,bitIndex3)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex6)&v2478(VarCurr,bitIndex4)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex7)&v2478(VarCurr,bitIndex5)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex8)&v2478(VarCurr,bitIndex6)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex9)&v2478(VarCurr,bitIndex7)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex10)&v2478(VarCurr,bitIndex8)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex11)&v2478(VarCurr,bitIndex9)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex12)&v2478(VarCurr,bitIndex10)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex13)&v2478(VarCurr,bitIndex11)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex14)&v2478(VarCurr,bitIndex12)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex15)&v2478(VarCurr,bitIndex13)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex16)&v2478(VarCurr,bitIndex14)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex17)&v2478(VarCurr,bitIndex15)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex18)&v2478(VarCurr,bitIndex16)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex19)&v2478(VarCurr,bitIndex17)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex20)&v2478(VarCurr,bitIndex18)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex21)&v2478(VarCurr,bitIndex19)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex22)&v2478(VarCurr,bitIndex20)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex23)&v2478(VarCurr,bitIndex21)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex24)&v2478(VarCurr,bitIndex22)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex25)&v2478(VarCurr,bitIndex23)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex26)&v2478(VarCurr,bitIndex24)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex27)&v2478(VarCurr,bitIndex25)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex28)&v2478(VarCurr,bitIndex26)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex29)&v2478(VarCurr,bitIndex27)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex30)&v2478(VarCurr,bitIndex28)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex31)&v2478(VarCurr,bitIndex29)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex32)&v2478(VarCurr,bitIndex30)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex33)&v2478(VarCurr,bitIndex31)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex34)&v2478(VarCurr,bitIndex32)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex35)&v2478(VarCurr,bitIndex33)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex34)<->v2465(VarCurr,bitIndex36)&v2478(VarCurr,bitIndex34)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex35)<->v2465(VarCurr,bitIndex37)&v2478(VarCurr,bitIndex35)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex36)<->v2465(VarCurr,bitIndex38)&v2478(VarCurr,bitIndex36)).
% 94.00/93.38  all VarCurr (v2477(VarCurr,bitIndex37)<->v2465(VarCurr,bitIndex39)&v2478(VarCurr,bitIndex37)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex30)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex31)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex32)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex33)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex34)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex35)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex36)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr (v2478(VarCurr,bitIndex37)<->v2468(VarCurr)).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2462(VarCurr,B)<->v2463(VarCurr,B)&v2472(VarCurr,B))).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex0)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex1)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex2)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex3)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex4)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex5)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex6)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex7)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex8)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex9)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex10)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex11)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex12)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex13)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex14)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex15)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex16)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex17)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex18)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex19)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex20)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex21)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex22)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex23)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex24)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex25)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex26)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex27)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex28)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex29)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex30)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex31)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex32)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex33)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex34)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex35)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex36)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex37)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex38)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (v2472(VarCurr,bitIndex39)<->v2473(VarCurr)).
% 94.00/93.38  all VarCurr (-v2473(VarCurr)<->v2453(VarCurr,bitIndex1)).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2463(VarCurr,B)<->v2464(VarCurr,B)|v2469(VarCurr,B))).
% 94.00/93.38  all VarCurr B (range_39_0(B)-> (v2469(VarCurr,B)<->v2470(VarCurr,B)&v2471(VarCurr,B))).
% 94.00/93.38  all VarCurr (v2471(VarCurr,bitIndex0)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.38  all VarCurr (v2471(VarCurr,bitIndex1)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex2)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex3)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex4)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex5)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex6)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex7)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex8)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex9)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex10)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex11)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex12)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex13)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex14)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex15)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex16)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex17)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex18)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex19)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex20)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex21)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex22)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex23)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex24)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex25)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex26)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex27)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex28)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex29)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex30)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex31)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex32)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex33)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex34)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex35)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex36)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex37)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex38)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2471(VarCurr,bitIndex39)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr ((v2470(VarCurr,bitIndex38)<->v2465(VarCurr,bitIndex39))& (v2470(VarCurr,bitIndex37)<->v2465(VarCurr,bitIndex38))& (v2470(VarCurr,bitIndex36)<->v2465(VarCurr,bitIndex37))& (v2470(VarCurr,bitIndex35)<->v2465(VarCurr,bitIndex36))& (v2470(VarCurr,bitIndex34)<->v2465(VarCurr,bitIndex35))& (v2470(VarCurr,bitIndex33)<->v2465(VarCurr,bitIndex34))& (v2470(VarCurr,bitIndex32)<->v2465(VarCurr,bitIndex33))& (v2470(VarCurr,bitIndex31)<->v2465(VarCurr,bitIndex32))& (v2470(VarCurr,bitIndex30)<->v2465(VarCurr,bitIndex31))& (v2470(VarCurr,bitIndex29)<->v2465(VarCurr,bitIndex30))& (v2470(VarCurr,bitIndex28)<->v2465(VarCurr,bitIndex29))& (v2470(VarCurr,bitIndex27)<->v2465(VarCurr,bitIndex28))& (v2470(VarCurr,bitIndex26)<->v2465(VarCurr,bitIndex27))& (v2470(VarCurr,bitIndex25)<->v2465(VarCurr,bitIndex26))& (v2470(VarCurr,bitIndex24)<->v2465(VarCurr,bitIndex25))& (v2470(VarCurr,bitIndex23)<->v2465(VarCurr,bitIndex24))& (v2470(VarCurr,bitIndex22)<->v2465(VarCurr,bitIndex23))& (v2470(VarCurr,bitIndex21)<->v2465(VarCurr,bitIndex22))& (v2470(VarCurr,bitIndex20)<->v2465(VarCurr,bitIndex21))& (v2470(VarCurr,bitIndex19)<->v2465(VarCurr,bitIndex20))& (v2470(VarCurr,bitIndex18)<->v2465(VarCurr,bitIndex19))& (v2470(VarCurr,bitIndex17)<->v2465(VarCurr,bitIndex18))& (v2470(VarCurr,bitIndex16)<->v2465(VarCurr,bitIndex17))& (v2470(VarCurr,bitIndex15)<->v2465(VarCurr,bitIndex16))& (v2470(VarCurr,bitIndex14)<->v2465(VarCurr,bitIndex15))& (v2470(VarCurr,bitIndex13)<->v2465(VarCurr,bitIndex14))& (v2470(VarCurr,bitIndex12)<->v2465(VarCurr,bitIndex13))& (v2470(VarCurr,bitIndex11)<->v2465(VarCurr,bitIndex12))& (v2470(VarCurr,bitIndex10)<->v2465(VarCurr,bitIndex11))& (v2470(VarCurr,bitIndex9)<->v2465(VarCurr,bitIndex10))& (v2470(VarCurr,bitIndex8)<->v2465(VarCurr,bitIndex9))& (v2470(VarCurr,bitIndex7)<->v2465(VarCurr,bitIndex8))& (v2470(VarCurr,bitIndex6)<->v2465(VarCurr,bitIndex7))& (v2470(VarCurr,bitIndex5)<->v2465(VarCurr,bitIndex6))& 
(v2470(VarCurr,bitIndex4)<->v2465(VarCurr,bitIndex5))& (v2470(VarCurr,bitIndex3)<->v2465(VarCurr,bitIndex4))& (v2470(VarCurr,bitIndex2)<->v2465(VarCurr,bitIndex3))& (v2470(VarCurr,bitIndex1)<->v2465(VarCurr,bitIndex2))& (v2470(VarCurr,bitIndex0)<->v2465(VarCurr,bitIndex1))).
% 94.00/93.39  all VarCurr (v2470(VarCurr,bitIndex39)<->$F).
% 94.00/93.39  all VarCurr B (range_39_0(B)-> (v2464(VarCurr,B)<->v2465(VarCurr,B)&v2467(VarCurr,B))).
% 94.00/93.39  all B (range_39_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex0)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex1)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex2)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex3)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex4)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex5)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex6)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex7)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex8)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex9)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex10)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex11)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex12)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex13)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex14)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex15)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex16)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex17)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex18)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex19)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex20)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex21)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex22)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex23)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex24)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex25)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex26)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex27)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex28)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex29)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex30)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex31)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex32)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex33)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex34)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex35)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex36)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex37)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex38)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (v2467(VarCurr,bitIndex39)<->v2468(VarCurr)).
% 94.00/93.39  all VarCurr (-v2468(VarCurr)<->v2453(VarCurr,bitIndex0)).
% 94.00/93.39  all B (range_4_0(B)-> (v2453(constB0,B)<->$F)).
% 94.00/93.39  all VarCurr B (range_31_0(B)-> (v2465(VarCurr,B)<->v2451(VarCurr,B))).
% 94.00/93.39  all VarCurr ((v2465(VarCurr,bitIndex39)<->v2451(VarCurr,bitIndex7))& (v2465(VarCurr,bitIndex38)<->v2451(VarCurr,bitIndex6))& (v2465(VarCurr,bitIndex37)<->v2451(VarCurr,bitIndex5))& (v2465(VarCurr,bitIndex36)<->v2451(VarCurr,bitIndex4))& (v2465(VarCurr,bitIndex35)<->v2451(VarCurr,bitIndex3))& (v2465(VarCurr,bitIndex34)<->v2451(VarCurr,bitIndex2))& (v2465(VarCurr,bitIndex33)<->v2451(VarCurr,bitIndex1))& (v2465(VarCurr,bitIndex32)<->v2451(VarCurr,bitIndex0))).
% 94.00/93.39  all B (range_31_0(B)-> (v2451(constB0,B)<->$T)).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex31).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex30).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex29).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex28).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex27).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex26).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex25).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex24).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex23).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex22).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex21).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex20).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex19).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex18).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex17).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex16).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex15).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex14).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex13).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex12).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex11).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex10).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex9).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex8).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex7).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex6).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex5).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex4).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex3).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex2).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex1).
% 94.00/93.39  b11111111111111111111111111111111(bitIndex0).
% 94.00/93.39  all VarCurr (v2426(VarCurr)<->v2437(VarCurr)&v2439(VarCurr)).
% 94.00/93.39  all VarCurr (-v2439(VarCurr)<->v2322(VarCurr)).
% 94.00/93.39  all VarCurr (v2437(VarCurr)<->v2438(VarCurr)&v2365(VarCurr)).
% 94.00/93.39  all VarCurr (v2438(VarCurr)<->v2304(VarCurr)&v2428(VarCurr)).
% 94.00/93.39  all VarCurr (v2428(VarCurr)<->v2430(VarCurr)).
% 94.00/93.39  all VarCurr (v2430(VarCurr)<->v2432(VarCurr)).
% 94.00/93.39  all VarCurr (-v2434(VarCurr)-> (v2432(VarCurr)<->$F)).
% 94.00/93.39  all VarCurr (v2434(VarCurr)-> (v2432(VarCurr)<->$T)).
% 94.00/93.39  all VarCurr (v2434(VarCurr)<->v2435(VarCurr)&v170(VarCurr)).
% 94.00/93.39  all VarCurr (-v2435(VarCurr)<->v145(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2422(VarCurr)<->v2424(VarCurr)).
% 94.00/93.39  all VarCurr (v2424(VarCurr)<->v149(VarCurr,bitIndex53)).
% 94.00/93.39  all VarCurr (v149(VarCurr,bitIndex53)<->v151(VarCurr,bitIndex53)).
% 94.00/93.39  all VarCurr (v151(VarCurr,bitIndex53)<->v156(VarCurr,bitIndex53)).
% 94.00/93.39  all VarCurr (v2412(VarCurr)<->v2414(VarCurr)&v2416(VarCurr)).
% 94.00/93.39  all VarCurr (-v2416(VarCurr)<->v2322(VarCurr)).
% 94.00/93.39  all VarCurr (v2414(VarCurr)<->v2306(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2306(VarCurr,bitIndex0)<->v2394(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (-v2409(VarCurr)-> (v2302(VarCurr,bitIndex9)<->$F)).
% 94.00/93.39  all VarCurr (v2409(VarCurr)-> (v2302(VarCurr,bitIndex9)<->$T)).
% 94.00/93.39  all VarCurr (v2409(VarCurr)<->v2304(VarCurr)&v2410(VarCurr)).
% 94.00/93.39  all VarCurr (v2410(VarCurr)<-> ($T<->v2397(VarCurr,bitIndex8))).
% 94.00/93.39  all VarCurr (-v2406(VarCurr)-> (v2302(VarCurr,bitIndex6)<->$F)).
% 94.00/93.39  all VarCurr (v2406(VarCurr)-> (v2302(VarCurr,bitIndex6)<->$T)).
% 94.00/93.39  all VarCurr (v2406(VarCurr)<->v2304(VarCurr)&v2407(VarCurr)).
% 94.00/93.39  all VarCurr (v2407(VarCurr)<-> ($T<->v2397(VarCurr,bitIndex5))).
% 94.00/93.39  all VarCurr (-v2399(VarCurr)-> (v2302(VarCurr,bitIndex3)<->$F)).
% 94.00/93.39  all VarCurr (v2399(VarCurr)-> (v2302(VarCurr,bitIndex3)<->$T)).
% 94.00/93.39  all VarCurr (v2399(VarCurr)<->v2400(VarCurr)&v2402(VarCurr)).
% 94.00/93.39  all VarCurr (v2402(VarCurr)<-> ($T<->v2397(VarCurr,bitIndex2))).
% 94.00/93.39  -v2397(constB0,bitIndex11).
% 94.00/93.39  -v2397(constB0,bitIndex10).
% 94.00/93.39  -v2397(constB0,bitIndex8).
% 94.00/93.39  -v2397(constB0,bitIndex7).
% 94.00/93.39  -v2397(constB0,bitIndex5).
% 94.00/93.39  -v2397(constB0,bitIndex4).
% 94.00/93.39  -v2397(constB0,bitIndex2).
% 94.00/93.39  -v2397(constB0,bitIndex1).
% 94.00/93.39  -bx00x00x00x00(bitIndex10).
% 94.00/93.39  -bx00x00x00x00(bitIndex9).
% 94.00/93.39  -bx00x00x00x00(bitIndex7).
% 94.00/93.39  -bx00x00x00x00(bitIndex6).
% 94.00/93.39  -bx00x00x00x00(bitIndex4).
% 94.00/93.39  -bx00x00x00x00(bitIndex3).
% 94.00/93.39  -bx00x00x00x00(bitIndex1).
% 94.00/93.39  -bx00x00x00x00(bitIndex0).
% 94.00/93.39  all VarCurr (v2400(VarCurr)<->v2365(VarCurr)&v2304(VarCurr)).
% 94.00/93.39  all VarCurr (v2304(VarCurr)<->v2306(VarCurr,bitIndex1)).
% 94.00/93.39  all VarCurr (v2306(VarCurr,bitIndex1)<->v2394(VarCurr,bitIndex1)).
% 94.00/93.39  all VarCurr B (range_1_0(B)-> (v2394(VarCurr,B)<->v2308(VarCurr,B)&v2395(VarCurr,B))).
% 94.00/93.39  all VarCurr B (range_1_0(B)-> (v2395(VarCurr,B)<-> -v2338(VarCurr,B))).
% 94.00/93.39  all VarCurr (v2338(VarCurr,bitIndex1)<->v2338(VarCurr,bitIndex0)|v2308(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2308(VarCurr,bitIndex0)<->v2335(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v2336(VarCurr)<->v2390(VarCurr)&v2392(VarCurr)).
% 94.00/93.39  all VarCurr (-v2392(VarCurr)<->v2312(VarCurr)).
% 94.00/93.39  all VarCurr (v2390(VarCurr)<->v2391(VarCurr)&v2365(VarCurr)).
% 94.00/93.39  all VarCurr (v2391(VarCurr)<-> -(v2341(VarCurr)<->v2363(VarCurr))).
% 94.00/93.39  all VarCurr (v2365(VarCurr)<->v2367(VarCurr)).
% 94.00/93.39  all VarCurr (v2367(VarCurr)<->v2369(VarCurr)).
% 94.00/93.39  all VarCurr (v2369(VarCurr)<->v2374(VarCurr)|v2371(VarCurr,bitIndex15)).
% 94.00/93.39  all VarCurr (v2374(VarCurr)<->v2375(VarCurr)|v2371(VarCurr,bitIndex14)).
% 94.00/93.39  all VarCurr (v2375(VarCurr)<->v2376(VarCurr)|v2371(VarCurr,bitIndex13)).
% 94.00/93.39  all VarCurr (v2376(VarCurr)<->v2377(VarCurr)|v2371(VarCurr,bitIndex12)).
% 94.00/93.39  all VarCurr (v2377(VarCurr)<->v2378(VarCurr)|v2371(VarCurr,bitIndex11)).
% 94.00/93.39  all VarCurr (v2378(VarCurr)<->v2379(VarCurr)|v2371(VarCurr,bitIndex10)).
% 94.00/93.39  all VarCurr (v2379(VarCurr)<->v2380(VarCurr)|v2371(VarCurr,bitIndex9)).
% 94.00/93.39  all VarCurr (v2380(VarCurr)<->v2381(VarCurr)|v2371(VarCurr,bitIndex8)).
% 94.00/93.39  all VarCurr (v2381(VarCurr)<->v2382(VarCurr)|v2371(VarCurr,bitIndex7)).
% 94.00/93.39  all VarCurr (v2382(VarCurr)<->v2383(VarCurr)|v2371(VarCurr,bitIndex6)).
% 94.00/93.39  all VarCurr (v2383(VarCurr)<->v2384(VarCurr)|v2371(VarCurr,bitIndex5)).
% 94.00/93.39  all VarCurr (v2384(VarCurr)<->v2385(VarCurr)|v2371(VarCurr,bitIndex4)).
% 94.00/93.39  all VarCurr (v2385(VarCurr)<->v2386(VarCurr)|v2371(VarCurr,bitIndex3)).
% 94.00/93.39  all VarCurr (v2386(VarCurr)<->v2387(VarCurr)|v2371(VarCurr,bitIndex2)).
% 94.00/93.39  all VarCurr (v2387(VarCurr)<->v2371(VarCurr,bitIndex0)|v2371(VarCurr,bitIndex1)).
% 94.00/93.39  all B (range_15_0(B)-> (v2371(constB0,B)<->$T)).
% 94.00/93.39  b1111111111111111(bitIndex15).
% 94.00/93.39  b1111111111111111(bitIndex14).
% 94.00/93.39  b1111111111111111(bitIndex13).
% 94.00/93.39  b1111111111111111(bitIndex12).
% 94.00/93.39  b1111111111111111(bitIndex11).
% 94.00/93.39  b1111111111111111(bitIndex10).
% 94.00/93.39  b1111111111111111(bitIndex9).
% 94.00/93.39  b1111111111111111(bitIndex8).
% 94.00/93.39  b1111111111111111(bitIndex7).
% 94.00/93.39  b1111111111111111(bitIndex6).
% 94.00/93.39  b1111111111111111(bitIndex5).
% 94.00/93.39  b1111111111111111(bitIndex4).
% 94.00/93.39  b1111111111111111(bitIndex3).
% 94.00/93.39  b1111111111111111(bitIndex2).
% 94.00/93.39  b1111111111111111(bitIndex1).
% 94.00/93.39  b1111111111111111(bitIndex0).
% 94.00/93.39  all VarCurr (v2341(VarCurr)<->v2343(VarCurr)).
% 94.00/93.39  all VarCurr (v2343(VarCurr)<->v2345(VarCurr)).
% 94.00/93.39  all VarCurr (v2345(VarCurr)<->v2347(VarCurr)).
% 94.00/93.39  all VarCurr (v2347(VarCurr)<->v2349(VarCurr)).
% 94.00/93.39  all VarCurr (v2349(VarCurr)<->v2351(VarCurr)).
% 94.00/93.39  all VarCurr (v2351(VarCurr)<->v2353(VarCurr)).
% 94.00/93.39  all VarCurr (v2353(VarCurr)<->v2355(VarCurr)).
% 94.00/93.39  all VarCurr (v2355(VarCurr)<->v2357(VarCurr)).
% 94.00/93.39  all VarCurr (v2357(VarCurr)<->v2359(VarCurr)).
% 94.00/93.39  all VarCurr (v2359(VarCurr)<->v2361(VarCurr)).
% 94.00/93.39  v2361(constB0)<->$F.
% 94.00/93.39  all VarCurr (v2338(VarCurr,bitIndex0)<->$F).
% 94.00/93.39  all VarCurr (v2308(VarCurr,bitIndex1)<->v2335(VarCurr,bitIndex1)).
% 94.00/93.39  all VarCurr (v2335(VarCurr,bitIndex0)<->v2336(VarCurr)).
% 94.00/93.39  all VarCurr (v2335(VarCurr,bitIndex1)<->v2310(VarCurr)).
% 94.00/93.39  all VarCurr (v2310(VarCurr)<->v2331(VarCurr)&v2334(VarCurr)).
% 94.00/93.39  all VarCurr (-v2334(VarCurr)<->v2320(VarCurr)).
% 94.00/93.39  all VarCurr (v2331(VarCurr)<->v2332(VarCurr)&v2333(VarCurr)).
% 94.00/93.39  all VarCurr (-v2333(VarCurr)<->v2312(VarCurr)).
% 94.00/93.39  all VarCurr (-v2332(VarCurr)<->v129(VarCurr)).
% 94.00/93.39  all VarCurr (v2320(VarCurr)<->v2328(VarCurr)|v2326(VarCurr)).
% 94.00/93.39  v2326(constB0)<->$F.
% 94.00/93.39  all VarCurr (v2328(VarCurr)<->v2322(VarCurr)&v2329(VarCurr)).
% 94.00/93.39  all VarCurr (-v2329(VarCurr)<->v2324(VarCurr)).
% 94.00/93.39  v2324(constB0)<->$F.
% 94.00/93.39  v2322(constB0)<->$F.
% 94.00/93.39  all VarCurr (v2312(VarCurr)<->v2314(VarCurr)).
% 94.00/93.39  all VarCurr (v2314(VarCurr)<->v2316(VarCurr)).
% 94.00/93.39  all VarCurr (v2316(VarCurr)<->v2318(VarCurr)).
% 94.00/93.39  all VarCurr (v2275(VarCurr)<->v2278(VarCurr)&v875(VarCurr)).
% 94.00/93.39  all VarCurr (v2278(VarCurr)<->v2279(VarCurr)|v2288(VarCurr)).
% 94.00/93.39  all VarCurr (v2288(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$T)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.00/93.39  all VarCurr (v2279(VarCurr)<->v2280(VarCurr)|v2287(VarCurr)).
% 94.00/93.39  all VarCurr (v2287(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$T)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.00/93.39  b1110(bitIndex3).
% 94.00/93.39  b1110(bitIndex2).
% 94.00/93.39  b1110(bitIndex1).
% 94.00/93.39  -b1110(bitIndex0).
% 94.00/93.39  all VarCurr (v2280(VarCurr)<->v2281(VarCurr)|v2286(VarCurr)).
% 94.00/93.39  all VarCurr (v2286(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.00/93.39  all VarCurr (v2281(VarCurr)<->v2282(VarCurr)|v2285(VarCurr)).
% 94.00/93.39  all VarCurr (v2285(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.00/93.39  all VarCurr (v2282(VarCurr)<->v2283(VarCurr)|v2284(VarCurr)).
% 94.00/93.39  all VarCurr (v2284(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$F)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.00/93.39  all VarCurr (v2283(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$T)& (v743(VarCurr,bitIndex2)<->$F)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.00/93.39  b1000(bitIndex3).
% 94.00/93.39  -b1000(bitIndex2).
% 94.00/93.39  -b1000(bitIndex1).
% 94.00/93.39  -b1000(bitIndex0).
% 94.00/93.39  all VarCurr (v2265(VarCurr)<->v2267(VarCurr)&v875(VarCurr)).
% 94.00/93.39  all VarCurr (v2267(VarCurr)<->v2268(VarCurr)|v2273(VarCurr)).
% 94.00/93.39  all VarCurr (v2273(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.00/93.39  all VarCurr (v2268(VarCurr)<->v2269(VarCurr)|v2272(VarCurr)).
% 94.00/93.39  all VarCurr (v2272(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.00/93.39  all VarCurr (v2269(VarCurr)<->v2270(VarCurr)|v2271(VarCurr)).
% 94.00/93.39  all VarCurr (v2271(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$F)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.00/93.39  all VarCurr (v2270(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$F)& (v743(VarCurr,bitIndex1)<->$F)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.00/93.39  all VarCurr (v1908(VarCurr)<->v1910(VarCurr)).
% 94.00/93.39  all VarCurr (v1910(VarCurr)<->v1912(VarCurr)).
% 94.00/93.39  all VarCurr (v1912(VarCurr)<->v1914(VarCurr)).
% 94.00/93.39  all VarCurr (v1914(VarCurr)<->v1916(VarCurr)).
% 94.00/93.39  all VarCurr (v1916(VarCurr)<->v1918(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1918(VarCurr,bitIndex0)<->v1920(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1920(VarCurr,bitIndex0)<->v1922(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1922(VarCurr,bitIndex0)<->v1924(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1924(VarCurr,bitIndex0)<->v1926(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1926(VarCurr,bitIndex0)<->v1928(VarCurr,bitIndex0)).
% 94.00/93.39  all VarCurr (v1928(VarCurr,bitIndex0)<->v1930(VarCurr)).
% 94.00/93.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2214(VarNext)-> (v1930(VarNext)<->v1930(VarCurr)))).
% 94.00/93.39  all VarNext (v2214(VarNext)-> (v1930(VarNext)<->v2249(VarNext))).
% 94.00/93.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2249(VarNext)<->v2247(VarCurr))).
% 94.00/93.39  all VarCurr (-v1932(VarCurr)-> (v2247(VarCurr)<->v2250(VarCurr))).
% 94.00/93.39  all VarCurr (v1932(VarCurr)-> (v2247(VarCurr)<->v1955(VarCurr))).
% 94.00/93.39  all VarCurr (-v2227(VarCurr)-> (v2250(VarCurr)<->v2203(VarCurr))).
% 94.00/93.39  all VarCurr (v2227(VarCurr)-> (v2250(VarCurr)<->v2251(VarCurr))).
% 94.00/93.39  all VarCurr (-v2230(VarCurr)& -v2232(VarCurr)-> (v2251(VarCurr)<->v2255(VarCurr))).
% 94.00/93.39  all VarCurr (v2232(VarCurr)-> (v2251(VarCurr)<->v2254(VarCurr))).
% 94.00/93.39  all VarCurr (v2230(VarCurr)-> (v2251(VarCurr)<->v2252(VarCurr))).
% 94.00/93.39  all VarCurr (-v2240(VarCurr)-> (v2255(VarCurr)<->v2203(VarCurr))).
% 94.00/93.39  all VarCurr (v2240(VarCurr)-> (v2255(VarCurr)<->$T)).
% 94.00/93.39  all VarCurr (-v2234(VarCurr)-> (v2254(VarCurr)<->v2203(VarCurr))).
% 94.00/93.39  all VarCurr (v2234(VarCurr)-> (v2254(VarCurr)<->$F)).
% 94.00/93.39  all VarCurr (-v2253(VarCurr)-> (v2252(VarCurr)<->$F)).
% 94.00/93.39  all VarCurr (v2253(VarCurr)-> (v2252(VarCurr)<->$T)).
% 94.00/93.39  all VarCurr (v2253(VarCurr)<-> (v1964(VarCurr)<->$T)).
% 94.00/93.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2214(VarNext)<->v2215(VarNext)&v2224(VarNext))).
% 94.00/93.39  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2224(VarNext)<->v2222(VarCurr))).
% 94.00/93.39  all VarCurr (v2222(VarCurr)<->v1932(VarCurr)|v2225(VarCurr)).
% 94.00/93.39  all VarCurr (v2225(VarCurr)<->v2226(VarCurr)&v2246(VarCurr)).
% 94.00/93.40  all VarCurr (-v2246(VarCurr)<->v1932(VarCurr)).
% 94.00/93.40  all VarCurr (v2226(VarCurr)<->v2227(VarCurr)|v2244(VarCurr)).
% 94.00/93.40  all VarCurr (v2244(VarCurr)<->v2039(VarCurr)&v2245(VarCurr)).
% 94.00/93.40  all VarCurr (-v2245(VarCurr)<->v2043(VarCurr)).
% 94.00/93.40  all VarCurr (v2227(VarCurr)<->v2228(VarCurr)&v2043(VarCurr)).
% 94.00/93.40  all VarCurr (v2228(VarCurr)<->v2229(VarCurr)|v2238(VarCurr)).
% 94.00/93.40  all VarCurr (v2238(VarCurr)<->v2239(VarCurr)&v2243(VarCurr)).
% 94.00/93.40  all VarCurr (v2243(VarCurr)<-> (v2231(VarCurr,bitIndex2)<->$F)& (v2231(VarCurr,bitIndex1)<->$F)& (v2231(VarCurr,bitIndex0)<->$T)).
% 94.00/93.40  -b001(bitIndex2).
% 94.00/93.40  -b001(bitIndex1).
% 94.00/93.40  b001(bitIndex0).
% 94.00/93.40  all VarCurr (v2239(VarCurr)<->v2240(VarCurr)|v2241(VarCurr)).
% 94.00/93.40  all VarCurr (v2241(VarCurr)<->v2039(VarCurr)&v2242(VarCurr)).
% 94.00/93.40  all VarCurr (-v2242(VarCurr)<->v2240(VarCurr)).
% 94.00/93.40  all VarCurr (v2240(VarCurr)<-> (v1964(VarCurr)<->$T)).
% 94.00/93.40  all VarCurr (v2229(VarCurr)<->v2230(VarCurr)|v2232(VarCurr)).
% 94.00/93.40  all VarCurr (v2232(VarCurr)<->v2233(VarCurr)&v2237(VarCurr)).
% 94.00/93.40  all VarCurr (v2237(VarCurr)<-> (v2231(VarCurr,bitIndex2)<->$F)& (v2231(VarCurr,bitIndex1)<->$T)& (v2231(VarCurr,bitIndex0)<->$F)).
% 94.00/93.40  -b010(bitIndex2).
% 94.00/93.40  b010(bitIndex1).
% 94.00/93.40  -b010(bitIndex0).
% 94.00/93.40  all VarCurr (v2233(VarCurr)<->v2234(VarCurr)|v2235(VarCurr)).
% 94.00/93.40  all VarCurr (v2235(VarCurr)<->v2039(VarCurr)&v2236(VarCurr)).
% 94.00/93.40  all VarCurr (-v2236(VarCurr)<->v2234(VarCurr)).
% 94.00/93.40  all VarCurr (v2234(VarCurr)<-> (v1964(VarCurr)<->$T)).
% 94.00/93.40  all VarCurr (v2230(VarCurr)<-> (v2231(VarCurr,bitIndex2)<->$T)& (v2231(VarCurr,bitIndex1)<->$F)& (v2231(VarCurr,bitIndex0)<->$F)).
% 94.00/93.40  b100(bitIndex2).
% 94.00/93.40  -b100(bitIndex1).
% 94.00/93.40  -b100(bitIndex0).
% 94.00/93.40  all VarCurr (v2231(VarCurr,bitIndex0)<->v1961(VarCurr)).
% 94.00/93.40  all VarCurr (v2231(VarCurr,bitIndex1)<->v1959(VarCurr)).
% 94.00/93.40  all VarCurr (v2231(VarCurr,bitIndex2)<->v1957(VarCurr)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2215(VarNext)<->v2216(VarNext)&v2205(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2216(VarNext)<->v2218(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2218(VarNext)<->v2205(VarCurr))).
% 94.00/93.40  all VarCurr (v2205(VarCurr)<->v2207(VarCurr)).
% 94.00/93.40  all VarCurr (v2207(VarCurr)<->v2209(VarCurr)).
% 94.00/93.40  all VarCurr (v2209(VarCurr)<->v2211(VarCurr)).
% 94.00/93.40  all VarCurr (v2211(VarCurr)<->v2015(VarCurr)).
% 94.00/93.40  all VarCurr (v2203(VarCurr)<->$F).
% 94.00/93.40  all VarCurr (v2043(VarCurr)<->v2045(VarCurr)).
% 94.00/93.40  all VarCurr (v2045(VarCurr)<->v2047(VarCurr)).
% 94.00/93.40  all VarCurr (v2047(VarCurr)<->v2049(VarCurr)).
% 94.00/93.40  all VarCurr (v2049(VarCurr)<->v2051(VarCurr)&v2151(VarCurr)).
% 94.00/93.40  all VarCurr (v2151(VarCurr)<->v2153(VarCurr)).
% 94.00/93.40  all VarCurr (v2153(VarCurr)<->v2155(VarCurr)).
% 94.00/93.40  all VarCurr (v2155(VarCurr)<->v2157(VarCurr)).
% 94.00/93.40  all VarCurr (v2157(VarCurr)<->v2159(VarCurr)).
% 94.00/93.40  all VarCurr (v2159(VarCurr)<->v2161(VarCurr)).
% 94.00/93.40  all VarCurr (v2161(VarCurr)<->v2163(VarCurr)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2190(VarNext)-> (v2163(VarNext)<->v2163(VarCurr)))).
% 94.00/93.40  all VarNext (v2190(VarNext)-> (v2163(VarNext)<->v2198(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2198(VarNext)<->v2196(VarCurr))).
% 94.00/93.40  all VarCurr (-v2035(VarCurr)-> (v2196(VarCurr)<->v2165(VarCurr))).
% 94.00/93.40  all VarCurr (v2035(VarCurr)-> (v2196(VarCurr)<->$F)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2190(VarNext)<->v2191(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2191(VarNext)<->v2193(VarNext)&v2013(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2193(VarNext)<->v2028(VarNext))).
% 94.00/93.40  v2163(constB0)<->$F.
% 94.00/93.40  all VarCurr (v2165(VarCurr)<->v2167(VarCurr)).
% 94.00/93.40  all VarCurr (v2167(VarCurr)<->v2169(VarCurr)).
% 94.00/93.40  all VarCurr (v2169(VarCurr)<->v2171(VarCurr)).
% 94.00/93.40  all VarCurr (v2171(VarCurr)<->v2173(VarCurr)).
% 94.00/93.40  all VarCurr (v2173(VarCurr)<->v2175(VarCurr)).
% 94.00/93.40  all VarCurr (v2175(VarCurr)<->v2177(VarCurr)).
% 94.00/93.40  all VarCurr (v2177(VarCurr)<->v2179(VarCurr)).
% 94.00/93.40  all VarCurr (v2179(VarCurr)<->v2181(VarCurr)).
% 94.00/93.40  all VarCurr (v2181(VarCurr)<->v2183(VarCurr)).
% 94.00/93.40  all VarCurr (v2183(VarCurr)<->v2185(VarCurr)).
% 94.00/93.40  all VarCurr (v2185(VarCurr)<->v2187(VarCurr)).
% 94.00/93.40  v2187(constB0)<->$F.
% 94.00/93.40  all VarCurr (v2051(VarCurr)<->v2053(VarCurr)).
% 94.00/93.40  all VarCurr (v2053(VarCurr)<->v2055(VarCurr)).
% 94.00/93.40  all VarCurr (v2055(VarCurr)<->v2057(VarCurr)).
% 94.00/93.40  all VarCurr (v2057(VarCurr)<->v2059(VarCurr)).
% 94.00/93.40  all VarCurr (v2059(VarCurr)<->v2061(VarCurr)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2136(VarNext)-> (v2061(VarNext)<->v2061(VarCurr)))).
% 94.00/93.40  all VarNext (v2136(VarNext)-> (v2061(VarNext)<->v2144(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2144(VarNext)<->v2142(VarCurr))).
% 94.00/93.40  all VarCurr (-v2145(VarCurr)-> (v2142(VarCurr)<->v2146(VarCurr))).
% 94.00/93.40  all VarCurr (v2145(VarCurr)-> (v2142(VarCurr)<->$F)).
% 94.00/93.40  all VarCurr (v2146(VarCurr)<->v2147(VarCurr)&v2065(VarCurr)).
% 94.00/93.40  all VarCurr (v2147(VarCurr)<->v2063(VarCurr)).
% 94.00/93.40  v2063(constB0)<->$F.
% 94.00/93.40  all VarCurr (-v2145(VarCurr)<->v1984(VarCurr)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2136(VarNext)<->v2137(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2137(VarNext)<->v2138(VarNext)&v2013(VarNext))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2138(VarNext)<->v2028(VarNext))).
% 94.00/93.40  v2061(constB0)<->$F.
% 94.00/93.40  all VarCurr (-v2128(VarCurr)-> (v2065(VarCurr)<->v2129(VarCurr))).
% 94.00/93.40  all VarCurr (v2128(VarCurr)-> (v2065(VarCurr)<->$F)).
% 94.00/93.40  all VarCurr (-v2130(VarCurr)& -v2132(VarCurr)-> (v2129(VarCurr)<->$F)).
% 94.00/93.40  all VarCurr (v2132(VarCurr)-> (v2129(VarCurr)<->v2133(VarCurr))).
% 94.00/93.40  all VarCurr (v2130(VarCurr)-> (v2129(VarCurr)<->v2131(VarCurr))).
% 94.00/93.40  all VarCurr (v2133(VarCurr)<-> (v2101(VarCurr,bitIndex26)<->$F)& (v2101(VarCurr,bitIndex25)<->$F)& (v2101(VarCurr,bitIndex24)<->$F)& (v2101(VarCurr,bitIndex23)<->$F)& (v2101(VarCurr,bitIndex22)<->$F)& (v2101(VarCurr,bitIndex21)<->$F)& (v2101(VarCurr,bitIndex20)<->$F)& (v2101(VarCurr,bitIndex19)<->$T)& (v2101(VarCurr,bitIndex18)<->$T)& (v2101(VarCurr,bitIndex17)<->$T)& (v2101(VarCurr,bitIndex16)<->$F)& (v2101(VarCurr,bitIndex15)<->$F)& (v2101(VarCurr,bitIndex14)<->$T)& (v2101(VarCurr,bitIndex13)<->$T)& (v2101(VarCurr,bitIndex12)<->$F)& (v2101(VarCurr,bitIndex11)<->$T)& (v2101(VarCurr,bitIndex10)<->$F)& (v2101(VarCurr,bitIndex9)<->$F)& (v2101(VarCurr,bitIndex8)<->$F)& (v2101(VarCurr,bitIndex7)<->$F)& (v2101(VarCurr,bitIndex6)<->$F)& (v2101(VarCurr,bitIndex5)<->$F)& (v2101(VarCurr,bitIndex4)<->$F)& (v2101(VarCurr,bitIndex3)<->$F)& (v2101(VarCurr,bitIndex2)<->$F)& (v2101(VarCurr,bitIndex1)<->$T)& (v2101(VarCurr,bitIndex0)<->$T)).
% 94.00/93.40  -b000000011100110100000000011(bitIndex26).
% 94.00/93.40  -b000000011100110100000000011(bitIndex25).
% 94.00/93.40  -b000000011100110100000000011(bitIndex24).
% 94.00/93.40  -b000000011100110100000000011(bitIndex23).
% 94.00/93.40  -b000000011100110100000000011(bitIndex22).
% 94.00/93.40  -b000000011100110100000000011(bitIndex21).
% 94.00/93.40  -b000000011100110100000000011(bitIndex20).
% 94.00/93.40  b000000011100110100000000011(bitIndex19).
% 94.00/93.40  b000000011100110100000000011(bitIndex18).
% 94.00/93.40  b000000011100110100000000011(bitIndex17).
% 94.00/93.40  -b000000011100110100000000011(bitIndex16).
% 94.00/93.40  -b000000011100110100000000011(bitIndex15).
% 94.00/93.40  b000000011100110100000000011(bitIndex14).
% 94.00/93.40  b000000011100110100000000011(bitIndex13).
% 94.00/93.40  -b000000011100110100000000011(bitIndex12).
% 94.00/93.40  b000000011100110100000000011(bitIndex11).
% 94.00/93.40  -b000000011100110100000000011(bitIndex10).
% 94.00/93.40  -b000000011100110100000000011(bitIndex9).
% 94.00/93.40  -b000000011100110100000000011(bitIndex8).
% 94.00/93.40  -b000000011100110100000000011(bitIndex7).
% 94.00/93.40  -b000000011100110100000000011(bitIndex6).
% 94.00/93.40  -b000000011100110100000000011(bitIndex5).
% 94.00/93.40  -b000000011100110100000000011(bitIndex4).
% 94.00/93.40  -b000000011100110100000000011(bitIndex3).
% 94.00/93.40  -b000000011100110100000000011(bitIndex2).
% 94.00/93.40  b000000011100110100000000011(bitIndex1).
% 94.00/93.40  b000000011100110100000000011(bitIndex0).
% 94.00/93.40  all VarCurr (v2132(VarCurr)<-> (v2091(VarCurr)<->$T)).
% 94.00/93.40  all VarCurr (v2131(VarCurr)<-> (v2101(VarCurr,bitIndex26)<->$F)& (v2101(VarCurr,bitIndex25)<->$F)& (v2101(VarCurr,bitIndex24)<->$F)& (v2101(VarCurr,bitIndex23)<->$F)& (v2101(VarCurr,bitIndex22)<->$F)& (v2101(VarCurr,bitIndex21)<->$F)& (v2101(VarCurr,bitIndex20)<->$F)& (v2101(VarCurr,bitIndex19)<->$T)& (v2101(VarCurr,bitIndex18)<->$T)& (v2101(VarCurr,bitIndex17)<->$F)& (v2101(VarCurr,bitIndex16)<->$F)& (v2101(VarCurr,bitIndex15)<->$F)& (v2101(VarCurr,bitIndex14)<->$T)& (v2101(VarCurr,bitIndex13)<->$T)& (v2101(VarCurr,bitIndex12)<->$F)& (v2101(VarCurr,bitIndex11)<->$T)& (v2101(VarCurr,bitIndex10)<->$F)& (v2101(VarCurr,bitIndex9)<->$F)& (v2101(VarCurr,bitIndex8)<->$F)& (v2101(VarCurr,bitIndex7)<->$F)& (v2101(VarCurr,bitIndex6)<->$F)& (v2101(VarCurr,bitIndex5)<->$F)& (v2101(VarCurr,bitIndex4)<->$F)& (v2101(VarCurr,bitIndex3)<->$F)& (v2101(VarCurr,bitIndex2)<->$F)& (v2101(VarCurr,bitIndex1)<->$T)& (v2101(VarCurr,bitIndex0)<->$T)).
% 94.00/93.40  -b000000011000110100000000011(bitIndex26).
% 94.00/93.40  -b000000011000110100000000011(bitIndex25).
% 94.00/93.40  -b000000011000110100000000011(bitIndex24).
% 94.00/93.40  -b000000011000110100000000011(bitIndex23).
% 94.00/93.40  -b000000011000110100000000011(bitIndex22).
% 94.00/93.40  -b000000011000110100000000011(bitIndex21).
% 94.00/93.40  -b000000011000110100000000011(bitIndex20).
% 94.00/93.40  b000000011000110100000000011(bitIndex19).
% 94.00/93.40  b000000011000110100000000011(bitIndex18).
% 94.00/93.40  -b000000011000110100000000011(bitIndex17).
% 94.00/93.40  -b000000011000110100000000011(bitIndex16).
% 94.00/93.40  -b000000011000110100000000011(bitIndex15).
% 94.00/93.40  b000000011000110100000000011(bitIndex14).
% 94.00/93.40  b000000011000110100000000011(bitIndex13).
% 94.00/93.40  -b000000011000110100000000011(bitIndex12).
% 94.00/93.40  b000000011000110100000000011(bitIndex11).
% 94.00/93.40  -b000000011000110100000000011(bitIndex10).
% 94.00/93.40  -b000000011000110100000000011(bitIndex9).
% 94.00/93.40  -b000000011000110100000000011(bitIndex8).
% 94.00/93.40  -b000000011000110100000000011(bitIndex7).
% 94.00/93.40  -b000000011000110100000000011(bitIndex6).
% 94.00/93.40  -b000000011000110100000000011(bitIndex5).
% 94.00/93.40  -b000000011000110100000000011(bitIndex4).
% 94.00/93.40  -b000000011000110100000000011(bitIndex3).
% 94.00/93.40  -b000000011000110100000000011(bitIndex2).
% 94.00/93.40  b000000011000110100000000011(bitIndex1).
% 94.00/93.40  b000000011000110100000000011(bitIndex0).
% 94.00/93.40  all VarCurr (v2130(VarCurr)<-> (v2091(VarCurr)<->$F)).
% 94.00/93.40  all VarCurr (-v2128(VarCurr)<->v2067(VarCurr)).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2101(VarCurr,B)<->v2103(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2103(VarCurr,B)<->v2105(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2105(VarCurr,B)<->v2107(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2107(VarCurr,B)<->v2109(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2109(VarCurr,B)<->v2111(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2111(VarCurr,B)<->v2113(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2113(VarCurr,B)<->v2115(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2115(VarCurr,B)<->v2117(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2117(VarCurr,B)<->v2119(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2119(VarCurr,B)<->v2121(VarCurr,B))).
% 94.00/93.40  all VarCurr B (range_26_0(B)-> (v2121(VarCurr,B)<->v2123(VarCurr,B))).
% 94.00/93.40  all B (range_26_0(B)-> (v2123(constB0,B)<->$F)).
% 94.00/93.40  all B (range_26_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B).
% 94.00/93.40  -b000000000000000000000000000(bitIndex26).
% 94.00/93.40  -b000000000000000000000000000(bitIndex25).
% 94.00/93.40  -b000000000000000000000000000(bitIndex24).
% 94.00/93.40  -b000000000000000000000000000(bitIndex23).
% 94.00/93.40  -b000000000000000000000000000(bitIndex22).
% 94.00/93.40  -b000000000000000000000000000(bitIndex21).
% 94.00/93.40  -b000000000000000000000000000(bitIndex20).
% 94.00/93.40  -b000000000000000000000000000(bitIndex19).
% 94.00/93.40  -b000000000000000000000000000(bitIndex18).
% 94.00/93.40  -b000000000000000000000000000(bitIndex17).
% 94.00/93.40  -b000000000000000000000000000(bitIndex16).
% 94.00/93.40  -b000000000000000000000000000(bitIndex15).
% 94.00/93.40  -b000000000000000000000000000(bitIndex14).
% 94.00/93.40  -b000000000000000000000000000(bitIndex13).
% 94.00/93.40  -b000000000000000000000000000(bitIndex12).
% 94.00/93.40  -b000000000000000000000000000(bitIndex11).
% 94.00/93.40  -b000000000000000000000000000(bitIndex10).
% 94.00/93.40  -b000000000000000000000000000(bitIndex9).
% 94.00/93.40  -b000000000000000000000000000(bitIndex8).
% 94.00/93.40  -b000000000000000000000000000(bitIndex7).
% 94.00/93.40  -b000000000000000000000000000(bitIndex6).
% 94.00/93.40  -b000000000000000000000000000(bitIndex5).
% 94.00/93.40  -b000000000000000000000000000(bitIndex4).
% 94.00/93.40  -b000000000000000000000000000(bitIndex3).
% 94.00/93.40  -b000000000000000000000000000(bitIndex2).
% 94.00/93.40  -b000000000000000000000000000(bitIndex1).
% 94.00/93.40  -b000000000000000000000000000(bitIndex0).
% 94.00/93.40  all VarCurr (v2091(VarCurr)<->v2093(VarCurr)).
% 94.00/93.40  all VarCurr (v2093(VarCurr)<->v2095(VarCurr)).
% 94.00/93.40  all VarCurr (v2095(VarCurr)<->v2097(VarCurr)).
% 94.00/93.40  all VarCurr (v2097(VarCurr)<->v2099(VarCurr)).
% 94.00/93.40  all VarCurr (v2067(VarCurr)<->v2069(VarCurr)).
% 94.00/93.40  all VarCurr (v2069(VarCurr)<->v2071(VarCurr)).
% 94.00/93.40  all VarCurr (v2071(VarCurr)<->v2073(VarCurr)).
% 94.00/93.40  all VarCurr (v2073(VarCurr)<->v2075(VarCurr)).
% 94.00/93.40  all VarCurr (v2075(VarCurr)<->v2077(VarCurr)).
% 94.00/93.40  all VarCurr (v2077(VarCurr)<->v2079(VarCurr)).
% 94.00/93.40  all VarCurr (v2079(VarCurr)<->v2081(VarCurr)).
% 94.00/93.40  all VarCurr (v2081(VarCurr)<->v2083(VarCurr)).
% 94.00/93.40  all VarCurr (v2083(VarCurr)<->v2085(VarCurr)).
% 94.00/93.40  all VarCurr (v2085(VarCurr)<->v2087(VarCurr)).
% 94.00/93.40  all VarCurr (v2087(VarCurr)<->v2089(VarCurr)).
% 94.00/93.40  v2089(constB0)<->$T.
% 94.00/93.40  all VarCurr (v2039(VarCurr)<->$F).
% 94.00/93.40  all VarCurr (v1964(VarCurr)<->v1966(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1966(VarCurr,bitIndex0)<->v1968(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1968(VarCurr,bitIndex0)<->v1970(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1970(VarCurr,bitIndex0)<->v1972(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1972(VarCurr,bitIndex0)<->v1974(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1974(VarCurr,bitIndex0)<->v1976(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1976(VarCurr,bitIndex0)<->v1978(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1978(VarCurr,bitIndex0)<->v1980(VarCurr,bitIndex0)).
% 94.00/93.40  all VarCurr (v1980(VarCurr,bitIndex0)<->v1982(VarCurr,bitIndex0)).
% 94.00/93.40  all VarNext (v1982(VarNext,bitIndex0)<->v2023(VarNext,bitIndex0)).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2024(VarNext)-> (all B (range_63_0(B)-> (v2023(VarNext,B)<->v1982(VarCurr,B)))))).
% 94.00/93.40  all VarNext (v2024(VarNext)-> (all B (range_63_0(B)-> (v2023(VarNext,B)<->v2034(VarNext,B))))).
% 94.00/93.40  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_63_0(B)-> (v2034(VarNext,B)<->v2032(VarCurr,B))))).
% 94.00/93.40  all VarCurr (-v2035(VarCurr)-> (all B (range_63_0(B)-> (v2032(VarCurr,B)<->v1987(VarCurr,B))))).
% 94.00/93.40  all VarCurr (v2035(VarCurr)-> (all B (range_63_0(B)-> (v2032(VarCurr,B)<->$F)))).
% 94.00/93.40  all B (range_63_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 94.00/93.40  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 94.00/93.41  -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 94.00/93.41  all VarCurr (-v2035(VarCurr)<->v1984(VarCurr)).
% 94.00/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2024(VarNext)<->v2025(VarNext))).
% 94.00/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2025(VarNext)<->v2026(VarNext)&v2013(VarNext))).
% 94.00/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v2026(VarNext)<->v2028(VarNext))).
% 94.00/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v2028(VarNext)<->v2013(VarCurr))).
% 94.00/93.41  -v1982(constB0,bitIndex1).
% 94.00/93.41  -v1982(constB0,bitIndex0).
% 94.00/93.41  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex1).
% 94.00/93.41  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex0).
% 94.03/93.41  all VarCurr (v2013(VarCurr)<->v2015(VarCurr)).
% 94.03/93.41  all VarCurr (v2015(VarCurr)<->v2017(VarCurr)).
% 94.03/93.41  all VarCurr (v2017(VarCurr)<->v2019(VarCurr)).
% 94.03/93.41  all VarCurr (v2019(VarCurr)<->v1(VarCurr)).
% 94.03/93.41  all VarCurr (v1987(VarCurr,bitIndex0)<->v1989(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1989(VarCurr,bitIndex0)<->v1991(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1991(VarCurr,bitIndex0)<->v1993(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1993(VarCurr,bitIndex0)<->v1995(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1995(VarCurr,bitIndex0)<->v1997(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1997(VarCurr,bitIndex0)<->v1999(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1999(VarCurr,bitIndex0)<->v2001(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v2001(VarCurr,bitIndex0)<->v2003(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v2003(VarCurr,bitIndex0)<->v2005(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v2005(VarCurr,bitIndex0)<->v2007(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v2007(VarCurr,bitIndex0)<->v2009(VarCurr,bitIndex0)).
% 94.03/93.41  -v2009(constB0,bitIndex1).
% 94.03/93.41  -v2009(constB0,bitIndex0).
% 94.03/93.41  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex1).
% 94.03/93.41  -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex0).
% 94.03/93.41  all VarCurr (v1984(VarCurr)<->v1948(VarCurr)).
% 94.03/93.41  all VarCurr (v1961(VarCurr)<->$F).
% 94.03/93.41  all VarCurr (v1959(VarCurr)<->$F).
% 94.03/93.41  all VarCurr (v1957(VarCurr)<->$T).
% 94.03/93.41  all VarCurr (v1955(VarCurr)<->$F).
% 94.03/93.41  all VarCurr (v1932(VarCurr)<->v1934(VarCurr)).
% 94.03/93.41  all VarCurr (-v1934(VarCurr)<->v1936(VarCurr)).
% 94.03/93.41  all VarCurr (v1936(VarCurr)<->v1938(VarCurr)).
% 94.03/93.41  all VarCurr (v1938(VarCurr)<->v1940(VarCurr)).
% 94.03/93.41  all VarCurr (v1940(VarCurr)<->v1942(VarCurr)).
% 94.03/93.41  all VarCurr (v1942(VarCurr)<->v1944(VarCurr)).
% 94.03/93.41  all VarCurr (v1944(VarCurr)<->v1946(VarCurr)).
% 94.03/93.41  all VarCurr (v1946(VarCurr)<->v1948(VarCurr)).
% 94.03/93.41  all VarCurr (v1948(VarCurr)<->v1950(VarCurr)).
% 94.03/93.41  all VarCurr (v1950(VarCurr)<->v1952(VarCurr)).
% 94.03/93.41  all VarCurr (v1952(VarCurr)<->v16(VarCurr)).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1887(VarNext)-> (v318(VarNext)<->v318(VarCurr)))).
% 94.03/93.41  all VarNext (v1887(VarNext)-> (v318(VarNext)<->v1903(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1903(VarNext)<->v1901(VarCurr))).
% 94.03/93.41  all VarCurr (-v1900(VarCurr)-> (v1901(VarCurr)<->v1904(VarCurr))).
% 94.03/93.41  all VarCurr (v1900(VarCurr)-> (v1901(VarCurr)<->$F)).
% 94.03/93.41  all VarCurr (-v320(VarCurr)-> (v1904(VarCurr)<->$T)).
% 94.03/93.41  all VarCurr (v320(VarCurr)-> (v1904(VarCurr)<->$F)).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1887(VarNext)<->v1888(VarNext)&v1897(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1897(VarNext)<->v1895(VarCurr))).
% 94.03/93.41  all VarCurr (v1895(VarCurr)<->v1898(VarCurr)|v1900(VarCurr)).
% 94.03/93.41  all VarCurr (-v1900(VarCurr)<->v12(VarCurr)).
% 94.03/93.41  all VarCurr (v1898(VarCurr)<->v1899(VarCurr)|v320(VarCurr)).
% 94.03/93.41  all VarCurr (v1899(VarCurr)<->v664(VarCurr)&v741(VarCurr)).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1888(VarNext)<->v1889(VarNext)&v288(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1889(VarNext)<->v1891(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1891(VarNext)<->v288(VarCurr))).
% 94.03/93.41  v318(constB0)<->$F.
% 94.03/93.41  all VarCurr (v741(VarCurr)<->v1882(VarCurr)&v875(VarCurr)).
% 94.03/93.41  all VarCurr (v1882(VarCurr)<->v1883(VarCurr)|v1884(VarCurr)).
% 94.03/93.41  all VarCurr (v1884(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$T)& (v743(VarCurr,bitIndex0)<->$T)).
% 94.03/93.41  -b0111(bitIndex3).
% 94.03/93.41  b0111(bitIndex2).
% 94.03/93.41  b0111(bitIndex1).
% 94.03/93.41  b0111(bitIndex0).
% 94.03/93.41  all VarCurr (v1883(VarCurr)<-> (v743(VarCurr,bitIndex3)<->$F)& (v743(VarCurr,bitIndex2)<->$T)& (v743(VarCurr,bitIndex1)<->$T)& (v743(VarCurr,bitIndex0)<->$F)).
% 94.03/93.41  all VarCurr ((v743(VarCurr,bitIndex3)<->v745(VarCurr,bitIndex66))& (v743(VarCurr,bitIndex2)<->v745(VarCurr,bitIndex65))& (v743(VarCurr,bitIndex1)<->v745(VarCurr,bitIndex64))& (v743(VarCurr,bitIndex0)<->v745(VarCurr,bitIndex63))).
% 94.03/93.41  all VarCurr B (range_66_63(B)-> (v745(VarCurr,B)<->v747(VarCurr,B))).
% 94.03/93.41  all VarCurr B (range_66_63(B)-> (v747(VarCurr,B)<->v867(VarCurr,B))).
% 94.03/93.41  all B (range_66_63(B)<->bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1869(VarNext)-> (all B (range_3_0(B)-> (v869(VarNext,B)<->v869(VarCurr,B)))))).
% 94.03/93.41  all VarNext (v1869(VarNext)-> (all B (range_3_0(B)-> (v869(VarNext,B)<->v1877(VarNext,B))))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1877(VarNext,B)<->v1875(VarCurr,B))))).
% 94.03/93.41  all VarCurr (-v830(VarCurr)-> (all B (range_3_0(B)-> (v1875(VarCurr,B)<->v871(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v830(VarCurr)-> (all B (range_3_0(B)-> (v1875(VarCurr,B)<->$F)))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1869(VarNext)<->v1870(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1870(VarNext)<->v1872(VarNext)&v751(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1872(VarNext)<->v823(VarNext))).
% 94.03/93.41  all VarCurr (-v873(VarCurr)-> (all B (range_3_0(B)-> (v871(VarCurr,B)<->v869(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v873(VarCurr)-> (all B (range_3_0(B)-> (v871(VarCurr,B)<->v1846(VarCurr,B))))).
% 94.03/93.41  all VarCurr (-v1847(VarCurr)-> (all B (range_3_0(B)-> (v1846(VarCurr,B)<->v1848(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v1847(VarCurr)-> (all B (range_3_0(B)-> (v1846(VarCurr,B)<->$F)))).
% 94.03/93.41  all VarCurr (v1848(VarCurr,bitIndex0)<->v1864(VarCurr)).
% 94.03/93.41  all VarCurr (v1848(VarCurr,bitIndex1)<->v1862(VarCurr)).
% 94.03/93.41  all VarCurr (v1848(VarCurr,bitIndex2)<->v1857(VarCurr)).
% 94.03/93.41  all VarCurr (v1848(VarCurr,bitIndex3)<->v1850(VarCurr)).
% 94.03/93.41  all VarCurr (v1862(VarCurr)<->v1863(VarCurr)&v1866(VarCurr)).
% 94.03/93.41  all VarCurr (v1866(VarCurr)<->v869(VarCurr,bitIndex0)|v869(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (v1863(VarCurr)<->v1864(VarCurr)|v1865(VarCurr)).
% 94.03/93.41  all VarCurr (-v1865(VarCurr)<->v869(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (-v1864(VarCurr)<->v869(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1857(VarCurr)<->v1858(VarCurr)&v1861(VarCurr)).
% 94.03/93.41  all VarCurr (v1861(VarCurr)<->v1854(VarCurr)|v869(VarCurr,bitIndex2)).
% 94.03/93.41  all VarCurr (v1858(VarCurr)<->v1859(VarCurr)|v1860(VarCurr)).
% 94.03/93.41  all VarCurr (-v1860(VarCurr)<->v869(VarCurr,bitIndex2)).
% 94.03/93.41  all VarCurr (-v1859(VarCurr)<->v1854(VarCurr)).
% 94.03/93.41  all VarCurr (v1850(VarCurr)<->v1851(VarCurr)&v1856(VarCurr)).
% 94.03/93.41  all VarCurr (v1856(VarCurr)<->v1853(VarCurr)|v869(VarCurr,bitIndex3)).
% 94.03/93.41  all VarCurr (v1851(VarCurr)<->v1852(VarCurr)|v1855(VarCurr)).
% 94.03/93.41  all VarCurr (-v1855(VarCurr)<->v869(VarCurr,bitIndex3)).
% 94.03/93.41  all VarCurr (-v1852(VarCurr)<->v1853(VarCurr)).
% 94.03/93.41  all VarCurr (v1853(VarCurr)<->v1854(VarCurr)&v869(VarCurr,bitIndex2)).
% 94.03/93.41  all VarCurr (v1854(VarCurr)<->v869(VarCurr,bitIndex0)&v869(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (v1847(VarCurr)<-> (v869(VarCurr,bitIndex3)<->$T)& (v869(VarCurr,bitIndex2)<->$T)& (v869(VarCurr,bitIndex1)<->$T)& (v869(VarCurr,bitIndex0)<->$T)).
% 94.03/93.41  all VarCurr (v873(VarCurr)<->v875(VarCurr)).
% 94.03/93.41  all VarCurr (v875(VarCurr)<->v877(VarCurr)).
% 94.03/93.41  all VarCurr (v877(VarCurr)<->v879(VarCurr)|v1843(VarCurr)).
% 94.03/93.41  all VarCurr (v1843(VarCurr)<->v31(VarCurr,bitIndex4)).
% 94.03/93.41  all VarCurr (v879(VarCurr)<->v36(VarCurr,bitIndex6)).
% 94.03/93.41  all VarCurr (-v1831(VarCurr)-> (v36(VarCurr,bitIndex6)<->$F)).
% 94.03/93.41  all VarCurr (v1831(VarCurr)-> (v36(VarCurr,bitIndex6)<->$T)).
% 94.03/93.41  all VarCurr (v1831(VarCurr)<->v1832(VarCurr)|v1840(VarCurr)).
% 94.03/93.41  all VarCurr (v1840(VarCurr)<->v1841(VarCurr)&v1821(VarCurr)).
% 94.03/93.41  all VarCurr (-v1841(VarCurr)<->v38(VarCurr)).
% 94.03/93.41  all VarCurr (v1832(VarCurr)<->v1833(VarCurr)|v1838(VarCurr)).
% 94.03/93.41  all VarCurr (v1838(VarCurr)<->v1839(VarCurr)&v1360(VarCurr)).
% 94.03/93.41  all VarCurr (v1839(VarCurr)<->v1342(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (v1833(VarCurr)<->v1834(VarCurr)|v1836(VarCurr)).
% 94.03/93.41  all VarCurr (v1836(VarCurr)<->v1837(VarCurr)&v1355(VarCurr)).
% 94.03/93.41  all VarCurr (v1837(VarCurr)<->v1342(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (v1834(VarCurr)<->v1835(VarCurr)&v1348(VarCurr)).
% 94.03/93.41  all VarCurr (v1835(VarCurr)<->v1342(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarNext (v31(VarNext,bitIndex11)<->v1823(VarNext,bitIndex10)).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1824(VarNext)-> (v1823(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1823(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1823(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1823(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1823(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1823(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1823(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1823(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1823(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1823(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1823(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.03/93.41  all VarNext (v1824(VarNext)-> (all B (range_10_0(B)-> (v1823(VarNext,B)<->v1253(VarNext,B))))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1824(VarNext)<->v1825(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1825(VarNext)<->v1827(VarNext)&v1240(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1827(VarNext)<->v1247(VarNext))).
% 94.03/93.41  all VarCurr (-v1805(VarCurr)-> (v36(VarCurr,bitIndex11)<->$F)).
% 94.03/93.41  all VarCurr (v1805(VarCurr)-> (v36(VarCurr,bitIndex11)<->$T)).
% 94.03/93.41  all VarCurr (v1805(VarCurr)<->v1806(VarCurr)|v1820(VarCurr)).
% 94.03/93.41  all VarCurr (v1820(VarCurr)<->v38(VarCurr)&v1821(VarCurr)).
% 94.03/93.41  all VarCurr (v1821(VarCurr)<-> ($T<->v31(VarCurr,bitIndex11))).
% 94.03/93.41  all VarCurr (v1806(VarCurr)<->v1807(VarCurr)|v1817(VarCurr)).
% 94.03/93.41  all VarCurr (v1817(VarCurr)<->v1818(VarCurr)&v1323(VarCurr)).
% 94.03/93.41  all VarCurr (v1818(VarCurr)<->v1342(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (v1807(VarCurr)<->v1808(VarCurr)|v1815(VarCurr)).
% 94.03/93.41  all VarCurr (v1815(VarCurr)<->v1816(VarCurr)&v1300(VarCurr)).
% 94.03/93.41  all VarCurr (v1816(VarCurr)<->v1352(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (v1808(VarCurr)<->v1809(VarCurr)|v1813(VarCurr)).
% 94.03/93.41  all VarCurr (v1813(VarCurr)<->v1814(VarCurr)&v1278(VarCurr)).
% 94.03/93.41  all VarCurr (v1814(VarCurr)<->v1352(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (v1809(VarCurr)<->v1810(VarCurr)&v1238(VarCurr)).
% 94.03/93.41  all VarCurr (v1810(VarCurr)<->v1352(VarCurr)&v1812(VarCurr)).
% 94.03/93.41  all VarCurr (-v1812(VarCurr)<->v1168(VarCurr)).
% 94.03/93.41  all VarCurr (v907(VarCurr)<->v909(VarCurr)&v1150(VarCurr)).
% 94.03/93.41  all VarCurr (v909(VarCurr)<->v911(VarCurr)).
% 94.03/93.41  all VarCurr (v911(VarCurr)<->v913(VarCurr)).
% 94.03/93.41  all VarCurr (v913(VarCurr)<->v1799(VarCurr)&v1800(VarCurr)).
% 94.03/93.41  all VarCurr (-v1800(VarCurr)<->v1138(VarCurr)).
% 94.03/93.41  all VarCurr (-v1799(VarCurr)<->v915(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (v915(VarCurr,bitIndex1)<->v917(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (v917(VarCurr,bitIndex1)<->v919(VarCurr,bitIndex17)).
% 94.03/93.41  all VarCurr (v919(VarCurr,bitIndex17)<->v921(VarCurr,bitIndex17)).
% 94.03/93.41  all VarCurr (v921(VarCurr,bitIndex17)<->v1017(VarCurr,bitIndex17)).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1787(VarNext)-> (all B (range_3_0(B)-> (v1019(VarNext,B)<->v1019(VarCurr,B)))))).
% 94.03/93.41  all VarNext (v1787(VarNext)-> (all B (range_3_0(B)-> (v1019(VarNext,B)<->v1795(VarNext,B))))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1795(VarNext,B)<->v1793(VarCurr,B))))).
% 94.03/93.41  all VarCurr (-v991(VarCurr)-> (all B (range_3_0(B)-> (v1793(VarCurr,B)<->v1021(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v991(VarCurr)-> (all B (range_3_0(B)-> (v1793(VarCurr,B)<->$F)))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1787(VarNext)<->v1788(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1788(VarNext)<->v1790(VarNext)&v925(VarNext))).
% 94.03/93.41  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1790(VarNext)<->v984(VarNext))).
% 94.03/93.41  all VarCurr (-v1023(VarCurr)-> (all B (range_3_0(B)-> (v1021(VarCurr,B)<->v1019(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v1023(VarCurr)-> (all B (range_3_0(B)-> (v1021(VarCurr,B)<->v1764(VarCurr,B))))).
% 94.03/93.41  all VarCurr (-v1765(VarCurr)-> (all B (range_3_0(B)-> (v1764(VarCurr,B)<->v1766(VarCurr,B))))).
% 94.03/93.41  all VarCurr (v1765(VarCurr)-> (all B (range_3_0(B)-> (v1764(VarCurr,B)<->$F)))).
% 94.03/93.41  all VarCurr (v1766(VarCurr,bitIndex0)<->v1782(VarCurr)).
% 94.03/93.41  all VarCurr (v1766(VarCurr,bitIndex1)<->v1780(VarCurr)).
% 94.03/93.41  all VarCurr (v1766(VarCurr,bitIndex2)<->v1775(VarCurr)).
% 94.03/93.41  all VarCurr (v1766(VarCurr,bitIndex3)<->v1768(VarCurr)).
% 94.03/93.41  all VarCurr (v1780(VarCurr)<->v1781(VarCurr)&v1784(VarCurr)).
% 94.03/93.41  all VarCurr (v1784(VarCurr)<->v1019(VarCurr,bitIndex0)|v1019(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (v1781(VarCurr)<->v1782(VarCurr)|v1783(VarCurr)).
% 94.03/93.41  all VarCurr (-v1783(VarCurr)<->v1019(VarCurr,bitIndex1)).
% 94.03/93.41  all VarCurr (-v1782(VarCurr)<->v1019(VarCurr,bitIndex0)).
% 94.03/93.41  all VarCurr (v1775(VarCurr)<->v1776(VarCurr)&v1779(VarCurr)).
% 94.03/93.41  all VarCurr (v1779(VarCurr)<->v1772(VarCurr)|v1019(VarCurr,bitIndex2)).
% 94.03/93.41  all VarCurr (v1776(VarCurr)<->v1777(VarCurr)|v1778(VarCurr)).
% 94.03/93.41  all VarCurr (-v1778(VarCurr)<->v1019(VarCurr,bitIndex2)).
% 94.03/93.42  all VarCurr (-v1777(VarCurr)<->v1772(VarCurr)).
% 94.03/93.42  all VarCurr (v1768(VarCurr)<->v1769(VarCurr)&v1774(VarCurr)).
% 94.03/93.42  all VarCurr (v1774(VarCurr)<->v1771(VarCurr)|v1019(VarCurr,bitIndex3)).
% 94.03/93.42  all VarCurr (v1769(VarCurr)<->v1770(VarCurr)|v1773(VarCurr)).
% 94.03/93.42  all VarCurr (-v1773(VarCurr)<->v1019(VarCurr,bitIndex3)).
% 94.03/93.42  all VarCurr (-v1770(VarCurr)<->v1771(VarCurr)).
% 94.03/93.42  all VarCurr (v1771(VarCurr)<->v1772(VarCurr)&v1019(VarCurr,bitIndex2)).
% 94.03/93.42  all VarCurr (v1772(VarCurr)<->v1019(VarCurr,bitIndex0)&v1019(VarCurr,bitIndex1)).
% 94.03/93.42  all VarCurr (v1765(VarCurr)<-> (v1019(VarCurr,bitIndex3)<->$T)& (v1019(VarCurr,bitIndex2)<->$T)& (v1019(VarCurr,bitIndex1)<->$T)& (v1019(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1023(VarCurr)<->v1025(VarCurr)).
% 94.03/93.42  all VarCurr (v1025(VarCurr)<->v1027(VarCurr)).
% 94.03/93.42  all VarCurr (v1027(VarCurr)<->v1761(VarCurr)|v1160(VarCurr)).
% 94.03/93.42  all VarCurr (v1761(VarCurr)<->v1762(VarCurr)|v85(VarCurr)).
% 94.03/93.42  all VarCurr (v1762(VarCurr)<->v1029(VarCurr)|v1148(VarCurr)).
% 94.03/93.42  all VarCurr (v1160(VarCurr)<->v31(VarCurr,bitIndex1)).
% 94.03/93.42  all VarNext (v31(VarNext,bitIndex1)<->v1753(VarNext,bitIndex0)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1754(VarNext)-> (v1753(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1753(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1753(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1753(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1753(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1753(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1753(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1753(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1753(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1753(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1753(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.03/93.42  all VarNext (v1754(VarNext)-> (all B (range_10_0(B)-> (v1753(VarNext,B)<->v1253(VarNext,B))))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1754(VarNext)<->v1755(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1755(VarNext)<->v1757(VarNext)&v1240(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1757(VarNext)<->v1247(VarNext))).
% 94.03/93.42  all VarCurr (-v1730(VarCurr)-> (v36(VarCurr,bitIndex1)<->$F)).
% 94.03/93.42  all VarCurr (v1730(VarCurr)-> (v36(VarCurr,bitIndex1)<->$T)).
% 94.03/93.42  all VarCurr (v1730(VarCurr)<->v1731(VarCurr)|v1750(VarCurr)).
% 94.03/93.42  all VarCurr (v1750(VarCurr)<->v1751(VarCurr)&v1323(VarCurr)).
% 94.03/93.42  all VarCurr (v1751(VarCurr)<->v1677(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1731(VarCurr)<->v1732(VarCurr)|v1748(VarCurr)).
% 94.03/93.42  all VarCurr (v1748(VarCurr)<->v1749(VarCurr)&v1300(VarCurr)).
% 94.03/93.42  all VarCurr (v1749(VarCurr)<->v1689(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1732(VarCurr)<->v1733(VarCurr)|v1746(VarCurr)).
% 94.03/93.42  all VarCurr (v1746(VarCurr)<->v1747(VarCurr)&v1360(VarCurr)).
% 94.03/93.42  all VarCurr (v1747(VarCurr)<->v1677(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1733(VarCurr)<->v1734(VarCurr)|v1744(VarCurr)).
% 94.03/93.42  all VarCurr (v1744(VarCurr)<->v1745(VarCurr)&v1278(VarCurr)).
% 94.03/93.42  all VarCurr (v1745(VarCurr)<->v1689(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1734(VarCurr)<->v1735(VarCurr)|v1742(VarCurr)).
% 94.03/93.42  all VarCurr (v1742(VarCurr)<->v1743(VarCurr)&v1355(VarCurr)).
% 94.03/93.42  all VarCurr (v1743(VarCurr)<->v1677(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1735(VarCurr)<->v1736(VarCurr)|v1739(VarCurr)).
% 94.03/93.42  all VarCurr (v1739(VarCurr)<->v1740(VarCurr)&v1238(VarCurr)).
% 94.03/93.42  all VarCurr (v1740(VarCurr)<->v1689(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1736(VarCurr)<->v1737(VarCurr)&v1348(VarCurr)).
% 94.03/93.42  all VarCurr (v1737(VarCurr)<->v1677(VarCurr)&v907(VarCurr)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1717(VarNext)-> (v31(VarNext,bitIndex0)<->v31(VarCurr,bitIndex0)))).
% 94.03/93.42  all VarNext (v1717(VarNext)-> (v31(VarNext,bitIndex0)<->v1725(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1725(VarNext)<->v1723(VarCurr))).
% 94.03/93.42  all VarCurr (-v1254(VarCurr)-> (v1723(VarCurr)<->v36(VarCurr,bitIndex0))).
% 94.03/93.42  all VarCurr (v1254(VarCurr)-> (v1723(VarCurr)<->$T)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1717(VarNext)<->v1718(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1718(VarNext)<->v1720(VarNext)&v1240(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1720(VarNext)<->v1247(VarNext))).
% 94.03/93.42  all VarCurr (-v1660(VarCurr)-> (v36(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  all VarCurr (v1660(VarCurr)-> (v36(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1660(VarCurr)<->v1661(VarCurr)|v1711(VarCurr)).
% 94.03/93.42  all VarCurr (v1711(VarCurr)<->v1712(VarCurr)&v1323(VarCurr)).
% 94.03/93.42  all VarCurr (v1712(VarCurr)<->v1713(VarCurr)|v1714(VarCurr)).
% 94.03/93.42  all VarCurr (v1714(VarCurr)<->v1677(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1713(VarCurr)<->v1671(VarCurr)).
% 94.03/93.42  all VarCurr (v1661(VarCurr)<->v1662(VarCurr)|v1707(VarCurr)).
% 94.03/93.42  all VarCurr (v1707(VarCurr)<->v1708(VarCurr)&v1300(VarCurr)).
% 94.03/93.42  all VarCurr (v1708(VarCurr)<->v1709(VarCurr)|v1710(VarCurr)).
% 94.03/93.42  all VarCurr (v1710(VarCurr)<->v1689(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1709(VarCurr)<->v1671(VarCurr)&v1180(VarCurr)).
% 94.03/93.42  all VarCurr (v1662(VarCurr)<->v1663(VarCurr)|v1701(VarCurr)).
% 94.03/93.42  all VarCurr (v1701(VarCurr)<->v1702(VarCurr)&v1360(VarCurr)).
% 94.03/93.42  all VarCurr (v1702(VarCurr)<->v1703(VarCurr)|v1706(VarCurr)).
% 94.03/93.42  all VarCurr (v1706(VarCurr)<->v1677(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1703(VarCurr)<->v1704(VarCurr)|v1705(VarCurr)).
% 94.03/93.42  all VarCurr (v1705(VarCurr)<->v1671(VarCurr)).
% 94.03/93.42  all VarCurr (v1704(VarCurr)<->v38(VarCurr)).
% 94.03/93.42  all VarCurr (v1663(VarCurr)<->v1664(VarCurr)|v1697(VarCurr)).
% 94.03/93.42  all VarCurr (v1697(VarCurr)<->v1698(VarCurr)&v1278(VarCurr)).
% 94.03/93.42  all VarCurr (v1698(VarCurr)<->v1699(VarCurr)|v1700(VarCurr)).
% 94.03/93.42  all VarCurr (v1700(VarCurr)<->v1689(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1699(VarCurr)<->v1671(VarCurr)&v1180(VarCurr)).
% 94.03/93.42  all VarCurr (v1664(VarCurr)<->v1665(VarCurr)|v1691(VarCurr)).
% 94.03/93.42  all VarCurr (v1691(VarCurr)<->v1692(VarCurr)&v1355(VarCurr)).
% 94.03/93.42  all VarCurr (v1692(VarCurr)<->v1693(VarCurr)|v1696(VarCurr)).
% 94.03/93.42  all VarCurr (v1696(VarCurr)<->v1677(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1693(VarCurr)<->v1694(VarCurr)|v1695(VarCurr)).
% 94.03/93.42  all VarCurr (v1695(VarCurr)<->v1671(VarCurr)).
% 94.03/93.42  all VarCurr (v1694(VarCurr)<->v38(VarCurr)).
% 94.03/93.42  all VarCurr (v1665(VarCurr)<->v1666(VarCurr)|v1683(VarCurr)).
% 94.03/93.42  all VarCurr (v1683(VarCurr)<->v1684(VarCurr)&v1238(VarCurr)).
% 94.03/93.42  all VarCurr (v1684(VarCurr)<->v1685(VarCurr)|v1687(VarCurr)).
% 94.03/93.42  all VarCurr (v1687(VarCurr)<->v1689(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (v1689(VarCurr)<->v1690(VarCurr)&v1681(VarCurr)).
% 94.03/93.42  all VarCurr (v1690(VarCurr)<->v1678(VarCurr)&v1180(VarCurr)).
% 94.03/93.42  all VarCurr (v1685(VarCurr)<->v1671(VarCurr)&v1180(VarCurr)).
% 94.03/93.42  all VarCurr (v1671(VarCurr)<->v1672(VarCurr)&v1347(VarCurr)).
% 94.03/93.42  all VarCurr (v1666(VarCurr)<->v1667(VarCurr)&v1348(VarCurr)).
% 94.03/93.42  all VarCurr (v1667(VarCurr)<->v1668(VarCurr)|v1675(VarCurr)).
% 94.03/93.42  all VarCurr (v1675(VarCurr)<->v1677(VarCurr)&v1682(VarCurr)).
% 94.03/93.42  all VarCurr (-v1682(VarCurr)<->v907(VarCurr)).
% 94.03/93.42  all VarCurr (v1677(VarCurr)<->v1678(VarCurr)&v1681(VarCurr)).
% 94.03/93.42  all VarCurr (-v1681(VarCurr)<->v1162(VarCurr)).
% 94.03/93.42  all VarCurr (v1678(VarCurr)<->v1679(VarCurr)&v1347(VarCurr)).
% 94.03/93.42  all VarCurr (v1679(VarCurr)<->v1680(VarCurr)&v1346(VarCurr)).
% 94.03/93.42  all VarCurr (v1680(VarCurr)<->v87(VarCurr)&v1674(VarCurr)).
% 94.03/93.42  all VarCurr (v1668(VarCurr)<->v1669(VarCurr)|v1670(VarCurr)).
% 94.03/93.42  all VarCurr (v1670(VarCurr)<->v1672(VarCurr)&v1347(VarCurr)).
% 94.03/93.42  all VarCurr (v1672(VarCurr)<->v1673(VarCurr)&v1346(VarCurr)).
% 94.03/93.42  all VarCurr (v1673(VarCurr)<->v1345(VarCurr)&v1674(VarCurr)).
% 94.03/93.42  all VarCurr (-v1674(VarCurr)<->v881(VarCurr)).
% 94.03/93.42  all VarCurr (v1669(VarCurr)<->v38(VarCurr)).
% 94.03/93.42  all VarCurr (v1180(VarCurr)<->v1182(VarCurr)).
% 94.03/93.42  all VarCurr (v1182(VarCurr)<->v1184(VarCurr)).
% 94.03/93.42  all VarCurr (v1184(VarCurr)<->v1186(VarCurr)&v1656(VarCurr)).
% 94.03/93.42  all VarCurr (v1656(VarCurr)<->v1377(VarCurr,bitIndex2)|v1377(VarCurr,bitIndex4)).
% 94.03/93.42  all VarCurr (v1186(VarCurr)<->v1188(VarCurr)).
% 94.03/93.42  all VarCurr (v1188(VarCurr)<->v1190(VarCurr)).
% 94.03/93.42  all VarCurr (v1190(VarCurr)<->v1192(VarCurr)).
% 94.03/93.42  all VarCurr (v1192(VarCurr)<->v1194(VarCurr)).
% 94.03/93.42  all VarCurr (v1194(VarCurr)<->v1196(VarCurr)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1643(VarNext)-> (v1196(VarNext)<->v1196(VarCurr)))).
% 94.03/93.42  all VarNext (v1643(VarNext)-> (v1196(VarNext)<->v1651(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1651(VarNext)<->v1649(VarCurr))).
% 94.03/93.42  all VarCurr (-v1652(VarCurr)-> (v1649(VarCurr)<->v1202(VarCurr))).
% 94.03/93.42  all VarCurr (v1652(VarCurr)-> (v1649(VarCurr)<->$F)).
% 94.03/93.42  all VarCurr (-v1652(VarCurr)<->v1198(VarCurr)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1643(VarNext)<->v1644(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1644(VarNext)<->v1645(VarNext)&v1540(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1645(VarNext)<->v1549(VarNext))).
% 94.03/93.42  all VarCurr (-v1602(VarCurr)-> (v1202(VarCurr)<->$F)).
% 94.03/93.42  all VarCurr (v1602(VarCurr)-> (v1202(VarCurr)<->v1626(VarCurr))).
% 94.03/93.42  all VarCurr (-v1563(VarCurr)-> (v1626(VarCurr)<->$F)).
% 94.03/93.42  all VarCurr (v1563(VarCurr)-> (v1626(VarCurr)<->v1627(VarCurr))).
% 94.03/93.42  all VarCurr (v1633(VarCurr)<->v1635(VarCurr)|v1615(VarCurr)).
% 94.03/93.42  all VarCurr (v1635(VarCurr)<->v1636(VarCurr)|v1614(VarCurr)).
% 94.03/93.42  all VarCurr (v1636(VarCurr)<->v1637(VarCurr)|v1613(VarCurr)).
% 94.03/93.42  all VarCurr (v1637(VarCurr)<->v1638(VarCurr)|v1583(VarCurr)).
% 94.03/93.42  all VarCurr (v1638(VarCurr)<->v1639(VarCurr)|v1582(VarCurr)).
% 94.03/93.42  all VarCurr (v1639(VarCurr)<->v1640(VarCurr)|v1581(VarCurr)).
% 94.03/93.42  all VarCurr (v1640(VarCurr)<->v1566(VarCurr)|v1580(VarCurr)).
% 94.03/93.42  all VarCurr (v1566(VarCurr)<->v1567(VarCurr)|v1572(VarCurr)).
% 94.03/93.42  all VarCurr (-v1208(VarCurr)-> (v1627(VarCurr)<->$F)).
% 94.03/93.42  all VarCurr (v1208(VarCurr)-> (v1627(VarCurr)<->v1628(VarCurr))).
% 94.03/93.42  all VarCurr (-v1629(VarCurr)-> (v1628(VarCurr)<->$T)).
% 94.03/93.42  all VarCurr (v1629(VarCurr)-> (v1628(VarCurr)<->$F)).
% 94.03/93.42  all VarCurr (v1629(VarCurr)<->v1630(VarCurr)&v1538(VarCurr)).
% 94.03/93.42  all VarCurr (v1630(VarCurr)<->v1631(VarCurr)|v1632(VarCurr)).
% 94.03/93.42  all VarCurr (v1632(VarCurr)<-> (v1497(VarCurr,bitIndex3)<->$T)& (v1497(VarCurr,bitIndex2)<->$T)& (v1497(VarCurr,bitIndex1)<->$F)& (v1497(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1631(VarCurr)<-> (v1497(VarCurr,bitIndex3)<->$F)& (v1497(VarCurr,bitIndex2)<->$T)& (v1497(VarCurr,bitIndex1)<->$F)& (v1497(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1602(VarCurr)<->v1603(VarCurr)|v1615(VarCurr)).
% 94.03/93.42  all VarCurr (-v1615(VarCurr)<->v1616(VarCurr)).
% 94.03/93.42  all VarCurr (v1616(VarCurr)<->v1617(VarCurr)|v1584(VarCurr)).
% 94.03/93.42  all VarCurr (v1617(VarCurr)<->v1618(VarCurr)|v1583(VarCurr)).
% 94.03/93.42  all VarCurr (v1618(VarCurr)<->v1619(VarCurr)|v1582(VarCurr)).
% 94.03/93.42  all VarCurr (v1619(VarCurr)<->v1620(VarCurr)|v1581(VarCurr)).
% 94.03/93.42  all VarCurr (v1620(VarCurr)<->v1621(VarCurr)|v1580(VarCurr)).
% 94.03/93.42  all VarCurr (v1621(VarCurr)<->v1622(VarCurr)|v1573(VarCurr)).
% 94.03/93.42  all VarCurr (v1622(VarCurr)<->v1623(VarCurr)|v1572(VarCurr)).
% 94.03/93.42  all VarCurr (v1623(VarCurr)<->v1624(VarCurr)|v1571(VarCurr)).
% 94.03/93.42  all VarCurr (v1624(VarCurr)<->v1625(VarCurr)|v1570(VarCurr)).
% 94.03/93.42  all VarCurr (v1625(VarCurr)<->v1563(VarCurr)|v1569(VarCurr)).
% 94.03/93.42  all VarCurr (v1603(VarCurr)<->v1604(VarCurr)|v1614(VarCurr)).
% 94.03/93.42  all VarCurr (v1614(VarCurr)<->v1586(VarCurr)&v1584(VarCurr)).
% 94.03/93.42  all VarCurr (v1604(VarCurr)<->v1605(VarCurr)|v1583(VarCurr)).
% 94.03/93.42  all VarCurr (v1605(VarCurr)<->v1606(VarCurr)|v1582(VarCurr)).
% 94.03/93.42  all VarCurr (v1606(VarCurr)<->v1607(VarCurr)|v1581(VarCurr)).
% 94.03/93.42  all VarCurr (v1607(VarCurr)<->v1608(VarCurr)|v1580(VarCurr)).
% 94.03/93.42  all VarCurr (v1608(VarCurr)<->v1609(VarCurr)|v1613(VarCurr)).
% 94.03/93.42  all VarCurr (v1613(VarCurr)<->v1575(VarCurr)&v1573(VarCurr)).
% 94.03/93.42  all VarCurr (v1609(VarCurr)<->v1610(VarCurr)|v1572(VarCurr)).
% 94.03/93.42  all VarCurr (v1610(VarCurr)<->v1611(VarCurr)|v1571(VarCurr)).
% 94.03/93.42  all VarCurr (v1611(VarCurr)<->v1612(VarCurr)|v1570(VarCurr)).
% 94.03/93.42  all VarCurr (v1612(VarCurr)<->v1563(VarCurr)|v1569(VarCurr)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1589(VarNext)-> (all B (range_3_0(B)-> (v1204(VarNext,B)<->v1204(VarCurr,B)))))).
% 94.03/93.42  all VarNext (v1589(VarNext)-> (all B (range_3_0(B)-> (v1204(VarNext,B)<->v1597(VarNext,B))))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1597(VarNext,B)<->v1595(VarCurr,B))))).
% 94.03/93.42  all VarCurr (-v1598(VarCurr)-> (all B (range_3_0(B)-> (v1595(VarCurr,B)<->v1206(VarCurr,B))))).
% 94.03/93.42  all VarCurr (v1598(VarCurr)-> (all B (range_3_0(B)-> (v1595(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (-v1598(VarCurr)<->v1198(VarCurr)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1589(VarNext)<->v1590(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1590(VarNext)<->v1591(VarNext)&v1540(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1591(VarNext)<->v1549(VarNext))).
% 94.03/93.42  all VarCurr (-v1563(VarCurr)& -v1565(VarCurr)& -v1573(VarCurr)& -v1576(VarCurr)& -v1584(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1584(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->v1585(VarCurr,B))))).
% 94.03/93.42  all VarCurr (v1576(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1573(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->v1574(VarCurr,B))))).
% 94.03/93.42  all VarCurr (v1565(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1563(VarCurr)-> (all B (range_3_0(B)-> (v1206(VarCurr,B)<->v1564(VarCurr,B))))).
% 94.03/93.42  all VarCurr (-v1586(VarCurr)-> (all B (range_3_0(B)-> (v1585(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1586(VarCurr)-> (all B (range_3_0(B)-> (v1585(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (-v1586(VarCurr)<->v1536(VarCurr)).
% 94.03/93.42  all VarCurr (v1584(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$T)& (v1204(VarCurr,bitIndex2)<->$T)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  b1101(bitIndex3).
% 94.03/93.42  b1101(bitIndex2).
% 94.03/93.42  -b1101(bitIndex1).
% 94.03/93.42  b1101(bitIndex0).
% 94.03/93.42  all VarCurr (v1576(VarCurr)<->v1578(VarCurr)|v1583(VarCurr)).
% 94.03/93.42  all VarCurr (v1583(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$T)& (v1204(VarCurr,bitIndex2)<->$T)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  all VarCurr (v1578(VarCurr)<->v1579(VarCurr)|v1582(VarCurr)).
% 94.03/93.42  all VarCurr (v1582(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$T)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$T)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  b1011(bitIndex3).
% 94.03/93.42  -b1011(bitIndex2).
% 94.03/93.42  b1011(bitIndex1).
% 94.03/93.42  b1011(bitIndex0).
% 94.03/93.42  all VarCurr (v1579(VarCurr)<->v1580(VarCurr)|v1581(VarCurr)).
% 94.03/93.42  all VarCurr (v1581(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$T)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$T)& (v1204(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  b1010(bitIndex3).
% 94.03/93.42  -b1010(bitIndex2).
% 94.03/93.42  b1010(bitIndex1).
% 94.03/93.42  -b1010(bitIndex0).
% 94.03/93.42  all VarCurr (v1580(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$T)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  b1001(bitIndex3).
% 94.03/93.42  -b1001(bitIndex2).
% 94.03/93.42  -b1001(bitIndex1).
% 94.03/93.42  b1001(bitIndex0).
% 94.03/93.42  all VarCurr (-v1575(VarCurr)-> (all B (range_3_0(B)-> (v1574(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1575(VarCurr)-> (all B (range_3_0(B)-> (v1574(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (-v1575(VarCurr)<->v1536(VarCurr)).
% 94.03/93.42  all VarCurr (v1573(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$T)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1565(VarCurr)<->v1567(VarCurr)|v1572(VarCurr)).
% 94.03/93.42  all VarCurr (v1572(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$T)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  -b0100(bitIndex3).
% 94.03/93.42  b0100(bitIndex2).
% 94.03/93.42  -b0100(bitIndex1).
% 94.03/93.42  -b0100(bitIndex0).
% 94.03/93.42  all VarCurr (v1567(VarCurr)<->v1568(VarCurr)|v1571(VarCurr)).
% 94.03/93.42  all VarCurr (v1571(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$T)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (v1568(VarCurr)<->v1569(VarCurr)|v1570(VarCurr)).
% 94.03/93.42  all VarCurr (v1570(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$T)& (v1204(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  -b0010(bitIndex3).
% 94.03/93.42  -b0010(bitIndex2).
% 94.03/93.42  b0010(bitIndex1).
% 94.03/93.42  -b0010(bitIndex0).
% 94.03/93.42  all VarCurr (v1569(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$T)).
% 94.03/93.42  all VarCurr (-v1208(VarCurr)-> (all B (range_3_0(B)-> (v1564(VarCurr,B)<->$F)))).
% 94.03/93.42  all VarCurr (v1208(VarCurr)-> (all B (range_3_0(B)-> (v1564(VarCurr,B)<->v1497(VarCurr,B))))).
% 94.03/93.42  all VarCurr (v1563(VarCurr)<-> (v1204(VarCurr,bitIndex3)<->$F)& (v1204(VarCurr,bitIndex2)<->$F)& (v1204(VarCurr,bitIndex1)<->$F)& (v1204(VarCurr,bitIndex0)<->$F)).
% 94.03/93.42  all B (range_3_0(B)-> (v1204(constB0,B)<->$F)).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1545(VarNext)-> (v1536(VarNext)<->v1536(VarCurr)))).
% 94.03/93.42  all VarNext (v1545(VarNext)-> (v1536(VarNext)<->v1555(VarNext))).
% 94.03/93.42  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1555(VarNext)<->v1553(VarCurr))).
% 94.03/93.43  all VarCurr (-v1556(VarCurr)-> (v1553(VarCurr)<->v1538(VarCurr))).
% 94.03/93.43  all VarCurr (v1556(VarCurr)-> (v1553(VarCurr)<->$F)).
% 94.03/93.43  all VarCurr (-v1556(VarCurr)<->v1198(VarCurr)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1545(VarNext)<->v1546(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1546(VarNext)<->v1547(VarNext)&v1540(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1547(VarNext)<->v1549(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1549(VarNext)<->v1540(VarCurr))).
% 94.03/93.43  v1536(constB0)<->$F.
% 94.03/93.43  all VarCurr (v1540(VarCurr)<->v1542(VarCurr)).
% 94.03/93.43  all VarCurr (v1542(VarCurr)<->v1(VarCurr)).
% 94.03/93.43  all VarCurr (v1538(VarCurr)<->$F).
% 94.03/93.43  all VarCurr B (range_3_0(B)-> (v1497(VarCurr,B)<->v1499(VarCurr,B))).
% 94.03/93.43  all VarCurr B (range_3_0(B)-> (v1499(VarCurr,B)<->v1501(VarCurr,B))).
% 94.03/93.43  all VarCurr B (range_3_0(B)-> (v1501(VarCurr,B)<->v1503(VarCurr,B))).
% 94.03/93.43  all VarCurr B (range_3_0(B)-> (v1503(VarCurr,B)<->v1505(VarCurr,B))).
% 94.03/93.43  all VarCurr B (range_3_0(B)-> (v1505(VarCurr,B)<->v1507(VarCurr,B))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1512(VarNext)-> (all B (range_3_0(B)-> (v1507(VarNext,B)<->v1507(VarCurr,B)))))).
% 94.03/93.43  all VarNext (v1512(VarNext)-> (all B (range_3_0(B)-> (v1507(VarNext,B)<->v1529(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1529(VarNext,B)<->v1527(VarCurr,B))))).
% 94.03/93.43  all VarCurr (-v1521(VarCurr)-> (all B (range_3_0(B)-> (v1527(VarCurr,B)<->v1530(VarCurr,B))))).
% 94.03/93.43  all VarCurr (v1521(VarCurr)-> (all B (range_3_0(B)-> (v1527(VarCurr,B)<->$F)))).
% 94.03/93.43  all VarCurr (-v1222(VarCurr,bitIndex3)-> (all B (range_3_0(B)-> (v1530(VarCurr,B)<->b0011(B))))).
% 94.03/93.43  -b0011(bitIndex3).
% 94.03/93.43  -b0011(bitIndex2).
% 94.03/93.43  b0011(bitIndex1).
% 94.03/93.43  b0011(bitIndex0).
% 94.03/93.43  all VarCurr (v1222(VarCurr,bitIndex3)-> (all B (range_3_0(B)-> (v1530(VarCurr,B)<->b1100(B))))).
% 94.03/93.43  b1100(bitIndex3).
% 94.03/93.43  b1100(bitIndex2).
% 94.03/93.43  -b1100(bitIndex1).
% 94.03/93.43  -b1100(bitIndex0).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1512(VarNext)<->v1513(VarNext)&v1520(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1520(VarNext)<->v1518(VarCurr))).
% 94.03/93.43  all VarCurr (v1518(VarCurr)<->v1521(VarCurr)|v1522(VarCurr)).
% 94.03/93.43  all VarCurr (v1522(VarCurr)<->v1523(VarCurr)&v1526(VarCurr)).
% 94.03/93.43  all VarCurr (-v1526(VarCurr)<->v1521(VarCurr)).
% 94.03/93.43  all VarCurr (v1523(VarCurr)<->v1222(VarCurr,bitIndex3)|v1524(VarCurr)).
% 94.03/93.43  all VarCurr (v1524(VarCurr)<->v1222(VarCurr,bitIndex1)&v1525(VarCurr)).
% 94.03/93.43  all VarCurr (-v1525(VarCurr)<->v1222(VarCurr,bitIndex3)).
% 94.03/93.43  all VarCurr (-v1521(VarCurr)<->v1220(VarCurr)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1513(VarNext)<->v1514(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1514(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarCurr (v1208(VarCurr)<->v1210(VarCurr)).
% 94.03/93.43  all VarCurr (v1210(VarCurr)<->v1212(VarCurr)).
% 94.03/93.43  all VarCurr (v1212(VarCurr)<->v1214(VarCurr)).
% 94.03/93.43  all VarCurr (v1214(VarCurr)<->v1216(VarCurr)).
% 94.03/93.43  all VarCurr (v1216(VarCurr)<->v1218(VarCurr)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1482(VarNext)-> (v1218(VarNext)<->v1218(VarCurr)))).
% 94.03/93.43  all VarNext (v1482(VarNext)-> (v1218(VarNext)<->v1490(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1490(VarNext)<->v1488(VarCurr))).
% 94.03/93.43  all VarCurr (-v1491(VarCurr)-> (v1488(VarCurr)<->v1492(VarCurr))).
% 94.03/93.43  all VarCurr (v1491(VarCurr)-> (v1488(VarCurr)<->$F)).
% 94.03/93.43  all VarCurr (-v1493(VarCurr)-> (v1492(VarCurr)<->$F)).
% 94.03/93.43  all VarCurr (v1493(VarCurr)-> (v1492(VarCurr)<->$T)).
% 94.03/93.43  all VarCurr (-v1493(VarCurr)<->v1222(VarCurr,bitIndex0)).
% 94.03/93.43  all VarCurr (-v1491(VarCurr)<->v1220(VarCurr)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1482(VarNext)<->v1483(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1483(VarNext)<->v1484(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1484(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarCurr (-v1470(VarCurr)-> (v1222(VarCurr,bitIndex0)<->$F)).
% 94.03/93.43  all VarCurr (v1470(VarCurr)-> (v1222(VarCurr,bitIndex0)<->$T)).
% 94.03/93.43  all VarCurr (v1470(VarCurr)<->v1471(VarCurr)|v1478(VarCurr)).
% 94.03/93.43  all VarCurr (v1478(VarCurr)<->v1479(VarCurr)&v1400(VarCurr)).
% 94.03/93.43  all VarCurr (v1479(VarCurr)<->v1474(VarCurr)&v1186(VarCurr)).
% 94.03/93.43  all VarCurr (v1471(VarCurr)<->v1472(VarCurr)|v1475(VarCurr)).
% 94.03/93.43  all VarCurr (v1475(VarCurr)<->v1476(VarCurr)&v1397(VarCurr)).
% 94.03/93.43  all VarCurr (v1476(VarCurr)<->v1474(VarCurr)&v1186(VarCurr)).
% 94.03/93.43  all VarCurr (-v1474(VarCurr)<->v1224(VarCurr)).
% 94.03/93.43  all VarCurr (v1472(VarCurr)<->v1473(VarCurr)&v1391(VarCurr)).
% 94.03/93.43  all VarCurr (-v1473(VarCurr)<->v1224(VarCurr)).
% 94.03/93.43  all VarNext (v1377(VarNext,bitIndex2)<->v1462(VarNext,bitIndex1)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1463(VarNext)-> (v1462(VarNext,bitIndex3)<->v1377(VarCurr,bitIndex4))& (v1462(VarNext,bitIndex2)<->v1377(VarCurr,bitIndex3))& (v1462(VarNext,bitIndex1)<->v1377(VarCurr,bitIndex2))& (v1462(VarNext,bitIndex0)<->v1377(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1463(VarNext)-> (all B (range_3_0(B)-> (v1462(VarNext,B)<->v1415(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1463(VarNext)<->v1464(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1464(VarNext)<->v1466(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1466(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarCurr (-v1457(VarCurr)-> (v1222(VarCurr,bitIndex2)<->$F)).
% 94.03/93.43  all VarCurr (v1457(VarCurr)-> (v1222(VarCurr,bitIndex2)<->$T)).
% 94.03/93.43  all VarCurr (v1457(VarCurr)<->v1458(VarCurr)|v1459(VarCurr)).
% 94.03/93.43  all VarCurr (v1459(VarCurr)<->v1460(VarCurr)&v1397(VarCurr)).
% 94.03/93.43  all VarCurr (-v1460(VarCurr)<->v1186(VarCurr)).
% 94.03/93.43  all VarCurr (v1458(VarCurr)<-> ($T<->v1377(VarCurr,bitIndex1))).
% 94.03/93.43  all VarNext (v1377(VarNext,bitIndex1)<->v1449(VarNext,bitIndex0)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1450(VarNext)-> (v1449(VarNext,bitIndex3)<->v1377(VarCurr,bitIndex4))& (v1449(VarNext,bitIndex2)<->v1377(VarCurr,bitIndex3))& (v1449(VarNext,bitIndex1)<->v1377(VarCurr,bitIndex2))& (v1449(VarNext,bitIndex0)<->v1377(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1450(VarNext)-> (all B (range_3_0(B)-> (v1449(VarNext,B)<->v1415(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1450(VarNext)<->v1451(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1451(VarNext)<->v1453(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1453(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarCurr (-v1435(VarCurr)-> (v1222(VarCurr,bitIndex1)<->$F)).
% 94.03/93.43  all VarCurr (v1435(VarCurr)-> (v1222(VarCurr,bitIndex1)<->$T)).
% 94.03/93.43  all VarCurr (v1435(VarCurr)<->v1436(VarCurr)|v1446(VarCurr)).
% 94.03/93.43  all VarCurr (v1446(VarCurr)<->v1447(VarCurr)&v1400(VarCurr)).
% 94.03/93.43  all VarCurr (v1447(VarCurr)<->v1445(VarCurr)&v1368(VarCurr,bitIndex1)).
% 94.03/93.43  all VarCurr (v1436(VarCurr)<->v1437(VarCurr)|v1442(VarCurr)).
% 94.03/93.43  all VarCurr (v1442(VarCurr)<->v1443(VarCurr)&v1397(VarCurr)).
% 94.03/93.43  all VarCurr (v1443(VarCurr)<->v1445(VarCurr)&v1368(VarCurr,bitIndex1)).
% 94.03/93.43  all VarCurr (v1445(VarCurr)<->v1396(VarCurr)&v1441(VarCurr)).
% 94.03/93.43  all VarCurr (v1437(VarCurr)<->v1438(VarCurr)&v1391(VarCurr)).
% 94.03/93.43  all VarCurr (v1438(VarCurr)<->v1440(VarCurr)&v1368(VarCurr,bitIndex1)).
% 94.03/93.43  all VarCurr (v1440(VarCurr)<->v1224(VarCurr)&v1441(VarCurr)).
% 94.03/93.43  all VarCurr (-v1441(VarCurr)<->v1368(VarCurr,bitIndex0)).
% 94.03/93.43  all VarNext (v1377(VarNext,bitIndex4)<->v1427(VarNext,bitIndex3)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1428(VarNext)-> (v1427(VarNext,bitIndex3)<->v1377(VarCurr,bitIndex4))& (v1427(VarNext,bitIndex2)<->v1377(VarCurr,bitIndex3))& (v1427(VarNext,bitIndex1)<->v1377(VarCurr,bitIndex2))& (v1427(VarNext,bitIndex0)<->v1377(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1428(VarNext)-> (all B (range_3_0(B)-> (v1427(VarNext,B)<->v1415(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1428(VarNext)<->v1429(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1429(VarNext)<->v1431(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1431(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarCurr (-v1421(VarCurr)-> (v1222(VarCurr,bitIndex4)<->$F)).
% 94.03/93.43  all VarCurr (v1421(VarCurr)-> (v1222(VarCurr,bitIndex4)<->$T)).
% 94.03/93.43  all VarCurr (v1421(VarCurr)<->v1422(VarCurr)|v1423(VarCurr)).
% 94.03/93.43  all VarCurr (v1423(VarCurr)<->v1424(VarCurr)&v1400(VarCurr)).
% 94.03/93.43  all VarCurr (-v1424(VarCurr)<->v1186(VarCurr)).
% 94.03/93.43  all VarCurr (v1422(VarCurr)<-> ($T<->v1377(VarCurr,bitIndex3))).
% 94.03/93.43  all VarNext (v1377(VarNext,bitIndex3)<->v1404(VarNext,bitIndex2)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1405(VarNext)-> (v1404(VarNext,bitIndex3)<->v1377(VarCurr,bitIndex4))& (v1404(VarNext,bitIndex2)<->v1377(VarCurr,bitIndex3))& (v1404(VarNext,bitIndex1)<->v1377(VarCurr,bitIndex2))& (v1404(VarNext,bitIndex0)<->v1377(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1405(VarNext)-> (all B (range_3_0(B)-> (v1404(VarNext,B)<->v1415(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v1415(VarNext,B)<->v1413(VarCurr,B))))).
% 94.03/93.43  all VarCurr (-v1416(VarCurr)-> (v1413(VarCurr,bitIndex3)<->v1222(VarCurr,bitIndex4))& (v1413(VarCurr,bitIndex2)<->v1222(VarCurr,bitIndex3))& (v1413(VarCurr,bitIndex1)<->v1222(VarCurr,bitIndex2))& (v1413(VarCurr,bitIndex0)<->v1222(VarCurr,bitIndex1))).
% 94.03/93.43  all VarCurr (v1416(VarCurr)-> (all B (range_3_0(B)-> (v1413(VarCurr,B)<->$F)))).
% 94.03/93.43  all VarCurr (-v1416(VarCurr)<->v1220(VarCurr)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1405(VarNext)<->v1406(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1406(VarNext)<->v1407(VarNext)&v1402(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1407(VarNext)<->v1409(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1409(VarNext)<->v1402(VarCurr))).
% 94.03/93.43  all VarCurr (v1402(VarCurr)<->v288(VarCurr)).
% 94.03/93.43  all VarCurr (-v1384(VarCurr)-> (v1222(VarCurr,bitIndex3)<->$F)).
% 94.03/93.43  all VarCurr (v1384(VarCurr)-> (v1222(VarCurr,bitIndex3)<->$T)).
% 94.03/93.43  all VarCurr (v1384(VarCurr)<->v1385(VarCurr)|v1398(VarCurr)).
% 94.03/93.43  all VarCurr (v1398(VarCurr)<->v1399(VarCurr)&v1400(VarCurr)).
% 94.03/93.43  all VarCurr (v1400(VarCurr)<-> ($T<->v1377(VarCurr,bitIndex4))).
% 94.03/93.43  all VarCurr (v1399(VarCurr)<->v1395(VarCurr)&v1390(VarCurr)).
% 94.03/93.43  all VarCurr (v1385(VarCurr)<->v1386(VarCurr)|v1392(VarCurr)).
% 94.03/93.43  all VarCurr (v1392(VarCurr)<->v1393(VarCurr)&v1397(VarCurr)).
% 94.03/93.43  all VarCurr (v1397(VarCurr)<-> ($T<->v1377(VarCurr,bitIndex2))).
% 94.03/93.43  all VarCurr (v1393(VarCurr)<->v1395(VarCurr)&v1390(VarCurr)).
% 94.03/93.43  all VarCurr (v1395(VarCurr)<->v1396(VarCurr)&v1368(VarCurr,bitIndex0)).
% 94.03/93.43  all VarCurr (v1396(VarCurr)<->v1224(VarCurr)&v1186(VarCurr)).
% 94.03/93.43  all VarCurr (v1386(VarCurr)<->v1387(VarCurr)&v1391(VarCurr)).
% 94.03/93.43  all VarCurr (v1391(VarCurr)<-> ($T<->v1377(VarCurr,bitIndex0))).
% 94.03/93.43  v1377(constB0,bitIndex4)<->$F.
% 94.03/93.43  v1377(constB0,bitIndex3)<->$F.
% 94.03/93.43  v1377(constB0,bitIndex2)<->$F.
% 94.03/93.43  v1377(constB0,bitIndex1)<->$F.
% 94.03/93.43  all VarCurr (v1387(VarCurr)<->v1389(VarCurr)&v1390(VarCurr)).
% 94.03/93.43  all VarCurr (-v1390(VarCurr)<->v1368(VarCurr,bitIndex1)).
% 94.03/93.43  all VarCurr (v1389(VarCurr)<->v1224(VarCurr)&v1368(VarCurr,bitIndex0)).
% 94.03/93.43  all VarCurr B (range_1_0(B)-> (v1368(VarCurr,B)<->v1370(VarCurr,B))).
% 94.03/93.43  all VarCurr B (range_1_0(B)-> (v1370(VarCurr,B)<->v1372(VarCurr,B))).
% 94.03/93.43  all VarCurr (v1372(VarCurr,bitIndex0)<->v36(VarCurr,bitIndex4)).
% 94.03/93.43  all VarCurr (v1372(VarCurr,bitIndex1)<->v1374(VarCurr)).
% 94.03/93.43  all VarCurr (v1374(VarCurr)<->v36(VarCurr,bitIndex1)|v36(VarCurr,bitIndex7)).
% 94.03/93.43  all VarCurr (v1224(VarCurr)<->v1226(VarCurr)).
% 94.03/93.43  all VarCurr (v1226(VarCurr)<->v1228(VarCurr)).
% 94.03/93.43  all VarCurr (v1228(VarCurr)<->v1366(VarCurr)|v36(VarCurr,bitIndex7)).
% 94.03/93.43  all VarCurr (v1366(VarCurr)<->v36(VarCurr,bitIndex1)|v36(VarCurr,bitIndex4)).
% 94.03/93.43  all VarCurr (-v1333(VarCurr)-> (v36(VarCurr,bitIndex4)<->$F)).
% 94.03/93.43  all VarCurr (v1333(VarCurr)-> (v36(VarCurr,bitIndex4)<->$T)).
% 94.03/93.43  all VarCurr (v1333(VarCurr)<->v1334(VarCurr)|v1363(VarCurr)).
% 94.03/93.43  all VarCurr (v1363(VarCurr)<->v1364(VarCurr)&v1323(VarCurr)).
% 94.03/93.43  all VarCurr (v1364(VarCurr)<->v1342(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1334(VarCurr)<->v1335(VarCurr)|v1361(VarCurr)).
% 94.03/93.43  all VarCurr (v1361(VarCurr)<->v1362(VarCurr)&v1300(VarCurr)).
% 94.03/93.43  all VarCurr (v1362(VarCurr)<->v1352(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1335(VarCurr)<->v1336(VarCurr)|v1358(VarCurr)).
% 94.03/93.43  all VarCurr (v1358(VarCurr)<->v1359(VarCurr)&v1360(VarCurr)).
% 94.03/93.43  all VarCurr (v1360(VarCurr)<-> ($T<->v31(VarCurr,bitIndex6))).
% 94.03/93.43  all VarCurr (v1359(VarCurr)<->v1342(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1336(VarCurr)<->v1337(VarCurr)|v1356(VarCurr)).
% 94.03/93.43  all VarCurr (v1356(VarCurr)<->v1357(VarCurr)&v1278(VarCurr)).
% 94.03/93.43  all VarCurr (v1357(VarCurr)<->v1352(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1337(VarCurr)<->v1338(VarCurr)|v1353(VarCurr)).
% 94.03/93.43  all VarCurr (v1353(VarCurr)<->v1354(VarCurr)&v1355(VarCurr)).
% 94.03/93.43  all VarCurr (v1355(VarCurr)<-> ($T<->v31(VarCurr,bitIndex3))).
% 94.03/93.43  all VarCurr (v1354(VarCurr)<->v1342(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1338(VarCurr)<->v1339(VarCurr)|v1349(VarCurr)).
% 94.03/93.43  all VarCurr (v1349(VarCurr)<->v1350(VarCurr)&v1238(VarCurr)).
% 94.03/93.43  all VarCurr (v1350(VarCurr)<->v1352(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1352(VarCurr)<->v1342(VarCurr)&v1180(VarCurr)).
% 94.03/93.43  all VarCurr (v1339(VarCurr)<->v1340(VarCurr)&v1348(VarCurr)).
% 94.03/93.43  all VarCurr (v1348(VarCurr)<-> ($T<->v31(VarCurr,bitIndex0))).
% 94.03/93.43  all VarCurr (v1340(VarCurr)<->v1342(VarCurr)&v1168(VarCurr)).
% 94.03/93.43  all VarCurr (v1342(VarCurr)<->v1343(VarCurr)&v1347(VarCurr)).
% 94.03/93.43  all VarCurr (-v1347(VarCurr)<->v38(VarCurr)).
% 94.03/93.43  all VarCurr (v1343(VarCurr)<->v1344(VarCurr)&v1346(VarCurr)).
% 94.03/93.43  all VarCurr (-v1346(VarCurr)<->v903(VarCurr)).
% 94.03/93.43  all VarCurr (v1344(VarCurr)<->v1345(VarCurr)&v881(VarCurr)).
% 94.03/93.43  all VarCurr (-v1345(VarCurr)<->v87(VarCurr)).
% 94.03/93.43  all VarNext (v31(VarNext,bitIndex9)<->v1325(VarNext,bitIndex8)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1326(VarNext)-> (v1325(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1325(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1325(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1325(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1325(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1325(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1325(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1325(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1325(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1325(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1325(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1326(VarNext)-> (all B (range_10_0(B)-> (v1325(VarNext,B)<->v1253(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1326(VarNext)<->v1327(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1327(VarNext)<->v1329(VarNext)&v1240(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1329(VarNext)<->v1247(VarNext))).
% 94.03/93.43  all VarCurr (-v1311(VarCurr)-> (v36(VarCurr,bitIndex9)<->$F)).
% 94.03/93.43  all VarCurr (v1311(VarCurr)-> (v36(VarCurr,bitIndex9)<->$T)).
% 94.03/93.43  all VarCurr (v1311(VarCurr)<->v1312(VarCurr)|v1321(VarCurr)).
% 94.03/93.43  all VarCurr (v1321(VarCurr)<->v1322(VarCurr)&v1323(VarCurr)).
% 94.03/93.43  all VarCurr (v1323(VarCurr)<-> ($T<->v31(VarCurr,bitIndex9))).
% 94.03/93.43  all VarCurr (v1322(VarCurr)<->v38(VarCurr)).
% 94.03/93.43  all VarCurr (v1312(VarCurr)<->v1313(VarCurr)|v1319(VarCurr)).
% 94.03/93.43  all VarCurr (v1319(VarCurr)<->v1320(VarCurr)&v1300(VarCurr)).
% 94.03/93.43  all VarCurr (v1320(VarCurr)<->v38(VarCurr)&v1180(VarCurr)).
% 94.03/93.43  all VarCurr (v1313(VarCurr)<->v1314(VarCurr)|v1317(VarCurr)).
% 94.03/93.43  all VarCurr (v1317(VarCurr)<->v1318(VarCurr)&v1278(VarCurr)).
% 94.03/93.43  all VarCurr (v1318(VarCurr)<->v38(VarCurr)&v1180(VarCurr)).
% 94.03/93.43  all VarCurr (v1314(VarCurr)<->v1315(VarCurr)&v1238(VarCurr)).
% 94.03/93.43  all VarCurr (v1315(VarCurr)<->v38(VarCurr)&v1180(VarCurr)).
% 94.03/93.43  all VarNext (v31(VarNext,bitIndex8)<->v1302(VarNext,bitIndex7)).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1303(VarNext)-> (v1302(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1302(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1302(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1302(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1302(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1302(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1302(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1302(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1302(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1302(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1302(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.03/93.43  all VarNext (v1303(VarNext)-> (all B (range_10_0(B)-> (v1302(VarNext,B)<->v1253(VarNext,B))))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1303(VarNext)<->v1304(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1304(VarNext)<->v1306(VarNext)&v1240(VarNext))).
% 94.03/93.43  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1306(VarNext)<->v1247(VarNext))).
% 94.03/93.43  all VarCurr (-v1296(VarCurr)-> (v36(VarCurr,bitIndex8)<->$F)).
% 94.06/93.44  all VarCurr (v1296(VarCurr)-> (v36(VarCurr,bitIndex8)<->$T)).
% 94.06/93.44  all VarCurr (v1296(VarCurr)<->v1297(VarCurr)|v1298(VarCurr)).
% 94.06/93.44  all VarCurr (v1298(VarCurr)<->v1299(VarCurr)&v1300(VarCurr)).
% 94.06/93.44  all VarCurr (v1300(VarCurr)<-> ($T<->v31(VarCurr,bitIndex8))).
% 94.06/93.44  all VarCurr (-v1299(VarCurr)<->v1180(VarCurr)).
% 94.06/93.44  all VarCurr (v1297(VarCurr)<-> ($T<->v31(VarCurr,bitIndex7))).
% 94.06/93.44  all VarNext (v31(VarNext,bitIndex6)<->v1288(VarNext,bitIndex5)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1289(VarNext)-> (v1288(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1288(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1288(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1288(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1288(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1288(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1288(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1288(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1288(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1288(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1288(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.06/93.44  all VarNext (v1289(VarNext)-> (all B (range_10_0(B)-> (v1288(VarNext,B)<->v1253(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1289(VarNext)<->v1290(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1290(VarNext)<->v1292(VarNext)&v1240(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1292(VarNext)<->v1247(VarNext))).
% 94.06/93.44  all VarNext (v31(VarNext,bitIndex5)<->v1280(VarNext,bitIndex4)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1281(VarNext)-> (v1280(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1280(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1280(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1280(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1280(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1280(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1280(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1280(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1280(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1280(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1280(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.06/93.44  all VarNext (v1281(VarNext)-> (all B (range_10_0(B)-> (v1280(VarNext,B)<->v1253(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1281(VarNext)<->v1282(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1282(VarNext)<->v1284(VarNext)&v1240(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1284(VarNext)<->v1247(VarNext))).
% 94.06/93.44  all VarCurr (-v1274(VarCurr)-> (v36(VarCurr,bitIndex5)<->$F)).
% 94.06/93.44  all VarCurr (v1274(VarCurr)-> (v36(VarCurr,bitIndex5)<->$T)).
% 94.06/93.44  all VarCurr (v1274(VarCurr)<->v1275(VarCurr)|v1276(VarCurr)).
% 94.06/93.44  all VarCurr (v1276(VarCurr)<->v1277(VarCurr)&v1278(VarCurr)).
% 94.06/93.44  all VarCurr (v1278(VarCurr)<-> ($T<->v31(VarCurr,bitIndex5))).
% 94.06/93.44  all VarCurr (-v1277(VarCurr)<->v1180(VarCurr)).
% 94.06/93.44  all VarCurr (v1275(VarCurr)<-> ($T<->v31(VarCurr,bitIndex4))).
% 94.06/93.44  all VarNext (v31(VarNext,bitIndex4)<->v1266(VarNext,bitIndex3)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1267(VarNext)-> (v1266(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1266(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1266(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1266(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1266(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1266(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1266(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1266(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1266(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1266(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1266(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.06/93.44  all VarNext (v1267(VarNext)-> (all B (range_10_0(B)-> (v1266(VarNext,B)<->v1253(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1267(VarNext)<->v1268(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1268(VarNext)<->v1270(VarNext)&v1240(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1270(VarNext)<->v1247(VarNext))).
% 94.06/93.44  all VarNext (v31(VarNext,bitIndex3)<->v1258(VarNext,bitIndex2)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1259(VarNext)-> (v1258(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1258(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1258(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1258(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1258(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1258(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1258(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1258(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1258(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1258(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1258(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.06/93.44  all VarNext (v1259(VarNext)-> (all B (range_10_0(B)-> (v1258(VarNext,B)<->v1253(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1259(VarNext)<->v1260(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1260(VarNext)<->v1262(VarNext)&v1240(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1262(VarNext)<->v1247(VarNext))).
% 94.06/93.44  all VarNext (v31(VarNext,bitIndex2)<->v1242(VarNext,bitIndex1)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1243(VarNext)-> (v1242(VarNext,bitIndex10)<->v31(VarCurr,bitIndex11))& (v1242(VarNext,bitIndex9)<->v31(VarCurr,bitIndex10))& (v1242(VarNext,bitIndex8)<->v31(VarCurr,bitIndex9))& (v1242(VarNext,bitIndex7)<->v31(VarCurr,bitIndex8))& (v1242(VarNext,bitIndex6)<->v31(VarCurr,bitIndex7))& (v1242(VarNext,bitIndex5)<->v31(VarCurr,bitIndex6))& (v1242(VarNext,bitIndex4)<->v31(VarCurr,bitIndex5))& (v1242(VarNext,bitIndex3)<->v31(VarCurr,bitIndex4))& (v1242(VarNext,bitIndex2)<->v31(VarCurr,bitIndex3))& (v1242(VarNext,bitIndex1)<->v31(VarCurr,bitIndex2))& (v1242(VarNext,bitIndex0)<->v31(VarCurr,bitIndex1)))).
% 94.06/93.44  all VarNext (v1243(VarNext)-> (all B (range_10_0(B)-> (v1242(VarNext,B)<->v1253(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_10_0(B)-> (v1253(VarNext,B)<->v1251(VarCurr,B))))).
% 94.06/93.44  all VarCurr (-v1254(VarCurr)-> (v1251(VarCurr,bitIndex10)<->v36(VarCurr,bitIndex11))& (v1251(VarCurr,bitIndex9)<->v36(VarCurr,bitIndex10))& (v1251(VarCurr,bitIndex8)<->v36(VarCurr,bitIndex9))& (v1251(VarCurr,bitIndex7)<->v36(VarCurr,bitIndex8))& (v1251(VarCurr,bitIndex6)<->v36(VarCurr,bitIndex7))& (v1251(VarCurr,bitIndex5)<->v36(VarCurr,bitIndex6))& (v1251(VarCurr,bitIndex4)<->v36(VarCurr,bitIndex5))& (v1251(VarCurr,bitIndex3)<->v36(VarCurr,bitIndex4))& (v1251(VarCurr,bitIndex2)<->v36(VarCurr,bitIndex3))& (v1251(VarCurr,bitIndex1)<->v36(VarCurr,bitIndex2))& (v1251(VarCurr,bitIndex0)<->v36(VarCurr,bitIndex1))).
% 94.06/93.44  all VarCurr (v1254(VarCurr)-> (all B (range_10_0(B)-> (v1251(VarCurr,B)<->$F)))).
% 94.06/93.44  all VarCurr (-v1254(VarCurr)<->v33(VarCurr)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1243(VarNext)<->v1244(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1244(VarNext)<->v1245(VarNext)&v1240(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1245(VarNext)<->v1247(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1247(VarNext)<->v1240(VarCurr))).
% 94.06/93.44  all VarCurr (v1240(VarCurr)<->v288(VarCurr)).
% 94.06/93.44  all VarCurr (-v1233(VarCurr)-> (v36(VarCurr,bitIndex2)<->$F)).
% 94.06/93.44  all VarCurr (v1233(VarCurr)-> (v36(VarCurr,bitIndex2)<->$T)).
% 94.06/93.44  all VarCurr (v1233(VarCurr)<->v1234(VarCurr)|v1235(VarCurr)).
% 94.06/93.44  all VarCurr (v1235(VarCurr)<->v1236(VarCurr)&v1238(VarCurr)).
% 94.06/93.44  all VarCurr (v1238(VarCurr)<-> ($T<->v31(VarCurr,bitIndex2))).
% 94.06/93.44  all VarCurr (-v1236(VarCurr)<->v1180(VarCurr)).
% 94.06/93.44  all VarCurr (v1234(VarCurr)<-> ($T<->v31(VarCurr,bitIndex1))).
% 94.06/93.44  v31(constB0,bitIndex11)<->$F.
% 94.06/93.44  v31(constB0,bitIndex10)<->$F.
% 94.06/93.44  v31(constB0,bitIndex9)<->$F.
% 94.06/93.44  v31(constB0,bitIndex8)<->$F.
% 94.06/93.44  v31(constB0,bitIndex7)<->$F.
% 94.06/93.44  v31(constB0,bitIndex6)<->$F.
% 94.06/93.44  v31(constB0,bitIndex5)<->$F.
% 94.06/93.44  v31(constB0,bitIndex4)<->$F.
% 94.06/93.44  v31(constB0,bitIndex3)<->$F.
% 94.06/93.44  v31(constB0,bitIndex2)<->$F.
% 94.06/93.44  v31(constB0,bitIndex1)<->$F.
% 94.06/93.44  -b00000000000(bitIndex10).
% 94.06/93.44  -b00000000000(bitIndex9).
% 94.06/93.44  -b00000000000(bitIndex8).
% 94.06/93.44  -b00000000000(bitIndex7).
% 94.06/93.44  -b00000000000(bitIndex6).
% 94.06/93.44  -b00000000000(bitIndex5).
% 94.06/93.44  -b00000000000(bitIndex4).
% 94.06/93.44  -b00000000000(bitIndex3).
% 94.06/93.44  -b00000000000(bitIndex2).
% 94.06/93.44  -b00000000000(bitIndex1).
% 94.06/93.44  -b00000000000(bitIndex0).
% 94.06/93.44  v31(constB0,bitIndex0)<->$T.
% 94.06/93.44  all VarCurr (v1220(VarCurr)<->v12(VarCurr)).
% 94.06/93.44  all VarCurr (v1198(VarCurr)<->v1200(VarCurr)).
% 94.06/93.44  all VarCurr (v1200(VarCurr)<->v16(VarCurr)).
% 94.06/93.44  all VarCurr (v1168(VarCurr)<->v1170(VarCurr)).
% 94.06/93.44  all VarCurr (v1170(VarCurr)<->v1172(VarCurr)).
% 94.06/93.44  all VarCurr (v1172(VarCurr)<->v1174(VarCurr,bitIndex3)).
% 94.06/93.44  all VarCurr (v1174(VarCurr,bitIndex3)<->v743(VarCurr,bitIndex3)).
% 94.06/93.44  all VarCurr (v1162(VarCurr)<->v1164(VarCurr)).
% 94.06/93.44  all VarCurr (v1164(VarCurr)<->v1166(VarCurr)).
% 94.06/93.44  all VarCurr (v1166(VarCurr)<->v915(VarCurr,bitIndex1)).
% 94.06/93.44  all VarCurr (v1148(VarCurr)<->v1156(VarCurr)&v1158(VarCurr)).
% 94.06/93.44  all VarCurr (-v1158(VarCurr)<->v1150(VarCurr)).
% 94.06/93.44  all VarCurr (v1156(VarCurr)<->v1157(VarCurr)&v909(VarCurr)).
% 94.06/93.44  all VarCurr (-v1157(VarCurr)<->v1031(VarCurr)).
% 94.06/93.44  all VarCurr (v1150(VarCurr)<->v1152(VarCurr)).
% 94.06/93.44  all VarCurr (v1152(VarCurr)<->v1154(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v1154(VarCurr,bitIndex0)<->v1142(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v1142(VarCurr,bitIndex0)<->v919(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v919(VarCurr,bitIndex0)<->v921(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v921(VarCurr,bitIndex0)<->v1017(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v1029(VarCurr)<->v1146(VarCurr)&v1132(VarCurr)).
% 94.06/93.44  all VarCurr (-v1146(VarCurr)<->v1031(VarCurr)).
% 94.06/93.44  all VarCurr (v1132(VarCurr)<->v1134(VarCurr)).
% 94.06/93.44  all VarCurr (v1134(VarCurr)<->v1136(VarCurr)).
% 94.06/93.44  all VarCurr (v1136(VarCurr)<->v1144(VarCurr)&v1138(VarCurr)).
% 94.06/93.44  all VarCurr (-v1144(VarCurr)<->v915(VarCurr,bitIndex1)).
% 94.06/93.44  all VarCurr (v1138(VarCurr)<->v1140(VarCurr)).
% 94.06/93.44  all VarCurr (v1140(VarCurr)<->v1142(VarCurr,bitIndex15)).
% 94.06/93.44  all VarCurr (v1142(VarCurr,bitIndex15)<->v919(VarCurr,bitIndex15)).
% 94.06/93.44  all VarCurr (v919(VarCurr,bitIndex15)<->v921(VarCurr,bitIndex15)).
% 94.06/93.44  all VarCurr (v921(VarCurr,bitIndex15)<->v1017(VarCurr,bitIndex15)).
% 94.06/93.44  all VarCurr (v1031(VarCurr)<->v1033(VarCurr)).
% 94.06/93.44  all VarCurr (v1033(VarCurr)<->v1035(VarCurr)).
% 94.06/93.44  all VarCurr (v1035(VarCurr)<-> (v1037(VarCurr,bitIndex4)<->$F)& (v1037(VarCurr,bitIndex3)<->$F)& (v1037(VarCurr,bitIndex2)<->$F)& (v1037(VarCurr,bitIndex1)<->$F)& (v1037(VarCurr,bitIndex0)<->$F)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1118(VarNext)-> (all B (range_4_0(B)-> (v1037(VarNext,B)<->v1037(VarCurr,B)))))).
% 94.06/93.44  all VarNext (v1118(VarNext)-> (all B (range_4_0(B)-> (v1037(VarNext,B)<->v1126(VarNext,B))))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v1126(VarNext,B)<->v1124(VarCurr,B))))).
% 94.06/93.44  all VarCurr (-v1127(VarCurr)-> (all B (range_4_0(B)-> (v1124(VarCurr,B)<->v1039(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1127(VarCurr)-> (all B (range_4_0(B)-> (v1124(VarCurr,B)<->$F)))).
% 94.06/93.44  all VarCurr (-v1127(VarCurr)<->v928(VarCurr)).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1118(VarNext)<->v1119(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1119(VarNext)<->v1120(VarNext)&v925(VarNext))).
% 94.06/93.44  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1120(VarNext)<->v984(VarNext))).
% 94.06/93.44  all VarCurr (-v1042(VarCurr)& -v1044(VarCurr)& -v1085(VarCurr)-> (all B (range_4_0(B)-> (v1039(VarCurr,B)<->v1037(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1085(VarCurr)-> (all B (range_4_0(B)-> (v1039(VarCurr,B)<->v1087(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1044(VarCurr)-> (all B (range_4_0(B)-> (v1039(VarCurr,B)<->v1046(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1042(VarCurr)-> (all B (range_4_0(B)-> (v1039(VarCurr,B)<->v1037(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1114(VarCurr)<-> (v1115(VarCurr,bitIndex1)<->$T)& (v1115(VarCurr,bitIndex0)<->$T)).
% 94.06/93.44  all VarCurr (v1115(VarCurr,bitIndex0)<->v1023(VarCurr)).
% 94.06/93.44  all VarCurr (v1115(VarCurr,bitIndex1)<->v945(VarCurr)).
% 94.06/93.44  all VarCurr (-v1088(VarCurr)-> (all B (range_4_0(B)-> (v1087(VarCurr,B)<->v1089(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1088(VarCurr)-> (all B (range_4_0(B)-> (v1087(VarCurr,B)<->b10000(B))))).
% 94.06/93.44  all VarCurr (v1089(VarCurr,bitIndex0)<->v1111(VarCurr)).
% 94.06/93.44  all VarCurr (v1089(VarCurr,bitIndex1)<->v1109(VarCurr)).
% 94.06/93.44  all VarCurr (v1089(VarCurr,bitIndex2)<->v1104(VarCurr)).
% 94.06/93.44  all VarCurr (v1089(VarCurr,bitIndex3)<->v1099(VarCurr)).
% 94.06/93.44  all VarCurr (v1089(VarCurr,bitIndex4)<->v1091(VarCurr)).
% 94.06/93.44  all VarCurr (v1109(VarCurr)<->v1110(VarCurr)&v1113(VarCurr)).
% 94.06/93.44  all VarCurr (v1113(VarCurr)<->v1037(VarCurr,bitIndex0)|v1037(VarCurr,bitIndex1)).
% 94.06/93.44  all VarCurr (v1110(VarCurr)<->v1111(VarCurr)|v1112(VarCurr)).
% 94.06/93.44  all VarCurr (-v1112(VarCurr)<->v1037(VarCurr,bitIndex1)).
% 94.06/93.44  all VarCurr (-v1111(VarCurr)<->v1037(VarCurr,bitIndex0)).
% 94.06/93.44  all VarCurr (v1104(VarCurr)<->v1105(VarCurr)&v1108(VarCurr)).
% 94.06/93.44  all VarCurr (v1108(VarCurr)<->v1096(VarCurr)|v1037(VarCurr,bitIndex2)).
% 94.06/93.44  all VarCurr (v1105(VarCurr)<->v1106(VarCurr)|v1107(VarCurr)).
% 94.06/93.44  all VarCurr (-v1107(VarCurr)<->v1037(VarCurr,bitIndex2)).
% 94.06/93.44  all VarCurr (-v1106(VarCurr)<->v1096(VarCurr)).
% 94.06/93.44  all VarCurr (v1099(VarCurr)<->v1100(VarCurr)&v1103(VarCurr)).
% 94.06/93.44  all VarCurr (v1103(VarCurr)<->v1095(VarCurr)|v1037(VarCurr,bitIndex3)).
% 94.06/93.44  all VarCurr (v1100(VarCurr)<->v1101(VarCurr)|v1102(VarCurr)).
% 94.06/93.44  all VarCurr (-v1102(VarCurr)<->v1037(VarCurr,bitIndex3)).
% 94.06/93.44  all VarCurr (-v1101(VarCurr)<->v1095(VarCurr)).
% 94.06/93.44  all VarCurr (v1091(VarCurr)<->v1092(VarCurr)&v1098(VarCurr)).
% 94.06/93.44  all VarCurr (v1098(VarCurr)<->v1094(VarCurr)|v1037(VarCurr,bitIndex4)).
% 94.06/93.44  all VarCurr (v1092(VarCurr)<->v1093(VarCurr)|v1097(VarCurr)).
% 94.06/93.44  all VarCurr (-v1097(VarCurr)<->v1037(VarCurr,bitIndex4)).
% 94.06/93.44  all VarCurr (-v1093(VarCurr)<->v1094(VarCurr)).
% 94.06/93.44  all VarCurr (v1094(VarCurr)<->v1095(VarCurr)&v1037(VarCurr,bitIndex3)).
% 94.06/93.44  all VarCurr (v1095(VarCurr)<->v1096(VarCurr)&v1037(VarCurr,bitIndex2)).
% 94.06/93.44  all VarCurr (v1096(VarCurr)<->v1037(VarCurr,bitIndex0)&v1037(VarCurr,bitIndex1)).
% 94.06/93.44  all VarCurr (v1088(VarCurr)<-> (v1037(VarCurr,bitIndex4)<->$T)& (v1037(VarCurr,bitIndex3)<->$F)& (v1037(VarCurr,bitIndex2)<->$F)& (v1037(VarCurr,bitIndex1)<->$F)& (v1037(VarCurr,bitIndex0)<->$F)).
% 94.06/93.44  b10000(bitIndex4).
% 94.06/93.44  -b10000(bitIndex3).
% 94.06/93.44  -b10000(bitIndex2).
% 94.06/93.44  -b10000(bitIndex1).
% 94.06/93.44  -b10000(bitIndex0).
% 94.06/93.44  all VarCurr (v1085(VarCurr)<-> (v1086(VarCurr,bitIndex1)<->$T)& (v1086(VarCurr,bitIndex0)<->$F)).
% 94.06/93.44  all VarCurr (v1086(VarCurr,bitIndex0)<->v1023(VarCurr)).
% 94.06/93.44  all VarCurr (v1086(VarCurr,bitIndex1)<->v945(VarCurr)).
% 94.06/93.44  all VarCurr (-v1047(VarCurr)-> (all B (range_31_0(B)-> (v1046(VarCurr,B)<->v1048(VarCurr,B))))).
% 94.06/93.44  all VarCurr (v1047(VarCurr)-> (all B (range_31_0(B)-> (v1046(VarCurr,B)<->$F)))).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex6)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex7)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex8)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex9)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex10)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex11)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex12)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex13)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex14)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex15)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex16)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex17)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex18)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex19)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex20)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex21)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex22)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex23)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex24)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex25)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex26)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex27)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex28)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex29)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex30)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr (v1048(VarCurr,bitIndex31)<->v1049(VarCurr,bitIndex5)).
% 94.06/93.44  all VarCurr B (range_5_0(B)-> (v1048(VarCurr,B)<->v1049(VarCurr,B))).
% 94.06/93.44  all VarCurr (v1049(VarCurr,bitIndex0)<->v1083(VarCurr)).
% 94.06/93.44  all VarCurr (v1049(VarCurr,bitIndex1)<->v1081(VarCurr)).
% 94.06/93.44  all VarCurr (v1049(VarCurr,bitIndex2)<->v1077(VarCurr)).
% 94.06/93.45  all VarCurr (v1049(VarCurr,bitIndex3)<->v1073(VarCurr)).
% 94.06/93.45  all VarCurr (v1049(VarCurr,bitIndex4)<->v1069(VarCurr)).
% 94.06/93.45  all VarCurr (v1049(VarCurr,bitIndex5)<->v1051(VarCurr)).
% 94.06/93.45  all VarCurr (v1081(VarCurr)<->v1082(VarCurr)&v1084(VarCurr)).
% 94.06/93.45  all VarCurr (v1084(VarCurr)<->v1055(VarCurr,bitIndex0)|v1063(VarCurr)).
% 94.06/93.45  all VarCurr (v1082(VarCurr)<->v1083(VarCurr)|v1055(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (-v1083(VarCurr)<->v1055(VarCurr,bitIndex0)).
% 94.06/93.45  all VarCurr (v1077(VarCurr)<->v1078(VarCurr)&v1080(VarCurr)).
% 94.06/93.45  all VarCurr (v1080(VarCurr)<->v1061(VarCurr)|v1064(VarCurr)).
% 94.06/93.45  all VarCurr (v1078(VarCurr)<->v1079(VarCurr)|v1055(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (-v1079(VarCurr)<->v1061(VarCurr)).
% 94.06/93.45  all VarCurr (v1073(VarCurr)<->v1074(VarCurr)&v1076(VarCurr)).
% 94.06/93.45  all VarCurr (v1076(VarCurr)<->v1059(VarCurr)|v1065(VarCurr)).
% 94.06/93.45  all VarCurr (v1074(VarCurr)<->v1075(VarCurr)|v1055(VarCurr,bitIndex3)).
% 94.06/93.45  all VarCurr (-v1075(VarCurr)<->v1059(VarCurr)).
% 94.06/93.45  all VarCurr (v1069(VarCurr)<->v1070(VarCurr)&v1072(VarCurr)).
% 94.06/93.45  all VarCurr (v1072(VarCurr)<->v1057(VarCurr)|v1066(VarCurr)).
% 94.06/93.45  all VarCurr (v1070(VarCurr)<->v1071(VarCurr)|v1055(VarCurr,bitIndex4)).
% 94.06/93.45  all VarCurr (-v1071(VarCurr)<->v1057(VarCurr)).
% 94.06/93.45  all VarCurr (v1051(VarCurr)<->v1052(VarCurr)&v1067(VarCurr)).
% 94.06/93.45  all VarCurr (v1067(VarCurr)<->v1054(VarCurr)|v1068(VarCurr)).
% 94.06/93.45  all VarCurr (-v1068(VarCurr)<->v1055(VarCurr,bitIndex5)).
% 94.06/93.45  all VarCurr (v1052(VarCurr)<->v1053(VarCurr)|v1055(VarCurr,bitIndex5)).
% 94.06/93.45  all VarCurr (-v1053(VarCurr)<->v1054(VarCurr)).
% 94.06/93.45  all VarCurr (v1054(VarCurr)<->v1055(VarCurr,bitIndex4)|v1056(VarCurr)).
% 94.06/93.45  all VarCurr (v1056(VarCurr)<->v1057(VarCurr)&v1066(VarCurr)).
% 94.06/93.45  all VarCurr (-v1066(VarCurr)<->v1055(VarCurr,bitIndex4)).
% 94.06/93.45  all VarCurr (v1057(VarCurr)<->v1055(VarCurr,bitIndex3)|v1058(VarCurr)).
% 94.06/93.45  all VarCurr (v1058(VarCurr)<->v1059(VarCurr)&v1065(VarCurr)).
% 94.06/93.45  all VarCurr (-v1065(VarCurr)<->v1055(VarCurr,bitIndex3)).
% 94.06/93.45  all VarCurr (v1059(VarCurr)<->v1055(VarCurr,bitIndex2)|v1060(VarCurr)).
% 94.06/93.45  all VarCurr (v1060(VarCurr)<->v1061(VarCurr)&v1064(VarCurr)).
% 94.06/93.45  all VarCurr (-v1064(VarCurr)<->v1055(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v1061(VarCurr)<->v1055(VarCurr,bitIndex1)|v1062(VarCurr)).
% 94.06/93.45  all VarCurr (v1062(VarCurr)<->v1055(VarCurr,bitIndex0)&v1063(VarCurr)).
% 94.06/93.45  all VarCurr (-v1063(VarCurr)<->v1055(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (-v1055(VarCurr,bitIndex5)).
% 94.06/93.45  all VarCurr B (range_4_0(B)-> (v1055(VarCurr,B)<->v1037(VarCurr,B))).
% 94.06/93.45  all VarCurr (v1047(VarCurr)<-> (v1037(VarCurr,bitIndex4)<->$F)& (v1037(VarCurr,bitIndex3)<->$F)& (v1037(VarCurr,bitIndex2)<->$F)& (v1037(VarCurr,bitIndex1)<->$F)& (v1037(VarCurr,bitIndex0)<->$F)).
% 94.06/93.45  all VarCurr (v1044(VarCurr)<-> (v1045(VarCurr,bitIndex1)<->$F)& (v1045(VarCurr,bitIndex0)<->$T)).
% 94.06/93.45  all VarCurr (v1045(VarCurr,bitIndex0)<->v1023(VarCurr)).
% 94.06/93.45  all VarCurr (v1045(VarCurr,bitIndex1)<->v945(VarCurr)).
% 94.06/93.45  -v1037(constB0,bitIndex4).
% 94.06/93.45  -v1037(constB0,bitIndex3).
% 94.06/93.45  -v1037(constB0,bitIndex2).
% 94.06/93.45  -v1037(constB0,bitIndex1).
% 94.06/93.45  v1037(constB0,bitIndex0).
% 94.06/93.45  -b00001(bitIndex4).
% 94.06/93.45  -b00001(bitIndex3).
% 94.06/93.45  -b00001(bitIndex2).
% 94.06/93.45  -b00001(bitIndex1).
% 94.06/93.45  b00001(bitIndex0).
% 94.06/93.45  all VarCurr (v1042(VarCurr)<-> (v1043(VarCurr,bitIndex1)<->$F)& (v1043(VarCurr,bitIndex0)<->$F)).
% 94.06/93.45  all VarCurr (v1043(VarCurr,bitIndex0)<->v1023(VarCurr)).
% 94.06/93.45  all VarCurr (v1043(VarCurr,bitIndex1)<->v945(VarCurr)).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v1019_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (address(A)-> (all B (A=AssociatedAddressVar-> (range_17_0(B)-> (v1017(VarNext,B)<->v923_array(VarNext,A,B)))))))))).
% 94.06/93.45  all B (range_3_0(B)-> (v1019(constB0,B)<->$F)).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (-v1009(VarNext)-> (all B (range_17_0(B)-> (v923_array(VarNext,A,B)<->v923_1__array(VarNext,A,B))))))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (v1009(VarNext)-> (all B (range_17_0(B)-> (v923_array(VarNext,A,B)<->b000000000000000000(B))))))).
% 94.06/93.45  -b000000000000000000(bitIndex17).
% 94.06/93.45  -b000000000000000000(bitIndex16).
% 94.06/93.45  -b000000000000000000(bitIndex15).
% 94.06/93.45  -b000000000000000000(bitIndex14).
% 94.06/93.45  -b000000000000000000(bitIndex13).
% 94.06/93.45  -b000000000000000000(bitIndex12).
% 94.06/93.45  -b000000000000000000(bitIndex11).
% 94.06/93.45  -b000000000000000000(bitIndex10).
% 94.06/93.45  -b000000000000000000(bitIndex9).
% 94.06/93.45  -b000000000000000000(bitIndex8).
% 94.06/93.45  -b000000000000000000(bitIndex7).
% 94.06/93.45  -b000000000000000000(bitIndex6).
% 94.06/93.45  -b000000000000000000(bitIndex5).
% 94.06/93.45  -b000000000000000000(bitIndex4).
% 94.06/93.45  -b000000000000000000(bitIndex3).
% 94.06/93.45  -b000000000000000000(bitIndex2).
% 94.06/93.45  -b000000000000000000(bitIndex1).
% 94.06/93.45  -b000000000000000000(bitIndex0).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1009(VarNext)<->v1010(VarNext)&v1015(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1015(VarNext)<->v1006(VarCurr))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1010(VarNext)<->v1012(VarNext)&v925(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v1012(VarNext)<->v984(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (-(A=AssociatedAddressVar&v997(VarNext))-> (all B (range_17_0(B)-> (v923_1__array(VarNext,A,B)<->v923_array(VarCurr,A,B))))))))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (A=AssociatedAddressVar&v997(VarNext)-> (all B (range_17_0(B)-> (v923_1__array(VarNext,A,B)<->v930(VarNext,B))))))))).
% 94.06/93.45  all B (range_17_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v997(VarNext)<->v998(VarNext)&v1004(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v1004(VarNext)<->v1002(VarCurr))).
% 94.06/93.45  all VarCurr (v1002(VarCurr)<->v1005(VarCurr)&v945(VarCurr)).
% 94.06/93.45  all VarCurr (-v1005(VarCurr)<->v1006(VarCurr)).
% 94.06/93.45  all VarCurr (-v1006(VarCurr)<->v928(VarCurr)).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v998(VarNext)<->v999(VarNext)&v925(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v999(VarNext)<->v984(VarNext))).
% 94.06/93.45  -v923_array(constB0,b1111_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1111_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1111_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1110_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1110_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1110_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1101_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1101_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1101_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1100_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1100_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1100_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1011_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1011_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1011_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1010_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1010_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1010_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1001_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1001_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1001_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b1000_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b1000_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b1000_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0111_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0111_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0111_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0110_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0110_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0110_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0101_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0101_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0101_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0100_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0100_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0100_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0011_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0011_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0011_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0010_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0010_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0010_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0001_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0001_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0001_address_term,bitIndex17).
% 94.06/93.45  -v923_array(constB0,b0000_address_term,bitIndex0).
% 94.06/93.45  -v923_array(constB0,b0000_address_term,bitIndex15).
% 94.06/93.45  -v923_array(constB0,b0000_address_term,bitIndex17).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v980(VarNext)-> (all B (range_3_0(B)-> (v953(VarNext,B)<->v953(VarCurr,B)))))).
% 94.06/93.45  all VarNext (v980(VarNext)-> (all B (range_3_0(B)-> (v953(VarNext,B)<->v990(VarNext,B))))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v990(VarNext,B)<->v988(VarCurr,B))))).
% 94.06/93.45  all VarCurr (-v991(VarCurr)-> (all B (range_3_0(B)-> (v988(VarCurr,B)<->v955(VarCurr,B))))).
% 94.06/93.45  all VarCurr (v991(VarCurr)-> (all B (range_3_0(B)-> (v988(VarCurr,B)<->$F)))).
% 94.06/93.45  all VarCurr (-v991(VarCurr)<->v928(VarCurr)).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v980(VarNext)<->v981(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v981(VarNext)<->v982(VarNext)&v925(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v982(VarNext)<->v984(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v984(VarNext)<->v925(VarCurr))).
% 94.06/93.45  all VarCurr (-v945(VarCurr)-> (all B (range_3_0(B)-> (v955(VarCurr,B)<->v953(VarCurr,B))))).
% 94.06/93.45  all VarCurr (v945(VarCurr)-> (all B (range_3_0(B)-> (v955(VarCurr,B)<->v957(VarCurr,B))))).
% 94.06/93.45  all VarCurr (-v958(VarCurr)-> (all B (range_3_0(B)-> (v957(VarCurr,B)<->v959(VarCurr,B))))).
% 94.06/93.45  all VarCurr (v958(VarCurr)-> (all B (range_3_0(B)-> (v957(VarCurr,B)<->$F)))).
% 94.06/93.45  all VarCurr (v959(VarCurr,bitIndex0)<->v975(VarCurr)).
% 94.06/93.45  all VarCurr (v959(VarCurr,bitIndex1)<->v973(VarCurr)).
% 94.06/93.45  all VarCurr (v959(VarCurr,bitIndex2)<->v968(VarCurr)).
% 94.06/93.45  all VarCurr (v959(VarCurr,bitIndex3)<->v961(VarCurr)).
% 94.06/93.45  all VarCurr (v973(VarCurr)<->v974(VarCurr)&v977(VarCurr)).
% 94.06/93.45  all VarCurr (v977(VarCurr)<->v953(VarCurr,bitIndex0)|v953(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (v974(VarCurr)<->v975(VarCurr)|v976(VarCurr)).
% 94.06/93.45  all VarCurr (-v976(VarCurr)<->v953(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (-v975(VarCurr)<->v953(VarCurr,bitIndex0)).
% 94.06/93.45  all VarCurr (v968(VarCurr)<->v969(VarCurr)&v972(VarCurr)).
% 94.06/93.45  all VarCurr (v972(VarCurr)<->v965(VarCurr)|v953(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v969(VarCurr)<->v970(VarCurr)|v971(VarCurr)).
% 94.06/93.45  all VarCurr (-v971(VarCurr)<->v953(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (-v970(VarCurr)<->v965(VarCurr)).
% 94.06/93.45  all VarCurr (v961(VarCurr)<->v962(VarCurr)&v967(VarCurr)).
% 94.06/93.45  all VarCurr (v967(VarCurr)<->v964(VarCurr)|v953(VarCurr,bitIndex3)).
% 94.06/93.45  all VarCurr (v962(VarCurr)<->v963(VarCurr)|v966(VarCurr)).
% 94.06/93.45  all VarCurr (-v966(VarCurr)<->v953(VarCurr,bitIndex3)).
% 94.06/93.45  all VarCurr (-v963(VarCurr)<->v964(VarCurr)).
% 94.06/93.45  all VarCurr (v964(VarCurr)<->v965(VarCurr)&v953(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v965(VarCurr)<->v953(VarCurr,bitIndex0)&v953(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (v958(VarCurr)<-> (v953(VarCurr,bitIndex3)<->$T)& (v953(VarCurr,bitIndex2)<->$T)& (v953(VarCurr,bitIndex1)<->$T)& (v953(VarCurr,bitIndex0)<->$T)).
% 94.06/93.45  -v953(constB0,bitIndex3).
% 94.06/93.45  -v953(constB0,bitIndex2).
% 94.06/93.45  -v953(constB0,bitIndex1).
% 94.06/93.45  v953(constB0,bitIndex0).
% 94.06/93.45  all VarCurr (v945(VarCurr)<->v947(VarCurr)).
% 94.06/93.45  all VarCurr (v947(VarCurr)<->v949(VarCurr)).
% 94.06/93.45  all VarCurr (v949(VarCurr)<->v951(VarCurr)).
% 94.06/93.45  all VarCurr B (range_15_0(B)-> (v930(VarCurr,B)<->v938(VarCurr,B))).
% 94.06/93.45  all VarCurr ((v930(VarCurr,bitIndex17)<->v932(VarCurr,bitIndex1))& (v930(VarCurr,bitIndex16)<->v932(VarCurr,bitIndex0))).
% 94.06/93.45  all VarCurr B (range_15_0(B)-> (v938(VarCurr,B)<->v940(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_15_0(B)-> (v940(VarCurr,B)<->v942(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_1_0(B)-> (v932(VarCurr,B)<->v934(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_1_0(B)-> (v934(VarCurr,B)<->v936(VarCurr,B))).
% 94.06/93.45  all VarCurr (v928(VarCurr)<->v12(VarCurr)).
% 94.06/93.45  all VarCurr (v925(VarCurr)<->v288(VarCurr)).
% 94.06/93.45  all VarCurr (v903(VarCurr)<->v905(VarCurr)).
% 94.06/93.45  all VarCurr (v905(VarCurr)<->v91(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v91(VarCurr,bitIndex2)<->v898(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v892(VarCurr,bitIndex2)<->v896(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v894(VarCurr,bitIndex2)<->v895(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (v885(VarCurr,bitIndex2)<->v889(VarCurr,bitIndex2)).
% 94.06/93.45  all VarCurr (v887(VarCurr,bitIndex2)<->v888(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (v881(VarCurr)<->v883(VarCurr)).
% 94.06/93.45  all VarCurr (v883(VarCurr)<->v91(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr (v91(VarCurr,bitIndex1)<->v898(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v898(VarCurr,B)<->v899(VarCurr,B)|v892(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v899(VarCurr,B)<->v900(VarCurr,B)&v885(VarCurr,B))).
% 94.06/93.45  all VarCurr (v900(VarCurr,bitIndex0)<->v901(VarCurr)).
% 94.06/93.45  all VarCurr (v900(VarCurr,bitIndex1)<->v901(VarCurr)).
% 94.06/93.45  all VarCurr (v900(VarCurr,bitIndex2)<->v901(VarCurr)).
% 94.06/93.45  all VarCurr (v901(VarCurr)<->v93(VarCurr)).
% 94.06/93.45  all VarCurr (v892(VarCurr,bitIndex1)<->v896(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v896(VarCurr,B)<->v95(VarCurr,B)&v897(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v897(VarCurr,B)<-> -v894(VarCurr,B))).
% 94.06/93.45  all VarCurr (v894(VarCurr,bitIndex1)<->v895(VarCurr,bitIndex0)).
% 94.06/93.45  all VarCurr B (range_1_0(B)-> (v895(VarCurr,B)<->v894(VarCurr,B)|v95(VarCurr,B))).
% 94.06/93.45  all VarCurr (v894(VarCurr,bitIndex0)<->$F).
% 94.06/93.45  all VarCurr (v885(VarCurr,bitIndex1)<->v889(VarCurr,bitIndex1)).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v889(VarCurr,B)<->v97(VarCurr,B)&v890(VarCurr,B))).
% 94.06/93.45  all VarCurr B (range_2_0(B)-> (v890(VarCurr,B)<-> -v887(VarCurr,B))).
% 94.06/93.45  all VarCurr (v887(VarCurr,bitIndex1)<->v888(VarCurr,bitIndex0)).
% 94.06/93.45  all VarCurr B (range_1_0(B)-> (v888(VarCurr,B)<->v887(VarCurr,B)|v97(VarCurr,B))).
% 94.06/93.45  all B (range_1_0(B)<->bitIndex0=B|bitIndex1=B).
% 94.06/93.45  all VarCurr (v887(VarCurr,bitIndex0)<->$F).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v869_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (address(A)-> (all B (A=AssociatedAddressVar-> (range_66_0(B)-> (v867(VarNext,B)<->v749_array(VarNext,A,B)))))))))).
% 94.06/93.45  all B (range_3_0(B)-> (v869(constB0,B)<->$F)).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (-v859(VarNext)-> (all B (range_66_0(B)-> (v749_array(VarNext,A,B)<->v749_1__array(VarNext,A,B))))))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all A (v859(VarNext)-> (all B (range_66_0(B)-> (v749_array(VarNext,A,B)<->b0000000000000000000000000000000000000000000000000000000000000000000(B))))))).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex66).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex65).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex64).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 94.06/93.45  -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v859(VarNext)<->v860(VarNext)&v865(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v865(VarNext)<->v856(VarCurr))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v860(VarNext)<->v862(VarNext)&v751(VarNext))).
% 94.06/93.45  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v862(VarNext)<->v823(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (-(A=AssociatedAddressVar&v847(VarNext))-> (all B (range_66_0(B)-> (v749_1__array(VarNext,A,B)<->v749_array(VarCurr,A,B))))))))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all AssociatedAddressVar (v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)-> (all A (A=AssociatedAddressVar&v847(VarNext)-> (all B (range_66_0(B)-> (v749_1__array(VarNext,A,B)<->v756(VarNext,B))))))))).
% 94.06/93.46  all B (range_66_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v847(VarNext)<->v848(VarNext)&v854(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v854(VarNext)<->v852(VarCurr))).
% 94.06/93.46  all VarCurr (v852(VarCurr)<->v855(VarCurr)&v783(VarCurr)).
% 94.06/93.46  all VarCurr (-v855(VarCurr)<->v856(VarCurr)).
% 94.06/93.46  all VarCurr (-v856(VarCurr)<->v754(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v848(VarNext)<->v849(VarNext)&v751(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v849(VarNext)<->v823(VarNext))).
% 94.06/93.46  -v749_array(constB0,b1111_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1111_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1111_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1111_address_term,bitIndex66).
% 94.06/93.46  -v749_array(constB0,b1110_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1110_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1110_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1110_address_term,bitIndex66).
% 94.06/93.46  b1110(bitIndex3).
% 94.06/93.46  b1110(bitIndex2).
% 94.06/93.46  b1110(bitIndex1).
% 94.06/93.46  -b1110(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1101_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1101_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1101_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1101_address_term,bitIndex66).
% 94.06/93.46  b1101(bitIndex3).
% 94.06/93.46  b1101(bitIndex2).
% 94.06/93.46  -b1101(bitIndex1).
% 94.06/93.46  b1101(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1100_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1100_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1100_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1100_address_term,bitIndex66).
% 94.06/93.46  b1100(bitIndex3).
% 94.06/93.46  b1100(bitIndex2).
% 94.06/93.46  -b1100(bitIndex1).
% 94.06/93.46  -b1100(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1011_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1011_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1011_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1011_address_term,bitIndex66).
% 94.06/93.46  b1011(bitIndex3).
% 94.06/93.46  -b1011(bitIndex2).
% 94.06/93.46  b1011(bitIndex1).
% 94.06/93.46  b1011(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1010_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1010_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1010_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1010_address_term,bitIndex66).
% 94.06/93.46  b1010(bitIndex3).
% 94.06/93.46  -b1010(bitIndex2).
% 94.06/93.46  b1010(bitIndex1).
% 94.06/93.46  -b1010(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1001_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1001_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1001_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1001_address_term,bitIndex66).
% 94.06/93.46  b1001(bitIndex3).
% 94.06/93.46  -b1001(bitIndex2).
% 94.06/93.46  -b1001(bitIndex1).
% 94.06/93.46  b1001(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b1000_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b1000_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b1000_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b1000_address_term,bitIndex66).
% 94.06/93.46  b1000(bitIndex3).
% 94.06/93.46  -b1000(bitIndex2).
% 94.06/93.46  -b1000(bitIndex1).
% 94.06/93.46  -b1000(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0111_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0111_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0111_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0111_address_term,bitIndex66).
% 94.06/93.46  -b0111(bitIndex3).
% 94.06/93.46  b0111(bitIndex2).
% 94.06/93.46  b0111(bitIndex1).
% 94.06/93.46  b0111(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0110_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0110_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0110_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0110_address_term,bitIndex66).
% 94.06/93.46  -v749_array(constB0,b0101_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0101_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0101_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0101_address_term,bitIndex66).
% 94.06/93.46  -b0101(bitIndex3).
% 94.06/93.46  b0101(bitIndex2).
% 94.06/93.46  -b0101(bitIndex1).
% 94.06/93.46  b0101(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0100_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0100_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0100_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0100_address_term,bitIndex66).
% 94.06/93.46  -b0100(bitIndex3).
% 94.06/93.46  b0100(bitIndex2).
% 94.06/93.46  -b0100(bitIndex1).
% 94.06/93.46  -b0100(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0011_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0011_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0011_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0011_address_term,bitIndex66).
% 94.06/93.46  -b0011(bitIndex3).
% 94.06/93.46  -b0011(bitIndex2).
% 94.06/93.46  b0011(bitIndex1).
% 94.06/93.46  b0011(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0010_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0010_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0010_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0010_address_term,bitIndex66).
% 94.06/93.46  -b0010(bitIndex3).
% 94.06/93.46  -b0010(bitIndex2).
% 94.06/93.46  b0010(bitIndex1).
% 94.06/93.46  -b0010(bitIndex0).
% 94.06/93.46  -v749_array(constB0,b0001_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0001_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0001_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0001_address_term,bitIndex66).
% 94.06/93.46  -v749_array(constB0,b0000_address_term,bitIndex63).
% 94.06/93.46  -v749_array(constB0,b0000_address_term,bitIndex64).
% 94.06/93.46  -v749_array(constB0,b0000_address_term,bitIndex65).
% 94.06/93.46  -v749_array(constB0,b0000_address_term,bitIndex66).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v819(VarNext)-> (all B (range_3_0(B)-> (v791(VarNext,B)<->v791(VarCurr,B)))))).
% 94.06/93.46  all VarNext (v819(VarNext)-> (all B (range_3_0(B)-> (v791(VarNext,B)<->v829(VarNext,B))))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v829(VarNext,B)<->v827(VarCurr,B))))).
% 94.06/93.46  all VarCurr (-v830(VarCurr)-> (all B (range_3_0(B)-> (v827(VarCurr,B)<->v793(VarCurr,B))))).
% 94.06/93.46  all VarCurr (v830(VarCurr)-> (all B (range_3_0(B)-> (v827(VarCurr,B)<->$F)))).
% 94.06/93.46  all VarCurr (-v830(VarCurr)<->v754(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v819(VarNext)<->v820(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v820(VarNext)<->v821(VarNext)&v751(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v821(VarNext)<->v823(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v823(VarNext)<->v751(VarCurr))).
% 94.06/93.46  all VarCurr (-v783(VarCurr)-> (all B (range_3_0(B)-> (v793(VarCurr,B)<->v791(VarCurr,B))))).
% 94.06/93.46  all VarCurr (v783(VarCurr)-> (all B (range_3_0(B)-> (v793(VarCurr,B)<->v796(VarCurr,B))))).
% 94.06/93.46  all VarCurr (-v797(VarCurr)-> (all B (range_3_0(B)-> (v796(VarCurr,B)<->v798(VarCurr,B))))).
% 94.06/93.46  all VarCurr (v797(VarCurr)-> (all B (range_3_0(B)-> (v796(VarCurr,B)<->$F)))).
% 94.06/93.46  all VarCurr (v798(VarCurr,bitIndex0)<->v814(VarCurr)).
% 94.06/93.46  all VarCurr (v798(VarCurr,bitIndex1)<->v812(VarCurr)).
% 94.06/93.46  all VarCurr (v798(VarCurr,bitIndex2)<->v807(VarCurr)).
% 94.06/93.46  all VarCurr (v798(VarCurr,bitIndex3)<->v800(VarCurr)).
% 94.06/93.46  all VarCurr (v812(VarCurr)<->v813(VarCurr)&v816(VarCurr)).
% 94.06/93.46  all VarCurr (v816(VarCurr)<->v791(VarCurr,bitIndex0)|v791(VarCurr,bitIndex1)).
% 94.06/93.46  all VarCurr (v813(VarCurr)<->v814(VarCurr)|v815(VarCurr)).
% 94.06/93.46  all VarCurr (-v815(VarCurr)<->v791(VarCurr,bitIndex1)).
% 94.06/93.46  all VarCurr (-v814(VarCurr)<->v791(VarCurr,bitIndex0)).
% 94.06/93.46  all VarCurr (v807(VarCurr)<->v808(VarCurr)&v811(VarCurr)).
% 94.06/93.46  all VarCurr (v811(VarCurr)<->v804(VarCurr)|v791(VarCurr,bitIndex2)).
% 94.06/93.46  all VarCurr (v808(VarCurr)<->v809(VarCurr)|v810(VarCurr)).
% 94.06/93.46  all VarCurr (-v810(VarCurr)<->v791(VarCurr,bitIndex2)).
% 94.06/93.46  all VarCurr (-v809(VarCurr)<->v804(VarCurr)).
% 94.06/93.46  all VarCurr (v800(VarCurr)<->v801(VarCurr)&v806(VarCurr)).
% 94.06/93.46  all VarCurr (v806(VarCurr)<->v803(VarCurr)|v791(VarCurr,bitIndex3)).
% 94.06/93.46  all VarCurr (v801(VarCurr)<->v802(VarCurr)|v805(VarCurr)).
% 94.06/93.46  all VarCurr (-v805(VarCurr)<->v791(VarCurr,bitIndex3)).
% 94.06/93.46  all VarCurr (-v802(VarCurr)<->v803(VarCurr)).
% 94.06/93.46  all VarCurr (v803(VarCurr)<->v804(VarCurr)&v791(VarCurr,bitIndex2)).
% 94.06/93.46  all VarCurr (v804(VarCurr)<->v791(VarCurr,bitIndex0)&v791(VarCurr,bitIndex1)).
% 94.06/93.46  all VarCurr (v797(VarCurr)<-> (v791(VarCurr,bitIndex3)<->$T)& (v791(VarCurr,bitIndex2)<->$T)& (v791(VarCurr,bitIndex1)<->$T)& (v791(VarCurr,bitIndex0)<->$T)).
% 94.06/93.46  b1111(bitIndex3).
% 94.06/93.46  b1111(bitIndex2).
% 94.06/93.46  b1111(bitIndex1).
% 94.06/93.46  b1111(bitIndex0).
% 94.06/93.46  -v791(constB0,bitIndex3).
% 94.06/93.46  -v791(constB0,bitIndex2).
% 94.06/93.46  -v791(constB0,bitIndex1).
% 94.06/93.46  v791(constB0,bitIndex0).
% 94.06/93.46  -b0001(bitIndex3).
% 94.06/93.46  -b0001(bitIndex2).
% 94.06/93.46  -b0001(bitIndex1).
% 94.06/93.46  b0001(bitIndex0).
% 94.06/93.46  all VarCurr (v783(VarCurr)<->v785(VarCurr)).
% 94.06/93.46  all VarCurr (v785(VarCurr)<->v787(VarCurr)).
% 94.06/93.46  all VarCurr (v787(VarCurr)<->v789(VarCurr)).
% 94.06/93.46  all VarCurr B (range_10_0(B)-> (v756(VarCurr,B)<->v776(VarCurr,B))).
% 94.06/93.46  all VarCurr ((v756(VarCurr,bitIndex26)<->v770(VarCurr,bitIndex15))& (v756(VarCurr,bitIndex25)<->v770(VarCurr,bitIndex14))& (v756(VarCurr,bitIndex24)<->v770(VarCurr,bitIndex13))& (v756(VarCurr,bitIndex23)<->v770(VarCurr,bitIndex12))& (v756(VarCurr,bitIndex22)<->v770(VarCurr,bitIndex11))& (v756(VarCurr,bitIndex21)<->v770(VarCurr,bitIndex10))& (v756(VarCurr,bitIndex20)<->v770(VarCurr,bitIndex9))& (v756(VarCurr,bitIndex19)<->v770(VarCurr,bitIndex8))& (v756(VarCurr,bitIndex18)<->v770(VarCurr,bitIndex7))& (v756(VarCurr,bitIndex17)<->v770(VarCurr,bitIndex6))& (v756(VarCurr,bitIndex16)<->v770(VarCurr,bitIndex5))& (v756(VarCurr,bitIndex15)<->v770(VarCurr,bitIndex4))& (v756(VarCurr,bitIndex14)<->v770(VarCurr,bitIndex3))& (v756(VarCurr,bitIndex13)<->v770(VarCurr,bitIndex2))& (v756(VarCurr,bitIndex12)<->v770(VarCurr,bitIndex1))& (v756(VarCurr,bitIndex11)<->v770(VarCurr,bitIndex0))).
% 94.06/93.46  all VarCurr ((v756(VarCurr,bitIndex62)<->v764(VarCurr,bitIndex35))& (v756(VarCurr,bitIndex61)<->v764(VarCurr,bitIndex34))& (v756(VarCurr,bitIndex60)<->v764(VarCurr,bitIndex33))& (v756(VarCurr,bitIndex59)<->v764(VarCurr,bitIndex32))& (v756(VarCurr,bitIndex58)<->v764(VarCurr,bitIndex31))& (v756(VarCurr,bitIndex57)<->v764(VarCurr,bitIndex30))& (v756(VarCurr,bitIndex56)<->v764(VarCurr,bitIndex29))& (v756(VarCurr,bitIndex55)<->v764(VarCurr,bitIndex28))& (v756(VarCurr,bitIndex54)<->v764(VarCurr,bitIndex27))& (v756(VarCurr,bitIndex53)<->v764(VarCurr,bitIndex26))& (v756(VarCurr,bitIndex52)<->v764(VarCurr,bitIndex25))& (v756(VarCurr,bitIndex51)<->v764(VarCurr,bitIndex24))& (v756(VarCurr,bitIndex50)<->v764(VarCurr,bitIndex23))& (v756(VarCurr,bitIndex49)<->v764(VarCurr,bitIndex22))& (v756(VarCurr,bitIndex48)<->v764(VarCurr,bitIndex21))& (v756(VarCurr,bitIndex47)<->v764(VarCurr,bitIndex20))& (v756(VarCurr,bitIndex46)<->v764(VarCurr,bitIndex19))& (v756(VarCurr,bitIndex45)<->v764(VarCurr,bitIndex18))& (v756(VarCurr,bitIndex44)<->v764(VarCurr,bitIndex17))& (v756(VarCurr,bitIndex43)<->v764(VarCurr,bitIndex16))& (v756(VarCurr,bitIndex42)<->v764(VarCurr,bitIndex15))& (v756(VarCurr,bitIndex41)<->v764(VarCurr,bitIndex14))& (v756(VarCurr,bitIndex40)<->v764(VarCurr,bitIndex13))& (v756(VarCurr,bitIndex39)<->v764(VarCurr,bitIndex12))& (v756(VarCurr,bitIndex38)<->v764(VarCurr,bitIndex11))& (v756(VarCurr,bitIndex37)<->v764(VarCurr,bitIndex10))& (v756(VarCurr,bitIndex36)<->v764(VarCurr,bitIndex9))& (v756(VarCurr,bitIndex35)<->v764(VarCurr,bitIndex8))& (v756(VarCurr,bitIndex34)<->v764(VarCurr,bitIndex7))& (v756(VarCurr,bitIndex33)<->v764(VarCurr,bitIndex6))& (v756(VarCurr,bitIndex32)<->v764(VarCurr,bitIndex5))& (v756(VarCurr,bitIndex31)<->v764(VarCurr,bitIndex4))& (v756(VarCurr,bitIndex30)<->v764(VarCurr,bitIndex3))& (v756(VarCurr,bitIndex29)<->v764(VarCurr,bitIndex2))& (v756(VarCurr,bitIndex28)<->v764(VarCurr,bitIndex1))& (v756(VarCurr,bitIndex27)<->v764(VarCurr,bitIndex0))).
% 94.06/93.46  all VarCurr ((v756(VarCurr,bitIndex66)<->v758(VarCurr,bitIndex3))& (v756(VarCurr,bitIndex65)<->v758(VarCurr,bitIndex2))& (v756(VarCurr,bitIndex64)<->v758(VarCurr,bitIndex1))& (v756(VarCurr,bitIndex63)<->v758(VarCurr,bitIndex0))).
% 94.06/93.46  all VarCurr B (range_10_0(B)-> (v776(VarCurr,B)<->v778(VarCurr,B))).
% 94.06/93.46  all VarCurr B (range_10_0(B)-> (v778(VarCurr,B)<->v780(VarCurr,B))).
% 94.06/93.46  all B (range_10_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B).
% 94.06/93.46  all VarCurr B (range_15_0(B)-> (v770(VarCurr,B)<->v772(VarCurr,B))).
% 94.06/93.46  all VarCurr B (range_15_0(B)-> (v772(VarCurr,B)<->v774(VarCurr,B))).
% 94.06/93.46  all B (range_15_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B).
% 94.06/93.46  all VarCurr B (range_35_0(B)-> (v764(VarCurr,B)<->v766(VarCurr,B))).
% 94.06/93.46  all VarCurr B (range_35_0(B)-> (v766(VarCurr,B)<->v768(VarCurr,B))).
% 94.06/93.46  all B (range_35_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B).
% 94.06/93.46  all VarCurr B (range_3_0(B)-> (v758(VarCurr,B)<->v760(VarCurr,B))).
% 94.06/93.46  all VarCurr B (range_3_0(B)-> (v760(VarCurr,B)<->v762(VarCurr,B))).
% 94.06/93.46  all VarCurr (v754(VarCurr)<->v12(VarCurr)).
% 94.06/93.46  all VarCurr (v751(VarCurr)<->v288(VarCurr)).
% 94.06/93.46  all VarCurr (v664(VarCurr)<->v666(VarCurr)).
% 94.06/93.46  all VarCurr (v666(VarCurr)<->v668(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v724(VarNext)-> (v668(VarNext)<->v668(VarCurr)))).
% 94.06/93.46  all VarNext (v724(VarNext)-> (v668(VarNext)<->v734(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v734(VarNext)<->v732(VarCurr))).
% 94.06/93.46  all VarCurr (-v735(VarCurr)-> (v732(VarCurr)<->x697(VarCurr))).
% 94.06/93.46  all VarCurr (v735(VarCurr)-> (v732(VarCurr)<->v678(VarCurr))).
% 94.06/93.46  all VarCurr (v735(VarCurr)<->v736(VarCurr)&v737(VarCurr)).
% 94.06/93.46  all VarCurr (-v737(VarCurr)<->v674(VarCurr)).
% 94.06/93.46  all VarCurr (-v736(VarCurr)<->v670(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v724(VarNext)<->v725(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v725(VarNext)<->v726(VarNext)&v721(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v726(VarNext)<->v728(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v728(VarNext)<->v721(VarCurr))).
% 94.06/93.46  all VarCurr (v721(VarCurr)<->v701(VarCurr)).
% 94.06/93.46  all VarCurr (v678(VarCurr)<->v680(VarCurr)).
% 94.06/93.46  all VarCurr (v680(VarCurr)<->v682(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v704(VarNext)-> (v682(VarNext)<->v682(VarCurr)))).
% 94.06/93.46  all VarNext (v704(VarNext)-> (v682(VarNext)<->v714(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v714(VarNext)<->v712(VarCurr))).
% 94.06/93.46  all VarCurr (-v715(VarCurr)-> (v712(VarCurr)<->x697(VarCurr))).
% 94.06/93.46  all VarCurr (v715(VarCurr)-> (v712(VarCurr)<->v688(VarCurr))).
% 94.06/93.46  all VarCurr (v715(VarCurr)<->v716(VarCurr)&v717(VarCurr)).
% 94.06/93.46  all VarCurr (-v717(VarCurr)<->v686(VarCurr)).
% 94.06/93.46  all VarCurr (-v716(VarCurr)<->v684(VarCurr)).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v704(VarNext)<->v705(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v705(VarNext)<->v706(VarNext)&v699(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v706(VarNext)<->v708(VarNext))).
% 94.06/93.46  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v708(VarNext)<->v699(VarCurr))).
% 94.06/93.46  v682(constB0)<->$F.
% 94.06/93.46  all VarCurr (v699(VarCurr)<->v701(VarCurr)).
% 94.06/93.46  all VarCurr (v701(VarCurr)<->v288(VarCurr)).
% 94.06/93.46  all VarCurr (v688(VarCurr)<->v690(VarCurr)).
% 94.06/93.46  all VarCurr (v690(VarCurr)<->v692(VarCurr)).
% 94.06/93.46  all VarCurr (v692(VarCurr)<->v694(VarCurr)).
% 94.06/93.46  all VarCurr (v694(VarCurr)<->v696(VarCurr)).
% 94.06/93.46  all VarCurr (v686(VarCurr)<->v676(VarCurr)).
% 94.06/93.46  all VarCurr (v684(VarCurr)<->v672(VarCurr)).
% 94.06/93.46  all VarCurr (v674(VarCurr)<->v676(VarCurr)).
% 94.06/93.47  all VarCurr (v676(VarCurr)<->$F).
% 94.06/93.47  all VarCurr (v670(VarCurr)<->v672(VarCurr)).
% 94.06/93.47  all VarCurr (v672(VarCurr)<->$F).
% 94.06/93.47  all VarCurr (v320(VarCurr)<->v322(VarCurr)).
% 94.06/93.47  all VarCurr (v322(VarCurr)<->v324(VarCurr)).
% 94.06/93.47  all VarCurr (v324(VarCurr)<->v326(VarCurr)).
% 94.06/93.47  all VarCurr (v326(VarCurr)<->v328(VarCurr)).
% 94.06/93.47  all VarCurr (v328(VarCurr)<->v330(VarCurr)).
% 94.06/93.47  all VarCurr (v330(VarCurr)<->v332(VarCurr)).
% 94.06/93.47  all VarCurr (-v657(VarCurr)-> (v332(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v657(VarCurr)-> (v332(VarCurr)<->v658(VarCurr))).
% 94.06/93.47  all VarCurr (-v510(VarCurr)-> (v658(VarCurr)<->v661(VarCurr))).
% 94.06/93.47  all VarCurr (v510(VarCurr)-> (v658(VarCurr)<->v659(VarCurr))).
% 94.06/93.47  all VarCurr (-v513(VarCurr)-> (v661(VarCurr)<->v662(VarCurr))).
% 94.06/93.47  all VarCurr (v513(VarCurr)-> (v661(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (-v517(VarCurr)-> (v662(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v517(VarCurr)-> (v662(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (-v509(VarCurr)-> (v659(VarCurr)<->v660(VarCurr))).
% 94.06/93.47  all VarCurr (v509(VarCurr)-> (v659(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (-v539(VarCurr)-> (v660(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v539(VarCurr)-> (v660(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (v657(VarCurr)<->v510(VarCurr)|v514(VarCurr)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v645(VarNext)-> (v334(VarNext,bitIndex0)<->v334(VarCurr,bitIndex0)))).
% 94.06/93.47  all VarNext (v645(VarNext)-> (v334(VarNext,bitIndex0)<->v653(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v653(VarNext)<->v651(VarCurr))).
% 94.06/93.47  all VarCurr (-v531(VarCurr)-> (v651(VarCurr)<->v342(VarCurr,bitIndex0))).
% 94.06/93.47  all VarCurr (v531(VarCurr)-> (v651(VarCurr)<->$T)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v645(VarNext)<->v646(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v646(VarNext)<->v648(VarNext)&v484(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v648(VarNext)<->v524(VarNext))).
% 94.06/93.47  all VarCurr (-v637(VarCurr)-> (v342(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarCurr (v637(VarCurr)-> (v342(VarCurr,bitIndex0)<->v641(VarCurr))).
% 94.06/93.47  all VarCurr (-v638(VarCurr)-> (v641(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (v638(VarCurr)-> (v641(VarCurr)<->v642(VarCurr))).
% 94.06/93.47  all VarCurr (-v539(VarCurr)-> (v642(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (v539(VarCurr)-> (v642(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (v637(VarCurr)<->v638(VarCurr)|v640(VarCurr)).
% 94.06/93.47  all VarCurr (v640(VarCurr)<->v513(VarCurr)&v514(VarCurr)).
% 94.06/93.47  all VarCurr (v638(VarCurr)<->v639(VarCurr)&v510(VarCurr)).
% 94.06/93.47  all VarCurr (-v639(VarCurr)<->v509(VarCurr)).
% 94.06/93.47  all VarCurr (v344(VarCurr)<->v346(VarCurr)).
% 94.06/93.47  all VarCurr (v346(VarCurr)<-> (v348(VarCurr,bitIndex4)<->$F)& (v348(VarCurr,bitIndex3)<->$F)& (v348(VarCurr,bitIndex2)<->$F)& (v348(VarCurr,bitIndex1)<->$F)& (v348(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v621(VarNext)-> (all B (range_4_0(B)-> (v348(VarNext,B)<->v348(VarCurr,B)))))).
% 94.06/93.47  all VarNext (v621(VarNext)-> (all B (range_4_0(B)-> (v348(VarNext,B)<->v631(VarNext,B))))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v631(VarNext,B)<->v629(VarCurr,B))))).
% 94.06/93.47  all VarCurr (-v632(VarCurr)-> (all B (range_4_0(B)-> (v629(VarCurr,B)<->v352(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v632(VarCurr)-> (all B (range_4_0(B)-> (v629(VarCurr,B)<->$F)))).
% 94.06/93.47  all VarCurr (-v632(VarCurr)<->v350(VarCurr)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v621(VarNext)<->v622(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v622(VarNext)<->v623(VarNext)&v618(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v623(VarNext)<->v625(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v625(VarNext)<->v618(VarCurr))).
% 94.06/93.47  all VarCurr (v618(VarCurr)<->v484(VarCurr)).
% 94.06/93.47  all VarCurr (-v543(VarCurr)& -v545(VarCurr)& -v586(VarCurr)-> (all B (range_4_0(B)-> (v352(VarCurr,B)<->v348(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v586(VarCurr)-> (all B (range_4_0(B)-> (v352(VarCurr,B)<->v588(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v545(VarCurr)-> (all B (range_4_0(B)-> (v352(VarCurr,B)<->v547(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v543(VarCurr)-> (all B (range_4_0(B)-> (v352(VarCurr,B)<->v348(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v615(VarCurr)<-> (v616(VarCurr,bitIndex1)<->$T)& (v616(VarCurr,bitIndex0)<->$T)).
% 94.06/93.47  all VarCurr (v616(VarCurr,bitIndex0)<->v377(VarCurr)).
% 94.06/93.47  all VarCurr (v616(VarCurr,bitIndex1)<->v354(VarCurr)).
% 94.06/93.47  all VarCurr (-v589(VarCurr)-> (all B (range_4_0(B)-> (v588(VarCurr,B)<->v590(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v589(VarCurr)-> (all B (range_4_0(B)-> (v588(VarCurr,B)<->b01111(B))))).
% 94.06/93.47  all VarCurr (v590(VarCurr,bitIndex0)<->v612(VarCurr)).
% 94.06/93.47  all VarCurr (v590(VarCurr,bitIndex1)<->v610(VarCurr)).
% 94.06/93.47  all VarCurr (v590(VarCurr,bitIndex2)<->v605(VarCurr)).
% 94.06/93.47  all VarCurr (v590(VarCurr,bitIndex3)<->v600(VarCurr)).
% 94.06/93.47  all VarCurr (v590(VarCurr,bitIndex4)<->v592(VarCurr)).
% 94.06/93.47  all VarCurr (v610(VarCurr)<->v611(VarCurr)&v614(VarCurr)).
% 94.06/93.47  all VarCurr (v614(VarCurr)<->v348(VarCurr,bitIndex0)|v348(VarCurr,bitIndex1)).
% 94.06/93.47  all VarCurr (v611(VarCurr)<->v612(VarCurr)|v613(VarCurr)).
% 94.06/93.47  all VarCurr (-v613(VarCurr)<->v348(VarCurr,bitIndex1)).
% 94.06/93.47  all VarCurr (-v612(VarCurr)<->v348(VarCurr,bitIndex0)).
% 94.06/93.47  all VarCurr (v605(VarCurr)<->v606(VarCurr)&v609(VarCurr)).
% 94.06/93.47  all VarCurr (v609(VarCurr)<->v597(VarCurr)|v348(VarCurr,bitIndex2)).
% 94.06/93.47  all VarCurr (v606(VarCurr)<->v607(VarCurr)|v608(VarCurr)).
% 94.06/93.47  all VarCurr (-v608(VarCurr)<->v348(VarCurr,bitIndex2)).
% 94.06/93.47  all VarCurr (-v607(VarCurr)<->v597(VarCurr)).
% 94.06/93.47  all VarCurr (v600(VarCurr)<->v601(VarCurr)&v604(VarCurr)).
% 94.06/93.47  all VarCurr (v604(VarCurr)<->v596(VarCurr)|v348(VarCurr,bitIndex3)).
% 94.06/93.47  all VarCurr (v601(VarCurr)<->v602(VarCurr)|v603(VarCurr)).
% 94.06/93.47  all VarCurr (-v603(VarCurr)<->v348(VarCurr,bitIndex3)).
% 94.06/93.47  all VarCurr (-v602(VarCurr)<->v596(VarCurr)).
% 94.06/93.47  all VarCurr (v592(VarCurr)<->v593(VarCurr)&v599(VarCurr)).
% 94.06/93.47  all VarCurr (v599(VarCurr)<->v595(VarCurr)|v348(VarCurr,bitIndex4)).
% 94.06/93.47  all VarCurr (v593(VarCurr)<->v594(VarCurr)|v598(VarCurr)).
% 94.06/93.47  all VarCurr (-v598(VarCurr)<->v348(VarCurr,bitIndex4)).
% 94.06/93.47  all VarCurr (-v594(VarCurr)<->v595(VarCurr)).
% 94.06/93.47  all VarCurr (v595(VarCurr)<->v596(VarCurr)&v348(VarCurr,bitIndex3)).
% 94.06/93.47  all VarCurr (v596(VarCurr)<->v597(VarCurr)&v348(VarCurr,bitIndex2)).
% 94.06/93.47  all VarCurr (v597(VarCurr)<->v348(VarCurr,bitIndex0)&v348(VarCurr,bitIndex1)).
% 94.06/93.47  all VarCurr (v589(VarCurr)<-> (v348(VarCurr,bitIndex4)<->$F)& (v348(VarCurr,bitIndex3)<->$T)& (v348(VarCurr,bitIndex2)<->$T)& (v348(VarCurr,bitIndex1)<->$T)& (v348(VarCurr,bitIndex0)<->$T)).
% 94.06/93.47  all VarCurr (v586(VarCurr)<-> (v587(VarCurr,bitIndex1)<->$T)& (v587(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarCurr (v587(VarCurr,bitIndex0)<->v377(VarCurr)).
% 94.06/93.47  all VarCurr (v587(VarCurr,bitIndex1)<->v354(VarCurr)).
% 94.06/93.47  all VarCurr (-v548(VarCurr)-> (all B (range_31_0(B)-> (v547(VarCurr,B)<->v549(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v548(VarCurr)-> (all B (range_31_0(B)-> (v547(VarCurr,B)<->$F)))).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex6)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex7)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex8)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex9)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex10)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex11)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex12)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex13)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex14)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex15)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex16)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex17)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex18)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex19)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex20)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex21)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex22)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex23)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex24)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex25)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex26)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex27)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex28)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex29)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex30)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v549(VarCurr,bitIndex31)<->v550(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr B (range_5_0(B)-> (v549(VarCurr,B)<->v550(VarCurr,B))).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex0)<->v584(VarCurr)).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex1)<->v582(VarCurr)).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex2)<->v578(VarCurr)).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex3)<->v574(VarCurr)).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex4)<->v570(VarCurr)).
% 94.06/93.47  all VarCurr (v550(VarCurr,bitIndex5)<->v552(VarCurr)).
% 94.06/93.47  all VarCurr (v582(VarCurr)<->v583(VarCurr)&v585(VarCurr)).
% 94.06/93.47  all VarCurr (v585(VarCurr)<->v556(VarCurr,bitIndex0)|v564(VarCurr)).
% 94.06/93.47  all VarCurr (v583(VarCurr)<->v584(VarCurr)|v556(VarCurr,bitIndex1)).
% 94.06/93.47  all VarCurr (-v584(VarCurr)<->v556(VarCurr,bitIndex0)).
% 94.06/93.47  all VarCurr (v578(VarCurr)<->v579(VarCurr)&v581(VarCurr)).
% 94.06/93.47  all VarCurr (v581(VarCurr)<->v562(VarCurr)|v565(VarCurr)).
% 94.06/93.47  all VarCurr (v579(VarCurr)<->v580(VarCurr)|v556(VarCurr,bitIndex2)).
% 94.06/93.47  all VarCurr (-v580(VarCurr)<->v562(VarCurr)).
% 94.06/93.47  all VarCurr (v574(VarCurr)<->v575(VarCurr)&v577(VarCurr)).
% 94.06/93.47  all VarCurr (v577(VarCurr)<->v560(VarCurr)|v566(VarCurr)).
% 94.06/93.47  all VarCurr (v575(VarCurr)<->v576(VarCurr)|v556(VarCurr,bitIndex3)).
% 94.06/93.47  all VarCurr (-v576(VarCurr)<->v560(VarCurr)).
% 94.06/93.47  all VarCurr (v570(VarCurr)<->v571(VarCurr)&v573(VarCurr)).
% 94.06/93.47  all VarCurr (v573(VarCurr)<->v558(VarCurr)|v567(VarCurr)).
% 94.06/93.47  all VarCurr (v571(VarCurr)<->v572(VarCurr)|v556(VarCurr,bitIndex4)).
% 94.06/93.47  all VarCurr (-v572(VarCurr)<->v558(VarCurr)).
% 94.06/93.47  all VarCurr (v552(VarCurr)<->v553(VarCurr)&v568(VarCurr)).
% 94.06/93.47  all VarCurr (v568(VarCurr)<->v555(VarCurr)|v569(VarCurr)).
% 94.06/93.47  all VarCurr (-v569(VarCurr)<->v556(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (v553(VarCurr)<->v554(VarCurr)|v556(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr (-v554(VarCurr)<->v555(VarCurr)).
% 94.06/93.47  all VarCurr (v555(VarCurr)<->v556(VarCurr,bitIndex4)|v557(VarCurr)).
% 94.06/93.47  all VarCurr (v557(VarCurr)<->v558(VarCurr)&v567(VarCurr)).
% 94.06/93.47  all VarCurr (-v567(VarCurr)<->v556(VarCurr,bitIndex4)).
% 94.06/93.47  all VarCurr (v558(VarCurr)<->v556(VarCurr,bitIndex3)|v559(VarCurr)).
% 94.06/93.47  all VarCurr (v559(VarCurr)<->v560(VarCurr)&v566(VarCurr)).
% 94.06/93.47  all VarCurr (-v566(VarCurr)<->v556(VarCurr,bitIndex3)).
% 94.06/93.47  all VarCurr (v560(VarCurr)<->v556(VarCurr,bitIndex2)|v561(VarCurr)).
% 94.06/93.47  all VarCurr (v561(VarCurr)<->v562(VarCurr)&v565(VarCurr)).
% 94.06/93.47  all VarCurr (-v565(VarCurr)<->v556(VarCurr,bitIndex2)).
% 94.06/93.47  all VarCurr (v562(VarCurr)<->v556(VarCurr,bitIndex1)|v563(VarCurr)).
% 94.06/93.47  all VarCurr (v563(VarCurr)<->v556(VarCurr,bitIndex0)&v564(VarCurr)).
% 94.06/93.47  all VarCurr (-v564(VarCurr)<->v556(VarCurr,bitIndex1)).
% 94.06/93.47  all VarCurr (-v556(VarCurr,bitIndex5)).
% 94.06/93.47  all VarCurr B (range_4_0(B)-> (v556(VarCurr,B)<->v348(VarCurr,B))).
% 94.06/93.47  all VarCurr (v548(VarCurr)<-> (v348(VarCurr,bitIndex4)<->$F)& (v348(VarCurr,bitIndex3)<->$F)& (v348(VarCurr,bitIndex2)<->$F)& (v348(VarCurr,bitIndex1)<->$F)& (v348(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarCurr (v545(VarCurr)<-> (v546(VarCurr,bitIndex1)<->$F)& (v546(VarCurr,bitIndex0)<->$T)).
% 94.06/93.47  all VarCurr (v546(VarCurr,bitIndex0)<->v377(VarCurr)).
% 94.06/93.47  all VarCurr (v546(VarCurr,bitIndex1)<->v354(VarCurr)).
% 94.06/93.47  all B (range_4_0(B)-> (v348(constB0,B)<->$F)).
% 94.06/93.47  all VarCurr (v543(VarCurr)<-> (v544(VarCurr,bitIndex1)<->$F)& (v544(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarCurr (v544(VarCurr,bitIndex0)<->v377(VarCurr)).
% 94.06/93.47  all VarCurr (v544(VarCurr,bitIndex1)<->v354(VarCurr)).
% 94.06/93.47  all VarCurr (v377(VarCurr)<->v379(VarCurr)).
% 94.06/93.47  all VarCurr (-v535(VarCurr)-> (v379(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v535(VarCurr)-> (v379(VarCurr)<->v536(VarCurr))).
% 94.06/93.47  all VarCurr (-v510(VarCurr)-> (v536(VarCurr)<->v540(VarCurr))).
% 94.06/93.47  all VarCurr (v510(VarCurr)-> (v536(VarCurr)<->v537(VarCurr))).
% 94.06/93.47  all VarCurr (-v513(VarCurr)-> (v540(VarCurr)<->v541(VarCurr))).
% 94.06/93.47  all VarCurr (v513(VarCurr)-> (v540(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (-v517(VarCurr)-> (v541(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v517(VarCurr)-> (v541(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (-v509(VarCurr)-> (v537(VarCurr)<->v538(VarCurr))).
% 94.06/93.47  all VarCurr (v509(VarCurr)-> (v537(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (-v539(VarCurr)-> (v538(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (v539(VarCurr)-> (v538(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (-v539(VarCurr)<->v381(VarCurr)).
% 94.06/93.47  all VarCurr (v535(VarCurr)<->v510(VarCurr)|v514(VarCurr)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v520(VarNext)-> (v334(VarNext,bitIndex1)<->v334(VarCurr,bitIndex1)))).
% 94.06/93.47  all VarNext (v520(VarNext)-> (v334(VarNext,bitIndex1)<->v530(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v530(VarNext)<->v528(VarCurr))).
% 94.06/93.47  all VarCurr (-v531(VarCurr)-> (v528(VarCurr)<->v342(VarCurr,bitIndex1))).
% 94.06/93.47  all VarCurr (v531(VarCurr)-> (v528(VarCurr)<->$F)).
% 94.06/93.47  all VarCurr (-v531(VarCurr)<->v336(VarCurr)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v520(VarNext)<->v521(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v521(VarNext)<->v522(VarNext)&v484(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v522(VarNext)<->v524(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v524(VarNext)<->v484(VarCurr))).
% 94.06/93.47  all VarCurr (-v507(VarCurr)-> (v342(VarCurr,bitIndex1)<->$F)).
% 94.06/93.47  all VarCurr (v507(VarCurr)-> (v342(VarCurr,bitIndex1)<->v515(VarCurr))).
% 94.06/93.47  all VarCurr (-v508(VarCurr)-> (v515(VarCurr)<->v516(VarCurr))).
% 94.06/93.47  all VarCurr (v508(VarCurr)-> (v515(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (-v517(VarCurr)-> (v516(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (v517(VarCurr)-> (v516(VarCurr)<->$T)).
% 94.06/93.47  all VarCurr (-v517(VarCurr)<->v344(VarCurr)).
% 94.06/93.47  all VarCurr (v507(VarCurr)<->v508(VarCurr)|v511(VarCurr)).
% 94.06/93.47  all VarCurr (v511(VarCurr)<->v512(VarCurr)&v514(VarCurr)).
% 94.06/93.47  all VarCurr (v514(VarCurr)<-> ($T<->v334(VarCurr,bitIndex1))).
% 94.06/93.47  all VarCurr (-v512(VarCurr)<->v513(VarCurr)).
% 94.06/93.47  all VarCurr (-v513(VarCurr)<->v381(VarCurr)).
% 94.06/93.47  all VarCurr (v508(VarCurr)<->v509(VarCurr)&v510(VarCurr)).
% 94.06/93.47  all VarCurr (v510(VarCurr)<-> ($T<->v334(VarCurr,bitIndex0))).
% 94.06/93.47  v334(constB0,bitIndex1)<->$F.
% 94.06/93.47  v334(constB0,bitIndex0)<->$T.
% 94.06/93.47  all VarCurr (-v509(VarCurr)<->v344(VarCurr)).
% 94.06/93.47  all VarCurr (v381(VarCurr)<->v383(VarCurr)).
% 94.06/93.47  all VarCurr (v383(VarCurr)<-> (v385(VarCurr,bitIndex4)<->$F)& (v385(VarCurr,bitIndex3)<->$F)& (v385(VarCurr,bitIndex2)<->$F)& (v385(VarCurr,bitIndex1)<->$F)& (v385(VarCurr,bitIndex0)<->$F)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v491(VarNext)-> (all B (range_4_0(B)-> (v385(VarNext,B)<->v385(VarCurr,B)))))).
% 94.06/93.47  all VarNext (v491(VarNext)-> (all B (range_4_0(B)-> (v385(VarNext,B)<->v501(VarNext,B))))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_4_0(B)-> (v501(VarNext,B)<->v499(VarCurr,B))))).
% 94.06/93.47  all VarCurr (-v502(VarCurr)-> (all B (range_4_0(B)-> (v499(VarCurr,B)<->v389(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v502(VarCurr)-> (all B (range_4_0(B)-> (v499(VarCurr,B)<->$F)))).
% 94.06/93.47  all VarCurr (-v502(VarCurr)<->v387(VarCurr)).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v491(VarNext)<->v492(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v492(VarNext)<->v493(VarNext)&v482(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v493(VarNext)<->v495(VarNext))).
% 94.06/93.47  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v495(VarNext)<->v482(VarCurr))).
% 94.06/93.47  all VarCurr (v482(VarCurr)<->v484(VarCurr)).
% 94.06/93.47  all VarCurr (v484(VarCurr)<->v486(VarCurr)).
% 94.06/93.47  all VarCurr (v486(VarCurr)<->v488(VarCurr)).
% 94.06/93.47  all VarCurr (v488(VarCurr)<->v1(VarCurr)).
% 94.06/93.47  all VarCurr (-v407(VarCurr)& -v409(VarCurr)& -v450(VarCurr)-> (all B (range_4_0(B)-> (v389(VarCurr,B)<->v385(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v450(VarCurr)-> (all B (range_4_0(B)-> (v389(VarCurr,B)<->v452(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v409(VarCurr)-> (all B (range_4_0(B)-> (v389(VarCurr,B)<->v411(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v407(VarCurr)-> (all B (range_4_0(B)-> (v389(VarCurr,B)<->v385(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v479(VarCurr)<-> (v480(VarCurr,bitIndex1)<->$T)& (v480(VarCurr,bitIndex0)<->$T)).
% 94.06/93.47  all VarCurr (v480(VarCurr,bitIndex0)<->v403(VarCurr)).
% 94.06/93.47  all VarCurr (v480(VarCurr,bitIndex1)<->v391(VarCurr)).
% 94.06/93.47  all VarCurr (-v453(VarCurr)-> (all B (range_4_0(B)-> (v452(VarCurr,B)<->v454(VarCurr,B))))).
% 94.06/93.47  all VarCurr (v453(VarCurr)-> (all B (range_4_0(B)-> (v452(VarCurr,B)<->b01111(B))))).
% 94.06/93.47  all VarCurr (v454(VarCurr,bitIndex0)<->v476(VarCurr)).
% 94.06/93.47  all VarCurr (v454(VarCurr,bitIndex1)<->v474(VarCurr)).
% 94.06/93.47  all VarCurr (v454(VarCurr,bitIndex2)<->v469(VarCurr)).
% 94.06/93.47  all VarCurr (v454(VarCurr,bitIndex3)<->v464(VarCurr)).
% 94.06/93.47  all VarCurr (v454(VarCurr,bitIndex4)<->v456(VarCurr)).
% 94.06/93.47  all VarCurr (v474(VarCurr)<->v475(VarCurr)&v478(VarCurr)).
% 94.06/93.48  all VarCurr (v478(VarCurr)<->v385(VarCurr,bitIndex0)|v385(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (v475(VarCurr)<->v476(VarCurr)|v477(VarCurr)).
% 94.06/93.48  all VarCurr (-v477(VarCurr)<->v385(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (-v476(VarCurr)<->v385(VarCurr,bitIndex0)).
% 94.06/93.48  all VarCurr (v469(VarCurr)<->v470(VarCurr)&v473(VarCurr)).
% 94.06/93.48  all VarCurr (v473(VarCurr)<->v461(VarCurr)|v385(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (v470(VarCurr)<->v471(VarCurr)|v472(VarCurr)).
% 94.06/93.48  all VarCurr (-v472(VarCurr)<->v385(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (-v471(VarCurr)<->v461(VarCurr)).
% 94.06/93.48  all VarCurr (v464(VarCurr)<->v465(VarCurr)&v468(VarCurr)).
% 94.06/93.48  all VarCurr (v468(VarCurr)<->v460(VarCurr)|v385(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (v465(VarCurr)<->v466(VarCurr)|v467(VarCurr)).
% 94.06/93.48  all VarCurr (-v467(VarCurr)<->v385(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (-v466(VarCurr)<->v460(VarCurr)).
% 94.06/93.48  all VarCurr (v456(VarCurr)<->v457(VarCurr)&v463(VarCurr)).
% 94.06/93.48  all VarCurr (v463(VarCurr)<->v459(VarCurr)|v385(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v457(VarCurr)<->v458(VarCurr)|v462(VarCurr)).
% 94.06/93.48  all VarCurr (-v462(VarCurr)<->v385(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (-v458(VarCurr)<->v459(VarCurr)).
% 94.06/93.48  all VarCurr (v459(VarCurr)<->v460(VarCurr)&v385(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (v460(VarCurr)<->v461(VarCurr)&v385(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (v461(VarCurr)<->v385(VarCurr,bitIndex0)&v385(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (v453(VarCurr)<-> (v385(VarCurr,bitIndex4)<->$F)& (v385(VarCurr,bitIndex3)<->$T)& (v385(VarCurr,bitIndex2)<->$T)& (v385(VarCurr,bitIndex1)<->$T)& (v385(VarCurr,bitIndex0)<->$T)).
% 94.06/93.48  -b01111(bitIndex4).
% 94.06/93.48  b01111(bitIndex3).
% 94.06/93.48  b01111(bitIndex2).
% 94.06/93.48  b01111(bitIndex1).
% 94.06/93.48  b01111(bitIndex0).
% 94.06/93.48  all VarCurr (v450(VarCurr)<-> (v451(VarCurr,bitIndex1)<->$T)& (v451(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  all VarCurr (v451(VarCurr,bitIndex0)<->v403(VarCurr)).
% 94.06/93.48  all VarCurr (v451(VarCurr,bitIndex1)<->v391(VarCurr)).
% 94.06/93.48  all VarCurr (-v412(VarCurr)-> (all B (range_31_0(B)-> (v411(VarCurr,B)<->v413(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v412(VarCurr)-> (all B (range_31_0(B)-> (v411(VarCurr,B)<->$F)))).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex6)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex7)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex8)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex9)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex10)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex11)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex12)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex13)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex14)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex15)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex16)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex17)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex18)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex19)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex20)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex21)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex22)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex23)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex24)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex25)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex26)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex27)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex28)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex29)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex30)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v413(VarCurr,bitIndex31)<->v414(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr B (range_5_0(B)-> (v413(VarCurr,B)<->v414(VarCurr,B))).
% 94.06/93.48  all B (range_5_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex0)<->v448(VarCurr)).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex1)<->v446(VarCurr)).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex2)<->v442(VarCurr)).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex3)<->v438(VarCurr)).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex4)<->v434(VarCurr)).
% 94.06/93.48  all VarCurr (v414(VarCurr,bitIndex5)<->v416(VarCurr)).
% 94.06/93.48  all VarCurr (v446(VarCurr)<->v447(VarCurr)&v449(VarCurr)).
% 94.06/93.48  all VarCurr (v449(VarCurr)<->v420(VarCurr,bitIndex0)|v428(VarCurr)).
% 94.06/93.48  all VarCurr (v447(VarCurr)<->v448(VarCurr)|v420(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (-v448(VarCurr)<->v420(VarCurr,bitIndex0)).
% 94.06/93.48  all VarCurr (v442(VarCurr)<->v443(VarCurr)&v445(VarCurr)).
% 94.06/93.48  all VarCurr (v445(VarCurr)<->v426(VarCurr)|v429(VarCurr)).
% 94.06/93.48  all VarCurr (v443(VarCurr)<->v444(VarCurr)|v420(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (-v444(VarCurr)<->v426(VarCurr)).
% 94.06/93.48  all VarCurr (v438(VarCurr)<->v439(VarCurr)&v441(VarCurr)).
% 94.06/93.48  all VarCurr (v441(VarCurr)<->v424(VarCurr)|v430(VarCurr)).
% 94.06/93.48  all VarCurr (v439(VarCurr)<->v440(VarCurr)|v420(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (-v440(VarCurr)<->v424(VarCurr)).
% 94.06/93.48  all VarCurr (v434(VarCurr)<->v435(VarCurr)&v437(VarCurr)).
% 94.06/93.48  all VarCurr (v437(VarCurr)<->v422(VarCurr)|v431(VarCurr)).
% 94.06/93.48  all VarCurr (v435(VarCurr)<->v436(VarCurr)|v420(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (-v436(VarCurr)<->v422(VarCurr)).
% 94.06/93.48  all VarCurr (v416(VarCurr)<->v417(VarCurr)&v432(VarCurr)).
% 94.06/93.48  all VarCurr (v432(VarCurr)<->v419(VarCurr)|v433(VarCurr)).
% 94.06/93.48  all VarCurr (-v433(VarCurr)<->v420(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (v417(VarCurr)<->v418(VarCurr)|v420(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr (-v418(VarCurr)<->v419(VarCurr)).
% 94.06/93.48  all VarCurr (v419(VarCurr)<->v420(VarCurr,bitIndex4)|v421(VarCurr)).
% 94.06/93.48  all VarCurr (v421(VarCurr)<->v422(VarCurr)&v431(VarCurr)).
% 94.06/93.48  all VarCurr (-v431(VarCurr)<->v420(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v422(VarCurr)<->v420(VarCurr,bitIndex3)|v423(VarCurr)).
% 94.06/93.48  all VarCurr (v423(VarCurr)<->v424(VarCurr)&v430(VarCurr)).
% 94.06/93.48  all VarCurr (-v430(VarCurr)<->v420(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (v424(VarCurr)<->v420(VarCurr,bitIndex2)|v425(VarCurr)).
% 94.06/93.48  all VarCurr (v425(VarCurr)<->v426(VarCurr)&v429(VarCurr)).
% 94.06/93.48  all VarCurr (-v429(VarCurr)<->v420(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (v426(VarCurr)<->v420(VarCurr,bitIndex1)|v427(VarCurr)).
% 94.06/93.48  all VarCurr (v427(VarCurr)<->v420(VarCurr,bitIndex0)&v428(VarCurr)).
% 94.06/93.48  all VarCurr (-v428(VarCurr)<->v420(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (-v420(VarCurr,bitIndex5)).
% 94.06/93.48  all VarCurr B (range_4_0(B)-> (v420(VarCurr,B)<->v385(VarCurr,B))).
% 94.06/93.48  all VarCurr (v412(VarCurr)<-> (v385(VarCurr,bitIndex4)<->$F)& (v385(VarCurr,bitIndex3)<->$F)& (v385(VarCurr,bitIndex2)<->$F)& (v385(VarCurr,bitIndex1)<->$F)& (v385(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  all VarCurr (v409(VarCurr)<-> (v410(VarCurr,bitIndex1)<->$F)& (v410(VarCurr,bitIndex0)<->$T)).
% 94.06/93.48  all VarCurr (v410(VarCurr,bitIndex0)<->v403(VarCurr)).
% 94.06/93.48  all VarCurr (v410(VarCurr,bitIndex1)<->v391(VarCurr)).
% 94.06/93.48  all B (range_4_0(B)-> (v385(constB0,B)<->$F)).
% 94.06/93.48  all VarCurr (v407(VarCurr)<-> (v408(VarCurr,bitIndex1)<->$F)& (v408(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  all VarCurr (v408(VarCurr,bitIndex0)<->v403(VarCurr)).
% 94.06/93.48  all VarCurr (v408(VarCurr,bitIndex1)<->v391(VarCurr)).
% 94.06/93.48  all VarCurr (v403(VarCurr)<->v332(VarCurr)).
% 94.06/93.48  all VarCurr (v391(VarCurr)<->v393(VarCurr)).
% 94.06/93.48  all VarCurr (v393(VarCurr)<->v395(VarCurr)).
% 94.06/93.48  all VarCurr (v395(VarCurr)<->v397(VarCurr)).
% 94.06/93.48  all VarCurr (v397(VarCurr)<->v399(VarCurr)).
% 94.06/93.48  all VarCurr (v399(VarCurr)<->v401(VarCurr)).
% 94.06/93.48  all VarCurr (v387(VarCurr)<->v336(VarCurr)).
% 94.06/93.48  all VarCurr (v354(VarCurr)<->v356(VarCurr)).
% 94.06/93.48  all VarCurr (-v374(VarCurr)-> (v356(VarCurr)<->$F)).
% 94.06/93.48  all VarCurr (v374(VarCurr)-> (v356(VarCurr)<->$T)).
% 94.06/93.48  all VarCurr (v374(VarCurr)<->v375(VarCurr)&v366(VarCurr)).
% 94.06/93.48  all VarCurr (-v375(VarCurr)<->v358(VarCurr,bitIndex8)).
% 94.06/93.48  all VarCurr (v366(VarCurr)<->v368(VarCurr)).
% 94.06/93.48  all VarCurr (v368(VarCurr)<->v370(VarCurr)).
% 94.06/93.48  all VarCurr (v370(VarCurr)<->v372(VarCurr)).
% 94.06/93.48  all VarCurr (v358(VarCurr,bitIndex8)<->v360(VarCurr,bitIndex8)).
% 94.06/93.48  all VarCurr (v360(VarCurr,bitIndex8)<->v362(VarCurr,bitIndex8)).
% 94.06/93.48  all VarCurr (v362(VarCurr,bitIndex8)<->v364(VarCurr,bitIndex8)).
% 94.06/93.48  all VarCurr (v350(VarCurr)<->v336(VarCurr)).
% 94.06/93.48  all VarCurr (v336(VarCurr)<->v338(VarCurr)).
% 94.06/93.48  all VarCurr (v338(VarCurr)<->v340(VarCurr)).
% 94.06/93.48  all VarCurr (v340(VarCurr)<->v16(VarCurr)).
% 94.06/93.48  all VarCurr (v99(VarCurr)<->v101(VarCurr)).
% 94.06/93.48  all VarCurr (-v101(VarCurr)<->v103(VarCurr)).
% 94.06/93.48  all VarCurr (v103(VarCurr)<->v105(VarCurr)).
% 94.06/93.48  all VarCurr (v105(VarCurr)<->v107(VarCurr)).
% 94.06/93.48  all VarCurr (v107(VarCurr)<-> (v109(VarCurr,bitIndex3)<->$F)& (v109(VarCurr,bitIndex2)<->$F)& (v109(VarCurr,bitIndex1)<->$F)& (v109(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v291(VarNext)-> (all B (range_3_0(B)-> (v109(VarNext,B)<->v109(VarCurr,B)))))).
% 94.06/93.48  all VarNext (v291(VarNext)-> (all B (range_3_0(B)-> (v109(VarNext,B)<->v301(VarNext,B))))).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (all B (range_3_0(B)-> (v301(VarNext,B)<->v299(VarCurr,B))))).
% 94.06/93.48  all VarCurr (-v302(VarCurr)-> (all B (range_3_0(B)-> (v299(VarCurr,B)<->v111(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v302(VarCurr)-> (all B (range_3_0(B)-> (v299(VarCurr,B)<->$F)))).
% 94.06/93.48  all VarCurr (-v302(VarCurr)<->v10(VarCurr)).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v291(VarNext)<->v292(VarNext))).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v292(VarNext)<->v293(VarNext)&v286(VarNext))).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v293(VarNext)<->v295(VarNext))).
% 94.06/93.48  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v295(VarNext)<->v286(VarCurr))).
% 94.06/93.48  all VarCurr (v286(VarCurr)<->v288(VarCurr)).
% 94.06/93.48  all VarCurr (v288(VarCurr)<->v197(VarCurr)).
% 94.06/93.48  all VarCurr (-v223(VarCurr)& -v225(VarCurr)& -v260(VarCurr)-> (all B (range_3_0(B)-> (v111(VarCurr,B)<->v109(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v260(VarCurr)-> (all B (range_3_0(B)-> (v111(VarCurr,B)<->v262(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v225(VarCurr)-> (all B (range_3_0(B)-> (v111(VarCurr,B)<->v227(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v223(VarCurr)-> (all B (range_3_0(B)-> (v111(VarCurr,B)<->v109(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v283(VarCurr)<-> (v284(VarCurr,bitIndex1)<->$T)& (v284(VarCurr,bitIndex0)<->$T)).
% 94.06/93.48  b11(bitIndex1).
% 94.06/93.48  b11(bitIndex0).
% 94.06/93.48  all VarCurr (v284(VarCurr,bitIndex0)<->v23(VarCurr)).
% 94.06/93.48  all VarCurr (v284(VarCurr,bitIndex1)<->v113(VarCurr)).
% 94.06/93.48  all VarCurr (-v263(VarCurr)-> (all B (range_3_0(B)-> (v262(VarCurr,B)<->v264(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v263(VarCurr)-> (all B (range_3_0(B)-> (v262(VarCurr,B)<->b0110(B))))).
% 94.06/93.48  all VarCurr (v264(VarCurr,bitIndex0)<->v280(VarCurr)).
% 94.06/93.48  all VarCurr (v264(VarCurr,bitIndex1)<->v278(VarCurr)).
% 94.06/93.48  all VarCurr (v264(VarCurr,bitIndex2)<->v273(VarCurr)).
% 94.06/93.48  all VarCurr (v264(VarCurr,bitIndex3)<->v266(VarCurr)).
% 94.06/93.48  all VarCurr (v278(VarCurr)<->v279(VarCurr)&v282(VarCurr)).
% 94.06/93.48  all VarCurr (v282(VarCurr)<->v109(VarCurr,bitIndex0)|v109(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (v279(VarCurr)<->v280(VarCurr)|v281(VarCurr)).
% 94.06/93.48  all VarCurr (-v281(VarCurr)<->v109(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (-v280(VarCurr)<->v109(VarCurr,bitIndex0)).
% 94.06/93.48  all VarCurr (v273(VarCurr)<->v274(VarCurr)&v277(VarCurr)).
% 94.06/93.48  all VarCurr (v277(VarCurr)<->v270(VarCurr)|v109(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (v274(VarCurr)<->v275(VarCurr)|v276(VarCurr)).
% 94.06/93.48  all VarCurr (-v276(VarCurr)<->v109(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (-v275(VarCurr)<->v270(VarCurr)).
% 94.06/93.48  all VarCurr (v266(VarCurr)<->v267(VarCurr)&v272(VarCurr)).
% 94.06/93.48  all VarCurr (v272(VarCurr)<->v269(VarCurr)|v109(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (v267(VarCurr)<->v268(VarCurr)|v271(VarCurr)).
% 94.06/93.48  all VarCurr (-v271(VarCurr)<->v109(VarCurr,bitIndex3)).
% 94.06/93.48  all VarCurr (-v268(VarCurr)<->v269(VarCurr)).
% 94.06/93.48  all VarCurr (v269(VarCurr)<->v270(VarCurr)&v109(VarCurr,bitIndex2)).
% 94.06/93.48  all VarCurr (v270(VarCurr)<->v109(VarCurr,bitIndex0)&v109(VarCurr,bitIndex1)).
% 94.06/93.48  all VarCurr (v263(VarCurr)<-> (v109(VarCurr,bitIndex3)<->$F)& (v109(VarCurr,bitIndex2)<->$T)& (v109(VarCurr,bitIndex1)<->$T)& (v109(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  -b0110(bitIndex3).
% 94.06/93.48  b0110(bitIndex2).
% 94.06/93.48  b0110(bitIndex1).
% 94.06/93.48  -b0110(bitIndex0).
% 94.06/93.48  all VarCurr (v260(VarCurr)<-> (v261(VarCurr,bitIndex1)<->$T)& (v261(VarCurr,bitIndex0)<->$F)).
% 94.06/93.48  b10(bitIndex1).
% 94.06/93.48  -b10(bitIndex0).
% 94.06/93.48  all VarCurr (v261(VarCurr,bitIndex0)<->v23(VarCurr)).
% 94.06/93.48  all VarCurr (v261(VarCurr,bitIndex1)<->v113(VarCurr)).
% 94.06/93.48  all VarCurr (-v228(VarCurr)-> (all B (range_31_0(B)-> (v227(VarCurr,B)<->v229(VarCurr,B))))).
% 94.06/93.48  all VarCurr (v228(VarCurr)-> (all B (range_31_0(B)-> (v227(VarCurr,B)<->$F)))).
% 94.06/93.48  all B (range_31_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex31).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex30).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex29).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex28).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex27).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex26).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex25).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex24).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex23).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex22).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex21).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex20).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex19).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex18).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex17).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex16).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex15).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex14).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex13).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex12).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex11).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex10).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex9).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex8).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex7).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex6).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex5).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex4).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex3).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex2).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex1).
% 94.06/93.48  -b00000000000000000000000000000000(bitIndex0).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex5)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex6)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex7)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex8)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex9)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex10)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex11)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex12)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex13)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex14)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex15)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex16)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex17)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex18)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex19)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex20)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex21)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex22)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex23)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex24)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex25)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex26)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex27)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex28)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex29)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex30)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr (v229(VarCurr,bitIndex31)<->v230(VarCurr,bitIndex4)).
% 94.06/93.48  all VarCurr B (range_4_0(B)-> (v229(VarCurr,B)<->v230(VarCurr,B))).
% 94.06/93.48  all VarCurr (v230(VarCurr,bitIndex0)<->v258(VarCurr)).
% 94.06/93.48  all VarCurr (v230(VarCurr,bitIndex1)<->v256(VarCurr)).
% 94.06/93.48  all VarCurr (v230(VarCurr,bitIndex2)<->v252(VarCurr)).
% 94.06/93.48  all VarCurr (v230(VarCurr,bitIndex3)<->v248(VarCurr)).
% 94.06/93.48  all VarCurr (v230(VarCurr,bitIndex4)<->v232(VarCurr)).
% 94.06/93.48  all VarCurr (v256(VarCurr)<->v257(VarCurr)&v259(VarCurr)).
% 94.06/93.48  all VarCurr (v259(VarCurr)<->v236(VarCurr,bitIndex0)|v243(VarCurr)).
% 94.06/93.48  all VarCurr (v257(VarCurr)<->v258(VarCurr)|v236(VarCurr,bitIndex1)).
% 94.06/93.49  all VarCurr (-v258(VarCurr)<->v236(VarCurr,bitIndex0)).
% 94.06/93.49  all VarCurr (v252(VarCurr)<->v253(VarCurr)&v255(VarCurr)).
% 94.06/93.49  all VarCurr (v255(VarCurr)<->v241(VarCurr)|v244(VarCurr)).
% 94.06/93.49  all VarCurr (v253(VarCurr)<->v254(VarCurr)|v236(VarCurr,bitIndex2)).
% 94.06/93.49  all VarCurr (-v254(VarCurr)<->v241(VarCurr)).
% 94.06/93.49  all VarCurr (v248(VarCurr)<->v249(VarCurr)&v251(VarCurr)).
% 94.06/93.49  all VarCurr (v251(VarCurr)<->v239(VarCurr)|v245(VarCurr)).
% 94.06/93.49  all VarCurr (v249(VarCurr)<->v250(VarCurr)|v236(VarCurr,bitIndex3)).
% 94.06/93.49  all VarCurr (-v250(VarCurr)<->v239(VarCurr)).
% 94.06/93.49  all VarCurr (v232(VarCurr)<->v233(VarCurr)&v246(VarCurr)).
% 94.06/93.49  all VarCurr (v246(VarCurr)<->v235(VarCurr)|v247(VarCurr)).
% 94.06/93.49  all VarCurr (-v247(VarCurr)<->v236(VarCurr,bitIndex4)).
% 94.06/93.49  all VarCurr (v233(VarCurr)<->v234(VarCurr)|v236(VarCurr,bitIndex4)).
% 94.06/93.49  all VarCurr (-v234(VarCurr)<->v235(VarCurr)).
% 94.06/93.49  all VarCurr (v235(VarCurr)<->v236(VarCurr,bitIndex3)|v238(VarCurr)).
% 94.06/93.49  all VarCurr (v238(VarCurr)<->v239(VarCurr)&v245(VarCurr)).
% 94.06/93.49  all VarCurr (-v245(VarCurr)<->v236(VarCurr,bitIndex3)).
% 94.06/93.49  all VarCurr (v239(VarCurr)<->v236(VarCurr,bitIndex2)|v240(VarCurr)).
% 94.06/93.49  all VarCurr (v240(VarCurr)<->v241(VarCurr)&v244(VarCurr)).
% 94.06/93.49  all VarCurr (-v244(VarCurr)<->v236(VarCurr,bitIndex2)).
% 94.06/93.49  all VarCurr (v241(VarCurr)<->v236(VarCurr,bitIndex1)|v242(VarCurr)).
% 94.06/93.49  all VarCurr (v242(VarCurr)<->v236(VarCurr,bitIndex0)&v243(VarCurr)).
% 94.06/93.49  all VarCurr (-v243(VarCurr)<->v236(VarCurr,bitIndex1)).
% 94.06/93.49  all VarCurr (-v236(VarCurr,bitIndex4)).
% 94.06/93.49  all VarCurr B (range_3_0(B)-> (v236(VarCurr,B)<->v109(VarCurr,B))).
% 94.06/93.49  all VarCurr (v228(VarCurr)<-> (v109(VarCurr,bitIndex3)<->$F)& (v109(VarCurr,bitIndex2)<->$F)& (v109(VarCurr,bitIndex1)<->$F)& (v109(VarCurr,bitIndex0)<->$F)).
% 94.06/93.49  all VarCurr (v225(VarCurr)<-> (v226(VarCurr,bitIndex1)<->$F)& (v226(VarCurr,bitIndex0)<->$T)).
% 94.06/93.49  -b01(bitIndex1).
% 94.06/93.49  b01(bitIndex0).
% 94.06/93.49  all VarCurr (v226(VarCurr,bitIndex0)<->v23(VarCurr)).
% 94.06/93.49  all VarCurr (v226(VarCurr,bitIndex1)<->v113(VarCurr)).
% 94.06/93.49  all B (range_3_0(B)-> (v109(constB0,B)<->$F)).
% 94.06/93.49  all VarCurr (v223(VarCurr)<-> (v224(VarCurr,bitIndex1)<->$F)& (v224(VarCurr,bitIndex0)<->$F)).
% 94.06/93.49  -b00(bitIndex1).
% 94.06/93.49  -b00(bitIndex0).
% 94.06/93.49  all VarCurr (v224(VarCurr,bitIndex0)<->v23(VarCurr)).
% 94.06/93.49  all VarCurr (v224(VarCurr,bitIndex1)<->v113(VarCurr)).
% 94.06/93.49  all VarCurr (v113(VarCurr)<->v115(VarCurr)).
% 94.06/93.49  all VarCurr (v115(VarCurr)<->v117(VarCurr)).
% 94.06/93.49  all VarCurr (v117(VarCurr)<->v119(VarCurr)).
% 94.06/93.49  all VarCurr (v119(VarCurr)<->v121(VarCurr)).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v200(VarNext)-> (v121(VarNext)<->v121(VarCurr)))).
% 94.06/93.49  all VarNext (v200(VarNext)-> (v121(VarNext)<->v210(VarNext))).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v210(VarNext)<->v208(VarCurr))).
% 94.06/93.49  all VarCurr (-v211(VarCurr)-> (v208(VarCurr)<->v127(VarCurr))).
% 94.06/93.49  all VarCurr (v211(VarCurr)-> (v208(VarCurr)<->$F)).
% 94.06/93.49  all VarCurr (-v211(VarCurr)<->v123(VarCurr)).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v200(VarNext)<->v201(VarNext))).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v201(VarNext)<->v202(VarNext)&v193(VarNext))).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (-v202(VarNext)<->v204(VarNext))).
% 94.06/93.49  all VarNext VarCurr (nextState(VarCurr,VarNext)-> (v204(VarNext)<->v193(VarCurr))).
% 94.06/93.49  all VarCurr (v193(VarCurr)<->v195(VarCurr)).
% 94.06/93.49  all VarCurr (v195(VarCurr)<->v197(VarCurr)).
% 94.06/93.49  all VarCurr (v197(VarCurr)<->v1(VarCurr)).
% 94.06/93.49  all VarCurr (v127(VarCurr)<->v190(VarCurr)&v178(VarCurr)).
% 94.06/93.49  all VarCurr (v190(VarCurr)<->v191(VarCurr)&v139(VarCurr)).
% 94.06/93.49  all VarCurr (-v191(VarCurr)<->v129(VarCurr)).
% 94.06/93.49  all VarCurr (v178(VarCurr)<->v180(VarCurr)).
% 94.06/93.49  all VarCurr (v180(VarCurr)<->v182(VarCurr)).
% 94.06/93.49  all VarCurr (v182(VarCurr)<->v187(VarCurr)|v184(VarCurr,bitIndex2)).
% 94.06/93.49  all VarCurr (v187(VarCurr)<->v184(VarCurr,bitIndex0)|v184(VarCurr,bitIndex1)).
% 94.06/93.49  all B (range_2_0(B)-> (v184(constB0,B)<->$T)).
% 94.06/93.49  all B (range_2_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B).
% 94.06/93.49  b111(bitIndex2).
% 94.06/93.49  b111(bitIndex1).
% 94.06/93.49  b111(bitIndex0).
% 94.06/93.49  all VarCurr (v139(VarCurr)<->v141(VarCurr)).
% 94.06/93.49  all VarCurr (v141(VarCurr)<->v143(VarCurr)).
% 94.06/93.49  all VarCurr (-v167(VarCurr)-> (v143(VarCurr)<->$F)).
% 94.06/93.49  all VarCurr (v167(VarCurr)-> (v143(VarCurr)<->$T)).
% 94.06/93.49  all VarCurr (v167(VarCurr)<->v168(VarCurr)|v176(VarCurr)).
% 94.06/93.49  all VarCurr (v176(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$T)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$T)).
% 94.16/93.57  -b0001001(bitIndex6).
% 94.16/93.57  -b0001001(bitIndex5).
% 94.16/93.57  -b0001001(bitIndex4).
% 94.16/93.57  b0001001(bitIndex3).
% 94.16/93.57  -b0001001(bitIndex2).
% 94.16/93.57  -b0001001(bitIndex1).
% 94.16/93.57  b0001001(bitIndex0).
% 94.16/93.57  all VarCurr (v168(VarCurr)<->v169(VarCurr)|v173(VarCurr)).
% 94.16/93.57  all VarCurr (v173(VarCurr)<->v174(VarCurr)|v175(VarCurr)).
% 94.16/93.57  all VarCurr (v175(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$T)).
% 94.16/93.57  -b0100001(bitIndex6).
% 94.16/93.57  b0100001(bitIndex5).
% 94.16/93.57  -b0100001(bitIndex4).
% 94.16/93.57  -b0100001(bitIndex3).
% 94.16/93.57  -b0100001(bitIndex2).
% 94.16/93.57  -b0100001(bitIndex1).
% 94.16/93.57  b0100001(bitIndex0).
% 94.16/93.57  all VarCurr (v174(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$T)).
% 94.16/93.57  -b0000001(bitIndex6).
% 94.16/93.57  -b0000001(bitIndex5).
% 94.16/93.57  -b0000001(bitIndex4).
% 94.16/93.57  -b0000001(bitIndex3).
% 94.16/93.57  -b0000001(bitIndex2).
% 94.16/93.57  -b0000001(bitIndex1).
% 94.16/93.57  b0000001(bitIndex0).
% 94.16/93.57  all VarCurr (v169(VarCurr)<->v145(VarCurr,bitIndex0)&v170(VarCurr)).
% 94.16/93.57  all VarCurr (v170(VarCurr)<->v171(VarCurr)|v172(VarCurr)).
% 94.16/93.57  all VarCurr (v172(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$T)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 94.16/93.57  -b0100000(bitIndex6).
% 94.16/93.57  b0100000(bitIndex5).
% 94.16/93.57  -b0100000(bitIndex4).
% 94.16/93.57  -b0100000(bitIndex3).
% 94.16/93.57  -b0100000(bitIndex2).
% 94.16/93.57  -b0100000(bitIndex1).
% 94.16/93.57  -b0100000(bitIndex0).
% 94.16/93.57  all VarCurr (v171(VarCurr)<-> (v158(VarCurr,bitIndex6)<->$F)& (v158(VarCurr,bitIndex5)<->$F)& (v158(VarCurr,bitIndex4)<->$F)& (v158(VarCurr,bitIndex3)<->$F)& (v158(VarCurr,bitIndex2)<->$F)& (v158(VarCurr,bitIndex1)<->$F)& (v158(VarCurr,bitIndex0)<->$F)).
% 94.16/93.57  -b0000000(bitIndex6).
% 94.16/93.57  -b0000000(bitIndex5).
% 94.16/93.57  -b0000000(bitIndex4).
% 94.16/93.57  -b0000000(bitIndex3).
% 94.16/93.57  -b0000000(bitIndex2).
% 94.16/93.57  -b0000000(bitIndex1).
% 94.16/93.57  -b0000000(bitIndex0).
% 94.16/93.57  all VarCurr B (range_6_0(B)-> (v158(VarCurr,B)<->v160(VarCurr,B))).
% 94.16/93.57  all B (range_6_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B).
% 94.16/93.57  all VarCurr ((v160(VarCurr,bitIndex6)<->v149(VarCurr,bitIndex60))& (v160(VarCurr,bitIndex5)<->v149(VarCurr,bitIndex59))& (v160(VarCurr,bitIndex4)<->v149(VarCurr,bitIndex58))& (v160(VarCurr,bitIndex3)<->v149(VarCurr,bitIndex57))& (v160(VarCurr,bitIndex2)<->v149(VarCurr,bitIndex56))& (v160(VarCurr,bitIndex1)<->v149(VarCurr,bitIndex55))& (v160(VarCurr,bitIndex0)<->v149(VarCurr,bitIndex54))).
% 94.16/93.57  all VarCurr B (range_60_54(B)-> (v149(VarCurr,B)<->v151(VarCurr,B))).
% 94.16/93.57  all VarCurr B (range_60_54(B)-> (v151(VarCurr,B)<->v156(VarCurr,B))).
% 94.16/93.57  all B (range_60_54(B)<->bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B).
% 94.16/93.57  all VarCurr (v145(VarCurr,bitIndex0)<->v147(VarCurr,bitIndex0)).
% 94.16/93.57  all VarCurr (v147(VarCurr,bitIndex0)<->v149(VarCurr,bitIndex12)).
% 94.16/93.57  all VarCurr (v149(VarCurr,bitIndex12)<->v151(VarCurr,bitIndex12)).
% 94.16/93.57  all VarCurr (v151(VarCurr,bitIndex12)<->v156(VarCurr,bitIndex12)).
% 94.16/93.57  all B (range_3_0(B)-> (v155(constB0,B)<->$F)).
% 94.16/93.57  all B (range_3_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B).
% 94.16/93.57  -b0000(bitIndex3).
% 94.16/93.57  -b0000(bitIndex2).
% 94.16/93.57  -b0000(bitIndex1).
% 94.16/93.57  -b0000(bitIndex0).
% 94.16/93.57  all VarCurr (v129(VarCurr)<->v131(VarCurr)).
% 94.16/93.57  all VarCurr (v131(VarCurr)<->v133(VarCurr)).
% 94.16/93.57  all VarCurr (v133(VarCurr)<-> (v135(VarCurr,bitIndex4)<->$F)& (v135(VarCurr,bitIndex3)<->$F)& (v135(VarCurr,bitIndex2)<->$F)& (v135(VarCurr,bitIndex1)<->$F)& (v135(VarCurr,bitIndex0)<->$F)).
% 94.16/93.57  all B (range_4_0(B)-> (v135(constB0,B)<->$F)).
% 94.16/93.57  all B (range_4_0(B)<->bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B).
% 94.16/93.57  -b00000(bitIndex4).
% 94.16/93.57  -b00000(bitIndex3).
% 94.16/93.57  -b00000(bitIndex2).
% 94.16/93.57  -b00000(bitIndex1).
% 94.16/93.57  -b00000(bitIndex0).
% 94.16/93.57  all VarCurr (v123(VarCurr)<->v125(VarCurr)).
% 94.16/93.57  all VarCurr (v125(VarCurr)<->v14(VarCurr)).
% 94.16/93.57  all VarCurr (v58(VarCurr)<->v60(VarCurr)).
% 94.16/93.57  all VarCurr (v60(VarCurr)<->v62(VarCurr)).
% 94.16/93.57  all VarCurr (v62(VarCurr)<->v64(VarCurr)).
% 94.16/93.57  all VarCurr (v64(VarCurr)<->v16(VarCurr)).
% 94.16/93.57  all VarCurr (v33(VarCurr)<->v12(VarCurr)).
% 94.16/93.57  all VarCurr (v10(VarCurr)<->v12(VarCurr)).
% 94.16/93.57  all VarCurr (v12(VarCurr)<->v14(VarCurr)).
% 94.16/93.57  all VarCurr (v14(VarCurr)<->v16(VarCurr)).
% 94.16/93.57  all VarCurr (v16(VarCurr)<->v18(VarCurr)).
% 94.16/93.57  end_of_list.
% 94.16/93.57  
% 94.16/93.57  -------> usable clausifies to:
% 94.16/93.57  
% 94.16/93.57  list(usable).
% 94.16/93.57  0 [] A=A.
% 94.16/93.57  0 [] nextState(constB8,constB9).
% 94.16/93.57  0 [] nextState(constB7,constB8).
% 94.16/93.57  0 [] nextState(constB6,constB7).
% 94.16/93.57  0 [] nextState(constB5,constB6).
% 94.16/93.57  0 [] nextState(constB4,constB5).
% 94.16/93.57  0 [] nextState(constB3,constB4).
% 94.16/93.57  0 [] nextState(constB2,constB3).
% 94.16/93.57  0 [] nextState(constB1,constB2).
% 94.16/93.57  0 [] nextState(constB0,constB1).
% 94.16/93.57  0 [] -nextState(VarCurr,VarNext)|reachableState(VarCurr).
% 94.16/93.57  0 [] -nextState(VarCurr,VarNext)|reachableState(VarNext).
% 94.16/93.57  0 [] -reachableState(VarState)|constB0=VarState|constB1=VarState|constB2=VarState|constB3=VarState|constB4=VarState|constB5=VarState|constB6=VarState|constB7=VarState|constB8=VarState|constB9=VarState|constB10=VarState|constB11=VarState|constB12=VarState|constB13=VarState|constB14=VarState|constB15=VarState|constB16=VarState|constB17=VarState|constB18=VarState|constB19=VarState|constB20=VarState.
% 94.16/93.57  0 [] reachableState(constB20).
% 94.16/93.57  0 [] reachableState(constB19).
% 94.16/93.57  0 [] reachableState(constB18).
% 94.16/93.57  0 [] reachableState(constB17).
% 94.16/93.57  0 [] reachableState(constB16).
% 94.16/93.57  0 [] reachableState(constB15).
% 94.16/93.57  0 [] reachableState(constB14).
% 94.16/93.57  0 [] reachableState(constB13).
% 94.16/93.57  0 [] reachableState(constB12).
% 94.16/93.57  0 [] reachableState(constB11).
% 94.16/93.57  0 [] reachableState(constB10).
% 94.16/93.57  0 [] reachableState(constB9).
% 94.16/93.57  0 [] reachableState(constB8).
% 94.16/93.57  0 [] reachableState(constB7).
% 94.16/93.57  0 [] reachableState(constB6).
% 94.16/93.57  0 [] reachableState(constB5).
% 94.16/93.57  0 [] reachableState(constB4).
% 94.16/93.57  0 [] reachableState(constB3).
% 94.16/93.57  0 [] reachableState(constB2).
% 94.16/93.57  0 [] reachableState(constB1).
% 94.16/93.57  0 [] reachableState(constB0).
% 94.16/93.57  0 [] -nextState(VarCurr,VarNext)| -v1(VarCurr)| -v1(VarNext).
% 94.16/93.57  0 [] -nextState(VarCurr,VarNext)|v1(VarCurr)|v1(VarNext).
% 94.16/93.57  0 [] -v1(constB0).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_20,B)|v1019(constB20,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_20,B)| -v1019(constB20,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_20).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB20,v1019_range_3_to_0_address_term_bound_20).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_19,B)|v1019(constB19,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_19,B)| -v1019(constB19,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_19).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB19,v1019_range_3_to_0_address_term_bound_19).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_18,B)|v1019(constB18,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_18,B)| -v1019(constB18,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_18).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB18,v1019_range_3_to_0_address_term_bound_18).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_17,B)|v1019(constB17,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_17,B)| -v1019(constB17,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_17).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB17,v1019_range_3_to_0_address_term_bound_17).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_16,B)|v1019(constB16,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_16,B)| -v1019(constB16,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_16).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB16,v1019_range_3_to_0_address_term_bound_16).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_15,B)|v1019(constB15,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_15,B)| -v1019(constB15,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_15).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB15,v1019_range_3_to_0_address_term_bound_15).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_14,B)|v1019(constB14,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_14,B)| -v1019(constB14,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_14).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB14,v1019_range_3_to_0_address_term_bound_14).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_13,B)|v1019(constB13,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_13,B)| -v1019(constB13,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_13).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB13,v1019_range_3_to_0_address_term_bound_13).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_12,B)|v1019(constB12,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_12,B)| -v1019(constB12,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_12).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB12,v1019_range_3_to_0_address_term_bound_12).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_11,B)|v1019(constB11,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_11,B)| -v1019(constB11,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_11).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB11,v1019_range_3_to_0_address_term_bound_11).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_10,B)|v1019(constB10,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_10,B)| -v1019(constB10,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_10).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB10,v1019_range_3_to_0_address_term_bound_10).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_9,B)|v1019(constB9,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_9,B)| -v1019(constB9,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_9).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB9,v1019_range_3_to_0_address_term_bound_9).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_8,B)|v1019(constB8,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_8,B)| -v1019(constB8,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_8).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB8,v1019_range_3_to_0_address_term_bound_8).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_7,B)|v1019(constB7,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_7,B)| -v1019(constB7,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_7).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB7,v1019_range_3_to_0_address_term_bound_7).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_6,B)|v1019(constB6,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_6,B)| -v1019(constB6,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_6).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB6,v1019_range_3_to_0_address_term_bound_6).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_5,B)|v1019(constB5,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_5,B)| -v1019(constB5,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_5).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB5,v1019_range_3_to_0_address_term_bound_5).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_4,B)|v1019(constB4,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_4,B)| -v1019(constB4,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_4).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB4,v1019_range_3_to_0_address_term_bound_4).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_3,B)|v1019(constB3,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_3,B)| -v1019(constB3,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_3).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB3,v1019_range_3_to_0_address_term_bound_3).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_2,B)|v1019(constB2,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_2,B)| -v1019(constB2,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_2).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB2,v1019_range_3_to_0_address_term_bound_2).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_1,B)|v1019(constB1,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_1,B)| -v1019(constB1,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_1).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB1,v1019_range_3_to_0_address_term_bound_1).
% 94.16/93.57  0 [] -addressVal(v1019_range_3_to_0_address_term_bound_0,B)|v1019(constB0,B).
% 94.16/93.57  0 [] addressVal(v1019_range_3_to_0_address_term_bound_0,B)| -v1019(constB0,B).
% 94.16/93.57  0 [] address(v1019_range_3_to_0_address_term_bound_0).
% 94.16/93.57  0 [] v1019_range_3_to_0_address_association(constB0,v1019_range_3_to_0_address_term_bound_0).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_20,B)|v953(constB20,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_20,B)| -v953(constB20,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_20).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB20,v953_range_3_to_0_address_term_bound_20).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_19,B)|v953(constB19,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_19,B)| -v953(constB19,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_19).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB19,v953_range_3_to_0_address_term_bound_19).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_18,B)|v953(constB18,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_18,B)| -v953(constB18,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_18).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB18,v953_range_3_to_0_address_term_bound_18).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_17,B)|v953(constB17,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_17,B)| -v953(constB17,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_17).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB17,v953_range_3_to_0_address_term_bound_17).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_16,B)|v953(constB16,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_16,B)| -v953(constB16,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_16).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB16,v953_range_3_to_0_address_term_bound_16).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_15,B)|v953(constB15,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_15,B)| -v953(constB15,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_15).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB15,v953_range_3_to_0_address_term_bound_15).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_14,B)|v953(constB14,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_14,B)| -v953(constB14,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_14).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB14,v953_range_3_to_0_address_term_bound_14).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_13,B)|v953(constB13,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_13,B)| -v953(constB13,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_13).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB13,v953_range_3_to_0_address_term_bound_13).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_12,B)|v953(constB12,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_12,B)| -v953(constB12,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_12).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB12,v953_range_3_to_0_address_term_bound_12).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_11,B)|v953(constB11,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_11,B)| -v953(constB11,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_11).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB11,v953_range_3_to_0_address_term_bound_11).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_10,B)|v953(constB10,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_10,B)| -v953(constB10,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_10).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB10,v953_range_3_to_0_address_term_bound_10).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_9,B)|v953(constB9,B).
% 94.16/93.57  0 [] addressVal(v953_range_3_to_0_address_term_bound_9,B)| -v953(constB9,B).
% 94.16/93.57  0 [] address(v953_range_3_to_0_address_term_bound_9).
% 94.16/93.57  0 [] v953_range_3_to_0_address_association(constB9,v953_range_3_to_0_address_term_bound_9).
% 94.16/93.57  0 [] -addressVal(v953_range_3_to_0_address_term_bound_8,B)|v953(constB8,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_8,B)| -v953(constB8,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB8,v953_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_7,B)|v953(constB7,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_7,B)| -v953(constB7,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB7,v953_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_6,B)|v953(constB6,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_6,B)| -v953(constB6,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB6,v953_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_5,B)|v953(constB5,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_5,B)| -v953(constB5,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB5,v953_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_4,B)|v953(constB4,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_4,B)| -v953(constB4,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB4,v953_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_3,B)|v953(constB3,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_3,B)| -v953(constB3,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB3,v953_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_2,B)|v953(constB2,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_2,B)| -v953(constB2,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_2).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB2,v953_range_3_to_0_address_term_bound_2).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_1,B)|v953(constB1,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_1,B)| -v953(constB1,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_1).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB1,v953_range_3_to_0_address_term_bound_1).
% 94.16/93.58  0 [] -addressVal(v953_range_3_to_0_address_term_bound_0,B)|v953(constB0,B).
% 94.16/93.58  0 [] addressVal(v953_range_3_to_0_address_term_bound_0,B)| -v953(constB0,B).
% 94.16/93.58  0 [] address(v953_range_3_to_0_address_term_bound_0).
% 94.16/93.58  0 [] v953_range_3_to_0_address_association(constB0,v953_range_3_to_0_address_term_bound_0).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_20,B)|v869(constB20,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_20,B)| -v869(constB20,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_20).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB20,v869_range_3_to_0_address_term_bound_20).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_19,B)|v869(constB19,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_19,B)| -v869(constB19,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_19).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB19,v869_range_3_to_0_address_term_bound_19).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_18,B)|v869(constB18,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_18,B)| -v869(constB18,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_18).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB18,v869_range_3_to_0_address_term_bound_18).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_17,B)|v869(constB17,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_17,B)| -v869(constB17,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_17).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB17,v869_range_3_to_0_address_term_bound_17).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_16,B)|v869(constB16,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_16,B)| -v869(constB16,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_16).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB16,v869_range_3_to_0_address_term_bound_16).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_15,B)|v869(constB15,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_15,B)| -v869(constB15,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_15).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB15,v869_range_3_to_0_address_term_bound_15).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_14,B)|v869(constB14,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_14,B)| -v869(constB14,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_14).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB14,v869_range_3_to_0_address_term_bound_14).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_13,B)|v869(constB13,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_13,B)| -v869(constB13,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_13).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB13,v869_range_3_to_0_address_term_bound_13).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_12,B)|v869(constB12,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_12,B)| -v869(constB12,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_12).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB12,v869_range_3_to_0_address_term_bound_12).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_11,B)|v869(constB11,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_11,B)| -v869(constB11,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_11).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB11,v869_range_3_to_0_address_term_bound_11).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_10,B)|v869(constB10,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_10,B)| -v869(constB10,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_10).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB10,v869_range_3_to_0_address_term_bound_10).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_9,B)|v869(constB9,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_9,B)| -v869(constB9,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_9).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB9,v869_range_3_to_0_address_term_bound_9).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_8,B)|v869(constB8,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_8,B)| -v869(constB8,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB8,v869_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_7,B)|v869(constB7,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_7,B)| -v869(constB7,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB7,v869_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_6,B)|v869(constB6,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_6,B)| -v869(constB6,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB6,v869_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_5,B)|v869(constB5,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_5,B)| -v869(constB5,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB5,v869_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_4,B)|v869(constB4,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_4,B)| -v869(constB4,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB4,v869_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_3,B)|v869(constB3,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_3,B)| -v869(constB3,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB3,v869_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_2,B)|v869(constB2,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_2,B)| -v869(constB2,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_2).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB2,v869_range_3_to_0_address_term_bound_2).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_1,B)|v869(constB1,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_1,B)| -v869(constB1,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_1).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB1,v869_range_3_to_0_address_term_bound_1).
% 94.16/93.58  0 [] -addressVal(v869_range_3_to_0_address_term_bound_0,B)|v869(constB0,B).
% 94.16/93.58  0 [] addressVal(v869_range_3_to_0_address_term_bound_0,B)| -v869(constB0,B).
% 94.16/93.58  0 [] address(v869_range_3_to_0_address_term_bound_0).
% 94.16/93.58  0 [] v869_range_3_to_0_address_association(constB0,v869_range_3_to_0_address_term_bound_0).
% 94.16/93.58  0 [] address(b1110_address_term).
% 94.16/93.58  0 [] -addressVal(b1110_address_term,B)|b1110(B).
% 94.16/93.58  0 [] addressVal(b1110_address_term,B)| -b1110(B).
% 94.16/93.58  0 [] address(b1101_address_term).
% 94.16/93.58  0 [] -addressVal(b1101_address_term,B)|b1101(B).
% 94.16/93.58  0 [] addressVal(b1101_address_term,B)| -b1101(B).
% 94.16/93.58  0 [] address(b1100_address_term).
% 94.16/93.58  0 [] -addressVal(b1100_address_term,B)|b1100(B).
% 94.16/93.58  0 [] addressVal(b1100_address_term,B)| -b1100(B).
% 94.16/93.58  0 [] address(b1011_address_term).
% 94.16/93.58  0 [] -addressVal(b1011_address_term,B)|b1011(B).
% 94.16/93.58  0 [] addressVal(b1011_address_term,B)| -b1011(B).
% 94.16/93.58  0 [] address(b1010_address_term).
% 94.16/93.58  0 [] -addressVal(b1010_address_term,B)|b1010(B).
% 94.16/93.58  0 [] addressVal(b1010_address_term,B)| -b1010(B).
% 94.16/93.58  0 [] address(b1001_address_term).
% 94.16/93.58  0 [] -addressVal(b1001_address_term,B)|b1001(B).
% 94.16/93.58  0 [] addressVal(b1001_address_term,B)| -b1001(B).
% 94.16/93.58  0 [] address(b1000_address_term).
% 94.16/93.58  0 [] -addressVal(b1000_address_term,B)|b1000(B).
% 94.16/93.58  0 [] addressVal(b1000_address_term,B)| -b1000(B).
% 94.16/93.58  0 [] address(b0111_address_term).
% 94.16/93.58  0 [] -addressVal(b0111_address_term,B)|b0111(B).
% 94.16/93.58  0 [] addressVal(b0111_address_term,B)| -b0111(B).
% 94.16/93.58  0 [] address(b0100_address_term).
% 94.16/93.58  0 [] -addressVal(b0100_address_term,B)|b0100(B).
% 94.16/93.58  0 [] addressVal(b0100_address_term,B)| -b0100(B).
% 94.16/93.58  0 [] address(b0011_address_term).
% 94.16/93.58  0 [] -addressVal(b0011_address_term,B)|b0011(B).
% 94.16/93.58  0 [] addressVal(b0011_address_term,B)| -b0011(B).
% 94.16/93.58  0 [] address(b0010_address_term).
% 94.16/93.58  0 [] -addressVal(b0010_address_term,B)|b0010(B).
% 94.16/93.58  0 [] addressVal(b0010_address_term,B)| -b0010(B).
% 94.16/93.58  0 [] address(b1111_address_term).
% 94.16/93.58  0 [] -addressVal(b1111_address_term,B)|b1111(B).
% 94.16/93.58  0 [] addressVal(b1111_address_term,B)| -b1111(B).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_20,B)|v791(constB20,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_20,B)| -v791(constB20,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_20).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB20,v791_range_3_to_0_address_term_bound_20).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_19,B)|v791(constB19,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_19,B)| -v791(constB19,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_19).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB19,v791_range_3_to_0_address_term_bound_19).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_18,B)|v791(constB18,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_18,B)| -v791(constB18,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_18).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB18,v791_range_3_to_0_address_term_bound_18).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_17,B)|v791(constB17,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_17,B)| -v791(constB17,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_17).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB17,v791_range_3_to_0_address_term_bound_17).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_16,B)|v791(constB16,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_16,B)| -v791(constB16,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_16).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB16,v791_range_3_to_0_address_term_bound_16).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_15,B)|v791(constB15,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_15,B)| -v791(constB15,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_15).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB15,v791_range_3_to_0_address_term_bound_15).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_14,B)|v791(constB14,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_14,B)| -v791(constB14,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_14).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB14,v791_range_3_to_0_address_term_bound_14).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_13,B)|v791(constB13,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_13,B)| -v791(constB13,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_13).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB13,v791_range_3_to_0_address_term_bound_13).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_12,B)|v791(constB12,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_12,B)| -v791(constB12,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_12).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB12,v791_range_3_to_0_address_term_bound_12).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_11,B)|v791(constB11,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_11,B)| -v791(constB11,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_11).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB11,v791_range_3_to_0_address_term_bound_11).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_10,B)|v791(constB10,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_10,B)| -v791(constB10,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_10).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB10,v791_range_3_to_0_address_term_bound_10).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_9,B)|v791(constB9,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_9,B)| -v791(constB9,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_9).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB9,v791_range_3_to_0_address_term_bound_9).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_8,B)|v791(constB8,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_8,B)| -v791(constB8,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB8,v791_range_3_to_0_address_term_bound_8).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_7,B)|v791(constB7,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_7,B)| -v791(constB7,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB7,v791_range_3_to_0_address_term_bound_7).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_6,B)|v791(constB6,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_6,B)| -v791(constB6,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB6,v791_range_3_to_0_address_term_bound_6).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_5,B)|v791(constB5,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_5,B)| -v791(constB5,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB5,v791_range_3_to_0_address_term_bound_5).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_4,B)|v791(constB4,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_4,B)| -v791(constB4,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB4,v791_range_3_to_0_address_term_bound_4).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_3,B)|v791(constB3,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_3,B)| -v791(constB3,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB3,v791_range_3_to_0_address_term_bound_3).
% 94.16/93.58  0 [] -addressVal(v791_range_3_to_0_address_term_bound_2,B)|v791(constB2,B).
% 94.16/93.58  0 [] addressVal(v791_range_3_to_0_address_term_bound_2,B)| -v791(constB2,B).
% 94.16/93.58  0 [] address(v791_range_3_to_0_address_term_bound_2).
% 94.16/93.58  0 [] v791_range_3_to_0_address_association(constB2,v791_range_3_to_0_address_term_bound_2).
% 94.22/93.59  0 [] -addressVal(v791_range_3_to_0_address_term_bound_1,B)|v791(constB1,B).
% 94.22/93.59  0 [] addressVal(v791_range_3_to_0_address_term_bound_1,B)| -v791(constB1,B).
% 94.22/93.59  0 [] address(v791_range_3_to_0_address_term_bound_1).
% 94.22/93.59  0 [] v791_range_3_to_0_address_association(constB1,v791_range_3_to_0_address_term_bound_1).
% 94.22/93.59  0 [] -addressVal(v791_range_3_to_0_address_term_bound_0,B)|v791(constB0,B).
% 94.22/93.59  0 [] addressVal(v791_range_3_to_0_address_term_bound_0,B)| -v791(constB0,B).
% 94.22/93.59  0 [] address(v791_range_3_to_0_address_term_bound_0).
% 94.22/93.59  0 [] v791_range_3_to_0_address_association(constB0,v791_range_3_to_0_address_term_bound_0).
% 94.22/93.59  0 [] address(b0101_address_term).
% 94.22/93.59  0 [] -addressVal(b0101_address_term,B)|b0101(B).
% 94.22/93.59  0 [] addressVal(b0101_address_term,B)| -b0101(B).
% 94.22/93.59  0 [] address(b0001_address_term).
% 94.22/93.59  0 [] -addressVal(b0001_address_term,B)|b0001(B).
% 94.22/93.59  0 [] addressVal(b0001_address_term,B)| -b0001(B).
% 94.22/93.59  0 [] address(b0110_address_term).
% 94.22/93.59  0 [] -addressVal(b0110_address_term,B)|b0110(B).
% 94.22/93.59  0 [] addressVal(b0110_address_term,B)| -b0110(B).
% 94.22/93.59  0 [] address(b0000_address_term).
% 94.22/93.59  0 [] -addressVal(b0000_address_term,B)|b0000(B).
% 94.22/93.59  0 [] addressVal(b0000_address_term,B)| -b0000(B).
% 94.22/93.59  0 [] -address(A1)| -address(A2)| -addressDiff(A1,A2,B)|A1=A2| -addressVal(A1,B)| -addressVal(A2,B).
% 94.22/93.59  0 [] -address(A1)| -address(A2)| -addressDiff(A1,A2,B)|A1=A2|addressVal(A1,B)|addressVal(A2,B).
% 94.22/93.59  0 [] addressDiff(A1,A2,bitIndex0)|addressDiff(A1,A2,bitIndex1)|addressDiff(A1,A2,bitIndex2)|addressDiff(A1,A2,bitIndex3).
% 94.22/93.59  0 [] reachableState($c1).
% 94.22/93.59  0 [] -v4($c1).
% 94.22/93.59  0 [] v4(VarCurr)|v3674(VarCurr).
% 94.22/93.59  0 [] -v4(VarCurr)| -v3674(VarCurr).
% 94.22/93.59  0 [] v3674(VarCurr)|v3675(VarCurr).
% 94.22/93.59  0 [] -v3674(VarCurr)| -v3675(VarCurr).
% 94.22/93.59  0 [] -v3675(VarCurr)|v3677(VarCurr).
% 94.22/93.59  0 [] -v3675(VarCurr)|v3693(VarCurr).
% 94.22/93.59  0 [] v3675(VarCurr)| -v3677(VarCurr)| -v3693(VarCurr).
% 94.22/93.59  0 [] -v3693(VarCurr)|v3679(VarCurr,bitIndex0)|v3679(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3693(VarCurr)| -v3679(VarCurr,bitIndex0).
% 94.22/93.59  0 [] v3693(VarCurr)| -v3679(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3677(VarCurr)|v3678(VarCurr).
% 94.22/93.59  0 [] -v3677(VarCurr)| -v3678(VarCurr).
% 94.22/93.59  0 [] -v3678(VarCurr)|v3679(VarCurr,bitIndex0).
% 94.22/93.59  0 [] -v3678(VarCurr)|v3679(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3678(VarCurr)| -v3679(VarCurr,bitIndex0)| -v3679(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3679(VarCurr,bitIndex0)|v3680(VarCurr).
% 94.22/93.59  0 [] v3679(VarCurr,bitIndex0)| -v3680(VarCurr).
% 94.22/93.59  0 [] -v3679(VarCurr,bitIndex1)|$T.
% 94.22/93.59  0 [] v3679(VarCurr,bitIndex1)| -$T.
% 94.22/93.59  0 [] -v3680(VarCurr)|v3682(VarCurr).
% 94.22/93.59  0 [] -v3680(VarCurr)|v3684(VarCurr,bitIndex5).
% 94.22/93.59  0 [] v3680(VarCurr)| -v3682(VarCurr)| -v3684(VarCurr,bitIndex5).
% 94.22/93.59  0 [] -v3682(VarCurr)|v3683(VarCurr).
% 94.22/93.59  0 [] -v3682(VarCurr)|v3684(VarCurr,bitIndex4).
% 94.22/93.59  0 [] v3682(VarCurr)| -v3683(VarCurr)| -v3684(VarCurr,bitIndex4).
% 94.22/93.59  0 [] -v3683(VarCurr)|v3684(VarCurr,bitIndex3)|v3685(VarCurr).
% 94.22/93.59  0 [] v3683(VarCurr)| -v3684(VarCurr,bitIndex3).
% 94.22/93.59  0 [] v3683(VarCurr)| -v3685(VarCurr).
% 94.22/93.59  0 [] -v3685(VarCurr)|v3686(VarCurr).
% 94.22/93.59  0 [] -v3685(VarCurr)|v3692(VarCurr).
% 94.22/93.59  0 [] v3685(VarCurr)| -v3686(VarCurr)| -v3692(VarCurr).
% 94.22/93.59  0 [] v3692(VarCurr)|v3684(VarCurr,bitIndex3).
% 94.22/93.59  0 [] -v3692(VarCurr)| -v3684(VarCurr,bitIndex3).
% 94.22/93.59  0 [] -v3686(VarCurr)|v3684(VarCurr,bitIndex2)|v3687(VarCurr).
% 94.22/93.59  0 [] v3686(VarCurr)| -v3684(VarCurr,bitIndex2).
% 94.22/93.59  0 [] v3686(VarCurr)| -v3687(VarCurr).
% 94.22/93.59  0 [] -v3687(VarCurr)|v3688(VarCurr).
% 94.22/93.59  0 [] -v3687(VarCurr)|v3691(VarCurr).
% 94.22/93.59  0 [] v3687(VarCurr)| -v3688(VarCurr)| -v3691(VarCurr).
% 94.22/93.59  0 [] v3691(VarCurr)|v3684(VarCurr,bitIndex2).
% 94.22/93.59  0 [] -v3691(VarCurr)| -v3684(VarCurr,bitIndex2).
% 94.22/93.59  0 [] -v3688(VarCurr)|v3684(VarCurr,bitIndex1)|v3689(VarCurr).
% 94.22/93.59  0 [] v3688(VarCurr)| -v3684(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3688(VarCurr)| -v3689(VarCurr).
% 94.22/93.59  0 [] -v3689(VarCurr)|v3684(VarCurr,bitIndex0).
% 94.22/93.59  0 [] -v3689(VarCurr)|v3690(VarCurr).
% 94.22/93.59  0 [] v3689(VarCurr)| -v3684(VarCurr,bitIndex0)| -v3690(VarCurr).
% 94.22/93.59  0 [] v3690(VarCurr)|v3684(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3690(VarCurr)| -v3684(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3684(VarCurr,bitIndex3).
% 94.22/93.59  0 [] -v3684(VarCurr,bitIndex4).
% 94.22/93.59  0 [] -v3684(VarCurr,bitIndex5).
% 94.22/93.59  0 [] -range_2_0(B)| -v3684(VarCurr,B)|v8(VarCurr,B).
% 94.22/93.59  0 [] -range_2_0(B)|v3684(VarCurr,B)| -v8(VarCurr,B).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)|v3660(VarNext)| -range_2_0(B)| -v8(VarNext,B)|v8(VarCurr,B).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)|v3660(VarNext)| -range_2_0(B)|v8(VarNext,B)| -v8(VarCurr,B).
% 94.22/93.59  0 [] -v3660(VarNext)| -range_2_0(B)| -v8(VarNext,B)|v3668(VarNext,B).
% 94.22/93.59  0 [] -v3660(VarNext)| -range_2_0(B)|v8(VarNext,B)| -v3668(VarNext,B).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -range_2_0(B)| -v3668(VarNext,B)|v3666(VarCurr,B).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -range_2_0(B)|v3668(VarNext,B)| -v3666(VarCurr,B).
% 94.22/93.59  0 [] v3669(VarCurr)| -range_2_0(B)| -v3666(VarCurr,B)|v21(VarCurr,B).
% 94.22/93.59  0 [] v3669(VarCurr)| -range_2_0(B)|v3666(VarCurr,B)| -v21(VarCurr,B).
% 94.22/93.59  0 [] -v3669(VarCurr)| -range_2_0(B)| -v3666(VarCurr,B)|$F.
% 94.22/93.59  0 [] -v3669(VarCurr)| -range_2_0(B)|v3666(VarCurr,B)| -$F.
% 94.22/93.59  0 [] v3669(VarCurr)|v10(VarCurr).
% 94.22/93.59  0 [] -v3669(VarCurr)| -v10(VarCurr).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -v3660(VarNext)|v3661(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)|v3660(VarNext)| -v3661(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -v3661(VarNext)|v3662(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -v3661(VarNext)|v286(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)|v3661(VarNext)| -v3662(VarNext)| -v286(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)|v3662(VarNext)|v295(VarNext).
% 94.22/93.59  0 [] -nextState(VarCurr,VarNext)| -v3662(VarNext)| -v295(VarNext).
% 94.22/93.59  0 [] v23(VarCurr)| -range_2_0(B)| -v21(VarCurr,B)|v8(VarCurr,B).
% 94.22/93.59  0 [] v23(VarCurr)| -range_2_0(B)|v21(VarCurr,B)| -v8(VarCurr,B).
% 94.22/93.59  0 [] -v23(VarCurr)| -range_2_0(B)| -v21(VarCurr,B)|v3643(VarCurr,B).
% 94.22/93.59  0 [] -v23(VarCurr)| -range_2_0(B)|v21(VarCurr,B)| -v3643(VarCurr,B).
% 94.22/93.59  0 [] v3644(VarCurr)| -range_2_0(B)| -v3643(VarCurr,B)|v3645(VarCurr,B).
% 94.22/93.59  0 [] v3644(VarCurr)| -range_2_0(B)|v3643(VarCurr,B)| -v3645(VarCurr,B).
% 94.22/93.59  0 [] -v3644(VarCurr)| -range_2_0(B)| -v3643(VarCurr,B)|$F.
% 94.22/93.59  0 [] -v3644(VarCurr)| -range_2_0(B)|v3643(VarCurr,B)| -$F.
% 94.22/93.59  0 [] -b000(bitIndex2).
% 94.22/93.59  0 [] -b000(bitIndex1).
% 94.22/93.59  0 [] -b000(bitIndex0).
% 94.22/93.59  0 [] -v3645(VarCurr,bitIndex0)|v3655(VarCurr).
% 94.22/93.59  0 [] v3645(VarCurr,bitIndex0)| -v3655(VarCurr).
% 94.22/93.59  0 [] -v3645(VarCurr,bitIndex1)|v3653(VarCurr).
% 94.22/93.59  0 [] v3645(VarCurr,bitIndex1)| -v3653(VarCurr).
% 94.22/93.59  0 [] -v3645(VarCurr,bitIndex2)|v3647(VarCurr).
% 94.22/93.59  0 [] v3645(VarCurr,bitIndex2)| -v3647(VarCurr).
% 94.22/93.59  0 [] -v3653(VarCurr)|v3654(VarCurr).
% 94.22/93.59  0 [] -v3653(VarCurr)|v3657(VarCurr).
% 94.22/93.59  0 [] v3653(VarCurr)| -v3654(VarCurr)| -v3657(VarCurr).
% 94.22/93.59  0 [] -v3657(VarCurr)|v8(VarCurr,bitIndex0)|v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3657(VarCurr)| -v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] v3657(VarCurr)| -v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3654(VarCurr)|v3655(VarCurr)|v3656(VarCurr).
% 94.22/93.59  0 [] v3654(VarCurr)| -v3655(VarCurr).
% 94.22/93.59  0 [] v3654(VarCurr)| -v3656(VarCurr).
% 94.22/93.59  0 [] v3656(VarCurr)|v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3656(VarCurr)| -v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3655(VarCurr)|v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] -v3655(VarCurr)| -v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] -v3647(VarCurr)|v3648(VarCurr).
% 94.22/93.59  0 [] -v3647(VarCurr)|v3652(VarCurr).
% 94.22/93.59  0 [] v3647(VarCurr)| -v3648(VarCurr)| -v3652(VarCurr).
% 94.22/93.59  0 [] -v3652(VarCurr)|v3650(VarCurr)|v8(VarCurr,bitIndex2).
% 94.22/93.59  0 [] v3652(VarCurr)| -v3650(VarCurr).
% 94.22/93.59  0 [] v3652(VarCurr)| -v8(VarCurr,bitIndex2).
% 94.22/93.59  0 [] -v3648(VarCurr)|v3649(VarCurr)|v3651(VarCurr).
% 94.22/93.59  0 [] v3648(VarCurr)| -v3649(VarCurr).
% 94.22/93.59  0 [] v3648(VarCurr)| -v3651(VarCurr).
% 94.22/93.59  0 [] v3651(VarCurr)|v8(VarCurr,bitIndex2).
% 94.22/93.59  0 [] -v3651(VarCurr)| -v8(VarCurr,bitIndex2).
% 94.22/93.59  0 [] v3649(VarCurr)|v3650(VarCurr).
% 94.22/93.59  0 [] -v3649(VarCurr)| -v3650(VarCurr).
% 94.22/93.59  0 [] -v3650(VarCurr)|v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] -v3650(VarCurr)|v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] v3650(VarCurr)| -v8(VarCurr,bitIndex0)| -v8(VarCurr,bitIndex1).
% 94.22/93.59  0 [] -v3644(VarCurr)| -v8(VarCurr,bitIndex2)|$T.
% 94.22/93.59  0 [] -v3644(VarCurr)|v8(VarCurr,bitIndex2)| -$T.
% 94.22/93.59  0 [] -v3644(VarCurr)| -v8(VarCurr,bitIndex1)|$F.
% 94.22/93.59  0 [] -v3644(VarCurr)|v8(VarCurr,bitIndex1)| -$F.
% 94.22/93.59  0 [] -v3644(VarCurr)| -v8(VarCurr,bitIndex0)|$T.
% 94.22/93.59  0 [] -v3644(VarCurr)|v8(VarCurr,bitIndex0)| -$T.
% 94.22/93.59  0 [] v3644(VarCurr)|v8(VarCurr,bitIndex2)|$T|v8(VarCurr,bitIndex1)|$F|v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] v3644(VarCurr)|v8(VarCurr,bitIndex2)|$T| -v8(VarCurr,bitIndex1)| -$F|v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] v3644(VarCurr)| -v8(VarCurr,bitIndex2)| -$T|v8(VarCurr,bitIndex1)|$F| -v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] v3644(VarCurr)| -v8(VarCurr,bitIndex2)| -$T| -v8(VarCurr,bitIndex1)| -$F| -v8(VarCurr,bitIndex0).
% 94.22/93.59  0 [] b101(bitIndex2).
% 94.22/93.59  0 [] -b101(bitIndex1).
% 94.22/93.60  0 [] b101(bitIndex0).
% 94.22/93.60  0 [] -v23(VarCurr)|v25(VarCurr).
% 94.22/93.60  0 [] v23(VarCurr)| -v25(VarCurr).
% 94.22/93.60  0 [] -v25(VarCurr)|v27(VarCurr).
% 94.22/93.60  0 [] v25(VarCurr)| -v27(VarCurr).
% 94.22/93.60  0 [] -v27(VarCurr)|v29(VarCurr).
% 94.22/93.60  0 [] v27(VarCurr)| -v29(VarCurr).
% 94.22/93.60  0 [] -v29(VarCurr)|v31(VarCurr,bitIndex7).
% 94.22/93.60  0 [] v29(VarCurr)| -v31(VarCurr,bitIndex7).
% 94.22/93.60  0 [] -v31(VarNext,bitIndex7)|v3635(VarNext,bitIndex6).
% 94.22/93.60  0 [] v31(VarNext,bitIndex7)| -v3635(VarNext,bitIndex6).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3635(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)|v3635(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.22/93.60  0 [] -v3636(VarNext)| -range_10_0(B)| -v3635(VarNext,B)|v1253(VarNext,B).
% 94.22/93.60  0 [] -v3636(VarNext)| -range_10_0(B)|v3635(VarNext,B)| -v1253(VarNext,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3636(VarNext)|v3637(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3636(VarNext)| -v3637(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3637(VarNext)|v3639(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3637(VarNext)|v1240(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3637(VarNext)| -v3639(VarNext)| -v1240(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3639(VarNext)|v1247(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3639(VarNext)| -v1247(VarNext).
% 94.22/93.60  0 [] v3611(VarCurr)| -v36(VarCurr,bitIndex7)|$F.
% 94.22/93.60  0 [] v3611(VarCurr)|v36(VarCurr,bitIndex7)| -$F.
% 94.22/93.60  0 [] -v3611(VarCurr)| -v36(VarCurr,bitIndex7)|$T.
% 94.22/93.60  0 [] -v3611(VarCurr)|v36(VarCurr,bitIndex7)| -$T.
% 94.22/93.60  0 [] -v3611(VarCurr)|v3612(VarCurr)|v3632(VarCurr).
% 94.22/93.60  0 [] v3611(VarCurr)| -v3612(VarCurr).
% 94.22/93.60  0 [] v3611(VarCurr)| -v3632(VarCurr).
% 94.22/93.60  0 [] -v3632(VarCurr)|v3633(VarCurr).
% 94.22/93.60  0 [] -v3632(VarCurr)|v1323(VarCurr).
% 94.22/93.60  0 [] v3632(VarCurr)| -v3633(VarCurr)| -v1323(VarCurr).
% 94.22/93.60  0 [] -v3633(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] v3633(VarCurr)| -v3619(VarCurr).
% 94.22/93.60  0 [] -v3612(VarCurr)|v3613(VarCurr)|v3630(VarCurr).
% 94.22/93.60  0 [] v3612(VarCurr)| -v3613(VarCurr).
% 94.22/93.60  0 [] v3612(VarCurr)| -v3630(VarCurr).
% 94.22/93.60  0 [] -v3630(VarCurr)|v3631(VarCurr).
% 94.22/93.60  0 [] -v3630(VarCurr)|v1300(VarCurr).
% 94.22/93.60  0 [] v3630(VarCurr)| -v3631(VarCurr)| -v1300(VarCurr).
% 94.22/93.60  0 [] -v3631(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] -v3631(VarCurr)|v1180(VarCurr).
% 94.22/93.60  0 [] v3631(VarCurr)| -v3619(VarCurr)| -v1180(VarCurr).
% 94.22/93.60  0 [] -v3613(VarCurr)|v3614(VarCurr)|v3628(VarCurr).
% 94.22/93.60  0 [] v3613(VarCurr)| -v3614(VarCurr).
% 94.22/93.60  0 [] v3613(VarCurr)| -v3628(VarCurr).
% 94.22/93.60  0 [] -v3628(VarCurr)|v3629(VarCurr).
% 94.22/93.60  0 [] -v3628(VarCurr)|v1360(VarCurr).
% 94.22/93.60  0 [] v3628(VarCurr)| -v3629(VarCurr)| -v1360(VarCurr).
% 94.22/93.60  0 [] -v3629(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] v3629(VarCurr)| -v3619(VarCurr).
% 94.22/93.60  0 [] -v3614(VarCurr)|v3615(VarCurr)|v3626(VarCurr).
% 94.22/93.60  0 [] v3614(VarCurr)| -v3615(VarCurr).
% 94.22/93.60  0 [] v3614(VarCurr)| -v3626(VarCurr).
% 94.22/93.60  0 [] -v3626(VarCurr)|v3627(VarCurr).
% 94.22/93.60  0 [] -v3626(VarCurr)|v1278(VarCurr).
% 94.22/93.60  0 [] v3626(VarCurr)| -v3627(VarCurr)| -v1278(VarCurr).
% 94.22/93.60  0 [] -v3627(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] -v3627(VarCurr)|v1180(VarCurr).
% 94.22/93.60  0 [] v3627(VarCurr)| -v3619(VarCurr)| -v1180(VarCurr).
% 94.22/93.60  0 [] -v3615(VarCurr)|v3616(VarCurr)|v3624(VarCurr).
% 94.22/93.60  0 [] v3615(VarCurr)| -v3616(VarCurr).
% 94.22/93.60  0 [] v3615(VarCurr)| -v3624(VarCurr).
% 94.22/93.60  0 [] -v3624(VarCurr)|v3625(VarCurr).
% 94.22/93.60  0 [] -v3624(VarCurr)|v1355(VarCurr).
% 94.22/93.60  0 [] v3624(VarCurr)| -v3625(VarCurr)| -v1355(VarCurr).
% 94.22/93.60  0 [] -v3625(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] v3625(VarCurr)| -v3619(VarCurr).
% 94.22/93.60  0 [] -v3616(VarCurr)|v3617(VarCurr)|v3621(VarCurr).
% 94.22/93.60  0 [] v3616(VarCurr)| -v3617(VarCurr).
% 94.22/93.60  0 [] v3616(VarCurr)| -v3621(VarCurr).
% 94.22/93.60  0 [] -v3621(VarCurr)|v3622(VarCurr).
% 94.22/93.60  0 [] -v3621(VarCurr)|v1238(VarCurr).
% 94.22/93.60  0 [] v3621(VarCurr)| -v3622(VarCurr)| -v1238(VarCurr).
% 94.22/93.60  0 [] -v3622(VarCurr)|v3619(VarCurr).
% 94.22/93.60  0 [] -v3622(VarCurr)|v1180(VarCurr).
% 94.22/93.60  0 [] v3622(VarCurr)| -v3619(VarCurr)| -v1180(VarCurr).
% 94.22/93.60  0 [] -v3619(VarCurr)|v3620(VarCurr).
% 94.22/93.60  0 [] -v3619(VarCurr)|v1347(VarCurr).
% 94.22/93.60  0 [] v3619(VarCurr)| -v3620(VarCurr)| -v1347(VarCurr).
% 94.22/93.60  0 [] -v3617(VarCurr)|v3618(VarCurr).
% 94.22/93.60  0 [] -v3617(VarCurr)|v1348(VarCurr).
% 94.22/93.60  0 [] v3617(VarCurr)| -v3618(VarCurr)| -v1348(VarCurr).
% 94.22/93.60  0 [] -v3618(VarCurr)|v3620(VarCurr).
% 94.22/93.60  0 [] -v3618(VarCurr)|v1347(VarCurr).
% 94.22/93.60  0 [] v3618(VarCurr)| -v3620(VarCurr)| -v1347(VarCurr).
% 94.22/93.60  0 [] -v3620(VarCurr)|v1673(VarCurr).
% 94.22/93.60  0 [] -v3620(VarCurr)|v903(VarCurr).
% 94.22/93.60  0 [] v3620(VarCurr)| -v1673(VarCurr)| -v903(VarCurr).
% 94.22/93.60  0 [] -v38(VarCurr)|v40(VarCurr).
% 94.22/93.60  0 [] v38(VarCurr)| -v40(VarCurr).
% 94.22/93.60  0 [] -v40(VarCurr)|v42(VarCurr).
% 94.22/93.60  0 [] v40(VarCurr)| -v42(VarCurr).
% 94.22/93.60  0 [] -v42(VarCurr)|v44(VarCurr).
% 94.22/93.60  0 [] v42(VarCurr)| -v44(VarCurr).
% 94.22/93.60  0 [] -v44(VarCurr)|v46(VarCurr).
% 94.22/93.60  0 [] v44(VarCurr)| -v46(VarCurr).
% 94.22/93.60  0 [] -v46(VarCurr)|v48(VarCurr).
% 94.22/93.60  0 [] v46(VarCurr)| -v48(VarCurr).
% 94.22/93.60  0 [] -v48(VarCurr)|v50(VarCurr).
% 94.22/93.60  0 [] v48(VarCurr)| -v50(VarCurr).
% 94.22/93.60  0 [] -v50(VarCurr)|v52(VarCurr).
% 94.22/93.60  0 [] v50(VarCurr)| -v52(VarCurr).
% 94.22/93.60  0 [] -v52(VarCurr)|v54(VarCurr).
% 94.22/93.60  0 [] v52(VarCurr)| -v54(VarCurr).
% 94.22/93.60  0 [] -v54(VarCurr)|v56(VarCurr,bitIndex2).
% 94.22/93.60  0 [] v54(VarCurr)| -v56(VarCurr,bitIndex2).
% 94.22/93.60  0 [] -v56(VarNext,bitIndex2)|v3601(VarNext,bitIndex2).
% 94.22/93.60  0 [] v56(VarNext,bitIndex2)| -v3601(VarNext,bitIndex2).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3602(VarNext)| -range_3_0(B)| -v3601(VarNext,B)|v56(VarCurr,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3602(VarNext)| -range_3_0(B)|v3601(VarNext,B)| -v56(VarCurr,B).
% 94.22/93.60  0 [] -v3602(VarNext)| -range_3_0(B)| -v3601(VarNext,B)|v3588(VarNext,B).
% 94.22/93.60  0 [] -v3602(VarNext)| -range_3_0(B)|v3601(VarNext,B)| -v3588(VarNext,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3602(VarNext)|v3603(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3602(VarNext)| -v3603(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3603(VarNext)|v3605(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3603(VarNext)|v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3603(VarNext)| -v3605(VarNext)| -v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3605(VarNext)|v3582(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3605(VarNext)| -v3582(VarNext).
% 94.22/93.60  0 [] -v67(VarCurr,bitIndex2)|v3558(VarCurr,bitIndex2).
% 94.22/93.60  0 [] v67(VarCurr,bitIndex2)| -v3558(VarCurr,bitIndex2).
% 94.22/93.60  0 [] -v3555(VarCurr,bitIndex2)|v3556(VarCurr,bitIndex2).
% 94.22/93.60  0 [] v3555(VarCurr,bitIndex2)| -v3556(VarCurr,bitIndex2).
% 94.22/93.60  0 [] -v56(VarNext,bitIndex1)|v3593(VarNext,bitIndex1).
% 94.22/93.60  0 [] v56(VarNext,bitIndex1)| -v3593(VarNext,bitIndex1).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3594(VarNext)| -range_3_0(B)| -v3593(VarNext,B)|v56(VarCurr,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3594(VarNext)| -range_3_0(B)|v3593(VarNext,B)| -v56(VarCurr,B).
% 94.22/93.60  0 [] -v3594(VarNext)| -range_3_0(B)| -v3593(VarNext,B)|v3588(VarNext,B).
% 94.22/93.60  0 [] -v3594(VarNext)| -range_3_0(B)|v3593(VarNext,B)| -v3588(VarNext,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3594(VarNext)|v3595(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3594(VarNext)| -v3595(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3595(VarNext)|v3597(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3595(VarNext)|v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3595(VarNext)| -v3597(VarNext)| -v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3597(VarNext)|v3582(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3597(VarNext)| -v3582(VarNext).
% 94.22/93.60  0 [] -v67(VarCurr,bitIndex1)|v3558(VarCurr,bitIndex1).
% 94.22/93.60  0 [] v67(VarCurr,bitIndex1)| -v3558(VarCurr,bitIndex1).
% 94.22/93.60  0 [] -v3555(VarCurr,bitIndex1)|v3556(VarCurr,bitIndex1).
% 94.22/93.60  0 [] v3555(VarCurr,bitIndex1)| -v3556(VarCurr,bitIndex1).
% 94.22/93.60  0 [] -v56(VarNext,bitIndex3)|v3577(VarNext,bitIndex3).
% 94.22/93.60  0 [] v56(VarNext,bitIndex3)| -v3577(VarNext,bitIndex3).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3578(VarNext)| -range_3_0(B)| -v3577(VarNext,B)|v56(VarCurr,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3578(VarNext)| -range_3_0(B)|v3577(VarNext,B)| -v56(VarCurr,B).
% 94.22/93.60  0 [] -v3578(VarNext)| -range_3_0(B)| -v3577(VarNext,B)|v3588(VarNext,B).
% 94.22/93.60  0 [] -v3578(VarNext)| -range_3_0(B)|v3577(VarNext,B)| -v3588(VarNext,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v3588(VarNext,B)|v3586(VarCurr,B).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v3588(VarNext,B)| -v3586(VarCurr,B).
% 94.22/93.60  0 [] v3589(VarCurr)| -range_3_0(B)| -v3586(VarCurr,B)|v67(VarCurr,B).
% 94.22/93.60  0 [] v3589(VarCurr)| -range_3_0(B)|v3586(VarCurr,B)| -v67(VarCurr,B).
% 94.22/93.60  0 [] -v3589(VarCurr)| -range_3_0(B)| -v3586(VarCurr,B)|$F.
% 94.22/93.60  0 [] -v3589(VarCurr)| -range_3_0(B)|v3586(VarCurr,B)| -$F.
% 94.22/93.60  0 [] v3589(VarCurr)|v58(VarCurr).
% 94.22/93.60  0 [] -v3589(VarCurr)| -v58(VarCurr).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3578(VarNext)|v3579(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3578(VarNext)| -v3579(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3579(VarNext)|v3580(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3579(VarNext)|v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3579(VarNext)| -v3580(VarNext)| -v3573(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3580(VarNext)|v3582(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3580(VarNext)| -v3582(VarNext).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)| -v3582(VarNext)|v3573(VarCurr).
% 94.22/93.60  0 [] -nextState(VarCurr,VarNext)|v3582(VarNext)| -v3573(VarCurr).
% 94.22/93.60  0 [] -v3573(VarCurr)|v3575(VarCurr).
% 94.22/93.60  0 [] v3573(VarCurr)| -v3575(VarCurr).
% 94.22/93.60  0 [] -v3575(VarCurr)|v3531(VarCurr).
% 94.22/93.60  0 [] v3575(VarCurr)| -v3531(VarCurr).
% 94.22/93.60  0 [] -v67(VarCurr,bitIndex3)|v3558(VarCurr,bitIndex3).
% 94.22/93.60  0 [] v67(VarCurr,bitIndex3)| -v3558(VarCurr,bitIndex3).
% 94.22/93.60  0 [] v3559(VarCurr)| -range_3_0(B)| -v3558(VarCurr,B)|v3560(VarCurr,B).
% 94.22/93.60  0 [] v3559(VarCurr)| -range_3_0(B)|v3558(VarCurr,B)| -v3560(VarCurr,B).
% 94.22/93.60  0 [] -v3559(VarCurr)| -range_3_0(B)| -v3558(VarCurr,B)|$F.
% 94.22/93.60  0 [] -v3559(VarCurr)| -range_3_0(B)|v3558(VarCurr,B)| -$F.
% 94.22/93.60  0 [] v3561(VarCurr)|v3563(VarCurr)|v3567(VarCurr)| -range_3_0(B)| -v3560(VarCurr,B)|v56(VarCurr,B).
% 94.22/93.60  0 [] v3561(VarCurr)|v3563(VarCurr)|v3567(VarCurr)| -range_3_0(B)|v3560(VarCurr,B)| -v56(VarCurr,B).
% 94.22/93.60  0 [] -v3567(VarCurr)| -range_3_0(B)| -v3560(VarCurr,B)|v3569(VarCurr,B).
% 94.22/93.60  0 [] -v3567(VarCurr)| -range_3_0(B)|v3560(VarCurr,B)| -v3569(VarCurr,B).
% 94.22/93.60  0 [] -v3563(VarCurr)| -range_3_0(B)| -v3560(VarCurr,B)|v3565(VarCurr,B).
% 94.22/93.60  0 [] -v3563(VarCurr)| -range_3_0(B)|v3560(VarCurr,B)| -v3565(VarCurr,B).
% 94.22/93.60  0 [] -v3561(VarCurr)| -range_3_0(B)| -v3560(VarCurr,B)|v56(VarCurr,B).
% 94.22/93.60  0 [] -v3561(VarCurr)| -range_3_0(B)|v3560(VarCurr,B)| -v56(VarCurr,B).
% 94.22/93.60  0 [] -v3570(VarCurr)| -v3571(VarCurr,bitIndex1)|$T.
% 94.22/93.60  0 [] -v3570(VarCurr)|v3571(VarCurr,bitIndex1)| -$T.
% 94.22/93.60  0 [] -v3570(VarCurr)| -v3571(VarCurr,bitIndex0)|$T.
% 94.22/93.60  0 [] -v3570(VarCurr)|v3571(VarCurr,bitIndex0)| -$T.
% 94.22/93.60  0 [] v3570(VarCurr)|v3571(VarCurr,bitIndex1)|$T|v3571(VarCurr,bitIndex0).
% 94.22/93.60  0 [] v3570(VarCurr)| -v3571(VarCurr,bitIndex1)| -$T| -v3571(VarCurr,bitIndex0).
% 94.22/93.60  0 [] -v3571(VarCurr,bitIndex0)|v3447(VarCurr).
% 94.22/93.60  0 [] v3571(VarCurr,bitIndex0)| -v3447(VarCurr).
% 94.22/93.60  0 [] -v3571(VarCurr,bitIndex1)|v69(VarCurr).
% 94.22/93.60  0 [] v3571(VarCurr,bitIndex1)| -v69(VarCurr).
% 94.22/93.61  0 [] -v3569(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3569(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] -range_3_1(B)| -v3569(VarCurr,B)|v3555(VarCurr,B).
% 94.22/93.61  0 [] -range_3_1(B)|v3569(VarCurr,B)| -v3555(VarCurr,B).
% 94.22/93.61  0 [] -range_3_1(B)|bitIndex1=B|bitIndex2=B|bitIndex3=B.
% 94.22/93.61  0 [] range_3_1(B)|bitIndex1!=B.
% 94.22/93.61  0 [] range_3_1(B)|bitIndex2!=B.
% 94.22/93.61  0 [] range_3_1(B)|bitIndex3!=B.
% 94.22/93.61  0 [] -v3567(VarCurr)| -v3568(VarCurr,bitIndex1)|$T.
% 94.22/93.61  0 [] -v3567(VarCurr)|v3568(VarCurr,bitIndex1)| -$T.
% 94.22/93.61  0 [] -v3567(VarCurr)| -v3568(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] -v3567(VarCurr)|v3568(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] v3567(VarCurr)|v3568(VarCurr,bitIndex1)|$T|v3568(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] v3567(VarCurr)|v3568(VarCurr,bitIndex1)|$T| -v3568(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] v3567(VarCurr)| -v3568(VarCurr,bitIndex1)| -$T|v3568(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] v3567(VarCurr)| -v3568(VarCurr,bitIndex1)| -$T| -v3568(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] -v3568(VarCurr,bitIndex0)|v3447(VarCurr).
% 94.22/93.61  0 [] v3568(VarCurr,bitIndex0)| -v3447(VarCurr).
% 94.22/93.61  0 [] -v3568(VarCurr,bitIndex1)|v69(VarCurr).
% 94.22/93.61  0 [] v3568(VarCurr,bitIndex1)| -v69(VarCurr).
% 94.22/93.61  0 [] -v3565(VarCurr,bitIndex2)|v56(VarCurr,bitIndex3).
% 94.22/93.61  0 [] v3565(VarCurr,bitIndex2)| -v56(VarCurr,bitIndex3).
% 94.22/93.61  0 [] -v3565(VarCurr,bitIndex1)|v56(VarCurr,bitIndex2).
% 94.22/93.61  0 [] v3565(VarCurr,bitIndex1)| -v56(VarCurr,bitIndex2).
% 94.22/93.61  0 [] -v3565(VarCurr,bitIndex0)|v56(VarCurr,bitIndex1).
% 94.22/93.61  0 [] v3565(VarCurr,bitIndex0)| -v56(VarCurr,bitIndex1).
% 94.22/93.61  0 [] -v3565(VarCurr,bitIndex3)|$F.
% 94.22/93.61  0 [] v3565(VarCurr,bitIndex3)| -$F.
% 94.22/93.61  0 [] -v3563(VarCurr)| -v3564(VarCurr,bitIndex1)|$F.
% 94.22/93.61  0 [] -v3563(VarCurr)|v3564(VarCurr,bitIndex1)| -$F.
% 94.22/93.61  0 [] -v3563(VarCurr)| -v3564(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] -v3563(VarCurr)|v3564(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3563(VarCurr)|v3564(VarCurr,bitIndex1)|$F|v3564(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3563(VarCurr)|v3564(VarCurr,bitIndex1)|$F| -v3564(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3563(VarCurr)| -v3564(VarCurr,bitIndex1)| -$F|v3564(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3563(VarCurr)| -v3564(VarCurr,bitIndex1)| -$F| -v3564(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] -v3564(VarCurr,bitIndex0)|v3447(VarCurr).
% 94.22/93.61  0 [] v3564(VarCurr,bitIndex0)| -v3447(VarCurr).
% 94.22/93.61  0 [] -v3564(VarCurr,bitIndex1)|v69(VarCurr).
% 94.22/93.61  0 [] v3564(VarCurr,bitIndex1)| -v69(VarCurr).
% 94.22/93.61  0 [] -v3561(VarCurr)| -v3562(VarCurr,bitIndex1)|$F.
% 94.22/93.61  0 [] -v3561(VarCurr)|v3562(VarCurr,bitIndex1)| -$F.
% 94.22/93.61  0 [] -v3561(VarCurr)| -v3562(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] -v3561(VarCurr)|v3562(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] v3561(VarCurr)|v3562(VarCurr,bitIndex1)|$F|v3562(VarCurr,bitIndex0).
% 94.22/93.61  0 [] v3561(VarCurr)| -v3562(VarCurr,bitIndex1)| -$F| -v3562(VarCurr,bitIndex0).
% 94.22/93.61  0 [] -v3562(VarCurr,bitIndex0)|v3447(VarCurr).
% 94.22/93.61  0 [] v3562(VarCurr,bitIndex0)| -v3447(VarCurr).
% 94.22/93.61  0 [] -v3562(VarCurr,bitIndex1)|v69(VarCurr).
% 94.22/93.61  0 [] v3562(VarCurr,bitIndex1)| -v69(VarCurr).
% 94.22/93.61  0 [] v3559(VarCurr)|v58(VarCurr).
% 94.22/93.61  0 [] -v3559(VarCurr)| -v58(VarCurr).
% 94.22/93.61  0 [] -v3555(VarCurr,bitIndex3)|v3556(VarCurr,bitIndex3).
% 94.22/93.61  0 [] v3555(VarCurr,bitIndex3)| -v3556(VarCurr,bitIndex3).
% 94.22/93.61  0 [] -v3556(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] v3556(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] -v3556(VarCurr,bitIndex3)|v56(VarCurr,bitIndex2).
% 94.22/93.61  0 [] v3556(VarCurr,bitIndex3)| -v56(VarCurr,bitIndex2).
% 94.22/93.61  0 [] -v3556(VarCurr,bitIndex2)|v56(VarCurr,bitIndex1).
% 94.22/93.61  0 [] v3556(VarCurr,bitIndex2)| -v56(VarCurr,bitIndex1).
% 94.22/93.61  0 [] -v3556(VarCurr,bitIndex1)|v56(VarCurr,bitIndex0).
% 94.22/93.61  0 [] v3556(VarCurr,bitIndex1)| -v56(VarCurr,bitIndex0).
% 94.22/93.61  0 [] -range_3_0(B)| -v56(constB0,B)|$F.
% 94.22/93.61  0 [] -range_3_0(B)|v56(constB0,B)| -$F.
% 94.22/93.61  0 [] -v3447(VarCurr)|v3449(VarCurr).
% 94.22/93.61  0 [] v3447(VarCurr)| -v3449(VarCurr).
% 94.22/93.61  0 [] -v3449(VarCurr)|v3451(VarCurr).
% 94.22/93.61  0 [] v3449(VarCurr)| -v3451(VarCurr).
% 94.22/93.61  0 [] v3551(VarCurr)|v3552(VarCurr)| -v3451(VarCurr)|$F.
% 94.22/93.61  0 [] v3551(VarCurr)|v3552(VarCurr)|v3451(VarCurr)| -$F.
% 94.22/93.61  0 [] -v3552(VarCurr)| -v3451(VarCurr)|$T.
% 94.22/93.61  0 [] -v3552(VarCurr)|v3451(VarCurr)| -$T.
% 94.22/93.61  0 [] -v3551(VarCurr)| -v3451(VarCurr)|$F.
% 94.22/93.61  0 [] -v3551(VarCurr)|v3451(VarCurr)| -$F.
% 94.22/93.61  0 [] -v3552(VarCurr)| -v3453(VarCurr,bitIndex1)|$F.
% 94.22/93.61  0 [] -v3552(VarCurr)|v3453(VarCurr,bitIndex1)| -$F.
% 94.22/93.61  0 [] -v3552(VarCurr)| -v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] -v3552(VarCurr)|v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3552(VarCurr)|v3453(VarCurr,bitIndex1)|$F|v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3552(VarCurr)|v3453(VarCurr,bitIndex1)|$F| -v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3552(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F|v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3552(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F| -v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] -v3551(VarCurr)| -v3453(VarCurr,bitIndex1)|$F.
% 94.22/93.61  0 [] -v3551(VarCurr)|v3453(VarCurr,bitIndex1)| -$F.
% 94.22/93.61  0 [] -v3551(VarCurr)| -v3453(VarCurr,bitIndex0)|$F.
% 94.22/93.61  0 [] -v3551(VarCurr)|v3453(VarCurr,bitIndex0)| -$F.
% 94.22/93.61  0 [] v3551(VarCurr)|v3453(VarCurr,bitIndex1)|$F|v3453(VarCurr,bitIndex0).
% 94.22/93.61  0 [] v3551(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F| -v3453(VarCurr,bitIndex0).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3536(VarNext)| -range_1_0(B)| -v3453(VarNext,B)|v3453(VarCurr,B).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3536(VarNext)| -range_1_0(B)|v3453(VarNext,B)| -v3453(VarCurr,B).
% 94.22/93.61  0 [] -v3536(VarNext)| -range_1_0(B)| -v3453(VarNext,B)|v3546(VarNext,B).
% 94.22/93.61  0 [] -v3536(VarNext)| -range_1_0(B)|v3453(VarNext,B)| -v3546(VarNext,B).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -range_1_0(B)| -v3546(VarNext,B)|v3544(VarCurr,B).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -range_1_0(B)|v3546(VarNext,B)| -v3544(VarCurr,B).
% 94.22/93.61  0 [] v3547(VarCurr)| -range_1_0(B)| -v3544(VarCurr,B)|v3455(VarCurr,B).
% 94.22/93.61  0 [] v3547(VarCurr)| -range_1_0(B)|v3544(VarCurr,B)| -v3455(VarCurr,B).
% 94.22/93.61  0 [] -v3547(VarCurr)| -range_1_0(B)| -v3544(VarCurr,B)|$F.
% 94.22/93.61  0 [] -v3547(VarCurr)| -range_1_0(B)|v3544(VarCurr,B)| -$F.
% 94.22/93.61  0 [] -v3547(VarCurr)| -v62(VarCurr)|$F.
% 94.22/93.61  0 [] -v3547(VarCurr)|v62(VarCurr)| -$F.
% 94.22/93.61  0 [] v3547(VarCurr)|v62(VarCurr)|$F.
% 94.22/93.61  0 [] v3547(VarCurr)| -v62(VarCurr)| -$F.
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -v3536(VarNext)|v3537(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3536(VarNext)| -v3537(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -v3537(VarNext)|v3538(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -v3537(VarNext)|v3531(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3537(VarNext)| -v3538(VarNext)| -v3531(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3538(VarNext)|v3540(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -v3538(VarNext)| -v3540(VarNext).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)| -v3540(VarNext)|v3531(VarCurr).
% 94.22/93.61  0 [] -nextState(VarCurr,VarNext)|v3540(VarNext)| -v3531(VarCurr).
% 94.22/93.61  0 [] -v3531(VarCurr)|v3533(VarCurr).
% 94.22/93.61  0 [] v3531(VarCurr)| -v3533(VarCurr).
% 94.22/93.61  0 [] -v3533(VarCurr)|v1(VarCurr).
% 94.22/93.61  0 [] v3533(VarCurr)| -v1(VarCurr).
% 94.22/93.61  0 [] v3520(VarCurr)|v3529(VarCurr)| -range_1_0(B)| -v3455(VarCurr,B)|$F.
% 94.22/93.61  0 [] v3520(VarCurr)|v3529(VarCurr)| -range_1_0(B)|v3455(VarCurr,B)| -$F.
% 94.22/93.61  0 [] -v3529(VarCurr)| -range_1_0(B)| -v3455(VarCurr,B)|$F.
% 94.22/93.61  0 [] -v3529(VarCurr)| -range_1_0(B)|v3455(VarCurr,B)| -$F.
% 94.22/93.61  0 [] -v3520(VarCurr)| -range_1_0(B)| -v3455(VarCurr,B)|v3521(VarCurr,B).
% 94.22/93.61  0 [] -v3520(VarCurr)| -range_1_0(B)|v3455(VarCurr,B)| -v3521(VarCurr,B).
% 94.22/93.61  0 [] -v3529(VarCurr)| -v3453(VarCurr,bitIndex1)|$F.
% 94.22/93.61  0 [] -v3529(VarCurr)|v3453(VarCurr,bitIndex1)| -$F.
% 94.22/93.61  0 [] -v3529(VarCurr)| -v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] -v3529(VarCurr)|v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3529(VarCurr)|v3453(VarCurr,bitIndex1)|$F|v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3529(VarCurr)|v3453(VarCurr,bitIndex1)|$F| -v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3529(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F|v3453(VarCurr,bitIndex0)|$T.
% 94.22/93.61  0 [] v3529(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F| -v3453(VarCurr,bitIndex0)| -$T.
% 94.22/93.61  0 [] v3522(VarCurr)| -range_1_0(B)| -v3521(VarCurr,B)|v3524(VarCurr,B).
% 94.22/93.61  0 [] v3522(VarCurr)| -range_1_0(B)|v3521(VarCurr,B)| -v3524(VarCurr,B).
% 94.22/93.61  0 [] -v3522(VarCurr)| -range_1_0(B)| -v3521(VarCurr,B)|$F.
% 94.22/93.61  0 [] -v3522(VarCurr)| -range_1_0(B)|v3521(VarCurr,B)| -$F.
% 94.22/93.61  0 [] v3525(VarCurr)| -range_1_0(B)| -v3524(VarCurr,B)|b01(B).
% 94.22/93.61  0 [] v3525(VarCurr)| -range_1_0(B)|v3524(VarCurr,B)| -b01(B).
% 94.22/93.61  0 [] -v3525(VarCurr)| -range_1_0(B)| -v3524(VarCurr,B)|$F.
% 94.22/93.61  0 [] -v3525(VarCurr)| -range_1_0(B)|v3524(VarCurr,B)| -$F.
% 94.22/93.61  0 [] -v3527(VarCurr)| -v3528(VarCurr)|$F.
% 94.22/93.61  0 [] -v3527(VarCurr)|v3528(VarCurr)| -$F.
% 94.22/93.61  0 [] v3527(VarCurr)|v3528(VarCurr)|$F.
% 94.22/93.61  0 [] v3527(VarCurr)| -v3528(VarCurr)| -$F.
% 94.22/93.61  0 [] -v3528(VarCurr)|v3500(VarCurr)|v3502(VarCurr).
% 94.22/93.61  0 [] v3528(VarCurr)| -v3500(VarCurr).
% 94.22/93.61  0 [] v3528(VarCurr)| -v3502(VarCurr).
% 94.22/93.61  0 [] -v3525(VarCurr)| -v3526(VarCurr)|$T.
% 94.22/93.62  0 [] -v3525(VarCurr)|v3526(VarCurr)| -$T.
% 94.22/93.62  0 [] v3525(VarCurr)|v3526(VarCurr)|$T.
% 94.22/93.62  0 [] v3525(VarCurr)| -v3526(VarCurr)| -$T.
% 94.22/93.62  0 [] -v3526(VarCurr)|v3500(VarCurr)|v3502(VarCurr).
% 94.22/93.62  0 [] v3526(VarCurr)| -v3500(VarCurr).
% 94.22/93.62  0 [] v3526(VarCurr)| -v3502(VarCurr).
% 94.22/93.62  0 [] -v3500(constB0)|$F.
% 94.22/93.62  0 [] v3500(constB0)| -$F.
% 94.22/93.62  0 [] -v3523(VarCurr)| -v3457(VarCurr)|$F.
% 94.22/93.62  0 [] -v3523(VarCurr)|v3457(VarCurr)| -$F.
% 94.22/93.62  0 [] v3523(VarCurr)|v3457(VarCurr)|$F.
% 94.22/93.62  0 [] v3523(VarCurr)| -v3457(VarCurr)| -$F.
% 94.22/93.62  0 [] -v3522(VarCurr)| -v3457(VarCurr)|$T.
% 94.22/93.62  0 [] -v3522(VarCurr)|v3457(VarCurr)| -$T.
% 94.22/93.62  0 [] v3522(VarCurr)|v3457(VarCurr)|$T.
% 94.22/93.62  0 [] v3522(VarCurr)| -v3457(VarCurr)| -$T.
% 94.22/93.62  0 [] -v3520(VarCurr)| -v3453(VarCurr,bitIndex1)|$F.
% 94.22/93.62  0 [] -v3520(VarCurr)|v3453(VarCurr,bitIndex1)| -$F.
% 94.22/93.62  0 [] -v3520(VarCurr)| -v3453(VarCurr,bitIndex0)|$F.
% 94.22/93.62  0 [] -v3520(VarCurr)|v3453(VarCurr,bitIndex0)| -$F.
% 94.22/93.62  0 [] v3520(VarCurr)|v3453(VarCurr,bitIndex1)|$F|v3453(VarCurr,bitIndex0).
% 94.22/93.62  0 [] v3520(VarCurr)| -v3453(VarCurr,bitIndex1)| -$F| -v3453(VarCurr,bitIndex0).
% 94.22/93.62  0 [] -range_1_0(B)| -v3453(constB0,B)|$F.
% 94.22/93.62  0 [] -range_1_0(B)|v3453(constB0,B)| -$F.
% 94.22/93.62  0 [] -v3502(VarCurr)|v3504(VarCurr).
% 94.22/93.62  0 [] v3502(VarCurr)| -v3504(VarCurr).
% 94.22/93.62  0 [] -v3504(VarCurr)|v3506(VarCurr).
% 94.22/93.62  0 [] v3504(VarCurr)| -v3506(VarCurr).
% 94.22/93.62  0 [] -v3506(VarCurr)|v3508(VarCurr).
% 94.22/93.62  0 [] v3506(VarCurr)| -v3508(VarCurr).
% 94.22/93.62  0 [] -v3508(VarCurr)|v3510(VarCurr).
% 94.22/93.62  0 [] v3508(VarCurr)| -v3510(VarCurr).
% 94.22/93.62  0 [] -v3510(VarCurr)|v3512(VarCurr).
% 94.22/93.62  0 [] v3510(VarCurr)| -v3512(VarCurr).
% 94.22/93.62  0 [] -v3512(VarCurr)|v3514(VarCurr).
% 94.22/93.62  0 [] v3512(VarCurr)| -v3514(VarCurr).
% 94.22/93.62  0 [] -v3514(VarCurr)|v3516(VarCurr,bitIndex6).
% 94.22/93.62  0 [] v3514(VarCurr)| -v3516(VarCurr,bitIndex6).
% 94.22/93.62  0 [] -v3516(constB0,bitIndex6).
% 94.22/93.62  0 [] -bx0xxxxxx(bitIndex6).
% 94.22/93.62  0 [] -v3457(VarCurr)|v3459(VarCurr).
% 94.22/93.62  0 [] v3457(VarCurr)| -v3459(VarCurr).
% 94.22/93.62  0 [] -v3459(VarCurr)|v3493(VarCurr).
% 94.22/93.62  0 [] -v3459(VarCurr)|v3489(VarCurr).
% 94.22/93.62  0 [] v3459(VarCurr)| -v3493(VarCurr)| -v3489(VarCurr).
% 94.22/93.62  0 [] -v3493(VarCurr)|v3494(VarCurr).
% 94.22/93.62  0 [] -v3493(VarCurr)|v3485(VarCurr).
% 94.22/93.62  0 [] v3493(VarCurr)| -v3494(VarCurr)| -v3485(VarCurr).
% 94.22/93.62  0 [] -v3494(VarCurr)|v3495(VarCurr).
% 94.22/93.62  0 [] -v3494(VarCurr)|v3481(VarCurr).
% 94.22/93.62  0 [] v3494(VarCurr)| -v3495(VarCurr)| -v3481(VarCurr).
% 94.22/93.62  0 [] -v3495(VarCurr)|v3496(VarCurr).
% 94.22/93.62  0 [] -v3495(VarCurr)|v3477(VarCurr).
% 94.22/93.62  0 [] v3495(VarCurr)| -v3496(VarCurr)| -v3477(VarCurr).
% 94.22/93.62  0 [] -v3496(VarCurr)|v3497(VarCurr).
% 94.22/93.62  0 [] -v3496(VarCurr)|v3473(VarCurr).
% 94.22/93.62  0 [] v3496(VarCurr)| -v3497(VarCurr)| -v3473(VarCurr).
% 94.22/93.62  0 [] -v3497(VarCurr)|v3498(VarCurr).
% 94.22/93.62  0 [] -v3497(VarCurr)|v3469(VarCurr).
% 94.22/93.62  0 [] v3497(VarCurr)| -v3498(VarCurr)| -v3469(VarCurr).
% 94.22/93.62  0 [] -v3498(VarCurr)|v3461(VarCurr).
% 94.22/93.62  0 [] -v3498(VarCurr)|v3465(VarCurr).
% 94.22/93.62  0 [] v3498(VarCurr)| -v3461(VarCurr)| -v3465(VarCurr).
% 94.22/93.62  0 [] -v3489(VarCurr)|v3491(VarCurr).
% 94.22/93.62  0 [] v3489(VarCurr)| -v3491(VarCurr).
% 94.22/93.62  0 [] -v3491(constB0)|$T.
% 94.22/93.62  0 [] v3491(constB0)| -$T.
% 94.22/93.62  0 [] -v3485(VarCurr)|v3487(VarCurr).
% 94.22/93.62  0 [] v3485(VarCurr)| -v3487(VarCurr).
% 94.22/93.62  0 [] -v3487(constB0)|$T.
% 94.22/93.62  0 [] v3487(constB0)| -$T.
% 94.22/93.62  0 [] -v3481(VarCurr)|v3483(VarCurr).
% 94.22/93.62  0 [] v3481(VarCurr)| -v3483(VarCurr).
% 94.22/93.62  0 [] -v3483(constB0)|$T.
% 94.22/93.62  0 [] v3483(constB0)| -$T.
% 94.22/93.62  0 [] -v3477(VarCurr)|v3479(VarCurr).
% 94.22/93.62  0 [] v3477(VarCurr)| -v3479(VarCurr).
% 94.22/93.62  0 [] -v3479(constB0)|$T.
% 94.22/93.62  0 [] v3479(constB0)| -$T.
% 94.22/93.62  0 [] -v3473(VarCurr)|v3475(VarCurr).
% 94.22/93.62  0 [] v3473(VarCurr)| -v3475(VarCurr).
% 94.22/93.62  0 [] -v3475(constB0)|$T.
% 94.22/93.62  0 [] v3475(constB0)| -$T.
% 94.22/93.62  0 [] -v3469(VarCurr)|v3471(VarCurr).
% 94.22/93.62  0 [] v3469(VarCurr)| -v3471(VarCurr).
% 94.22/93.62  0 [] -v3471(constB0)|$T.
% 94.22/93.62  0 [] v3471(constB0)| -$T.
% 94.22/93.62  0 [] -v3465(VarCurr)|v3467(VarCurr).
% 94.22/93.62  0 [] v3465(VarCurr)| -v3467(VarCurr).
% 94.22/93.62  0 [] -v3467(constB0)|$T.
% 94.22/93.62  0 [] v3467(constB0)| -$T.
% 94.22/93.62  0 [] -v3461(VarCurr)|v3463(VarCurr).
% 94.22/93.62  0 [] v3461(VarCurr)| -v3463(VarCurr).
% 94.22/93.62  0 [] -v3463(constB0)|$T.
% 94.22/93.62  0 [] v3463(constB0)| -$T.
% 94.22/93.62  0 [] -v69(VarCurr)|v71(VarCurr).
% 94.22/93.62  0 [] v69(VarCurr)| -v71(VarCurr).
% 94.22/93.62  0 [] -v71(VarCurr)|v73(VarCurr).
% 94.22/93.62  0 [] v71(VarCurr)| -v73(VarCurr).
% 94.22/93.62  0 [] -v73(VarCurr)|v75(VarCurr).
% 94.22/93.62  0 [] v73(VarCurr)| -v75(VarCurr).
% 94.22/93.62  0 [] -v75(VarCurr)|v77(VarCurr).
% 94.22/93.62  0 [] v75(VarCurr)| -v77(VarCurr).
% 94.22/93.62  0 [] -v77(VarCurr)|v79(VarCurr).
% 94.22/93.62  0 [] v77(VarCurr)| -v79(VarCurr).
% 94.22/93.62  0 [] -v79(VarCurr)|v81(VarCurr).
% 94.22/93.62  0 [] v79(VarCurr)| -v81(VarCurr).
% 94.22/93.62  0 [] -v81(VarCurr)|v83(VarCurr).
% 94.22/93.62  0 [] v81(VarCurr)| -v83(VarCurr).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3426(VarNext)| -v83(VarNext)|v83(VarCurr).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3426(VarNext)|v83(VarNext)| -v83(VarCurr).
% 94.22/93.62  0 [] -v3426(VarNext)| -v83(VarNext)|v3434(VarNext).
% 94.22/93.62  0 [] -v3426(VarNext)|v83(VarNext)| -v3434(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)| -v3434(VarNext)|v3432(VarCurr).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3434(VarNext)| -v3432(VarCurr).
% 94.22/93.62  0 [] v3435(VarCurr)| -v3432(VarCurr)|v3436(VarCurr).
% 94.22/93.62  0 [] v3435(VarCurr)|v3432(VarCurr)| -v3436(VarCurr).
% 94.22/93.62  0 [] -v3435(VarCurr)| -v3432(VarCurr)|$F.
% 94.22/93.62  0 [] -v3435(VarCurr)|v3432(VarCurr)| -$F.
% 94.22/93.62  0 [] v3437(VarCurr)| -v3436(VarCurr)|$F.
% 94.22/93.62  0 [] v3437(VarCurr)|v3436(VarCurr)| -$F.
% 94.22/93.62  0 [] -v3437(VarCurr)| -v3436(VarCurr)|$T.
% 94.22/93.62  0 [] -v3437(VarCurr)|v3436(VarCurr)| -$T.
% 94.22/93.62  0 [] -v3437(VarCurr)|v3438(VarCurr)|v3442(VarCurr).
% 94.22/93.62  0 [] v3437(VarCurr)| -v3438(VarCurr).
% 94.22/93.62  0 [] v3437(VarCurr)| -v3442(VarCurr).
% 94.22/93.62  0 [] -v3442(VarCurr)|v31(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v3442(VarCurr)|v3443(VarCurr).
% 94.22/93.62  0 [] v3442(VarCurr)| -v31(VarCurr,bitIndex9)| -v3443(VarCurr).
% 94.22/93.62  0 [] v3443(VarCurr)|v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v3443(VarCurr)| -v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v3438(VarCurr)|v3439(VarCurr)|v3420(VarCurr).
% 94.22/93.62  0 [] v3438(VarCurr)| -v3439(VarCurr).
% 94.22/93.62  0 [] v3438(VarCurr)| -v3420(VarCurr).
% 94.22/93.62  0 [] -v3439(VarCurr)|v3440(VarCurr)|v3415(VarCurr).
% 94.22/93.62  0 [] v3439(VarCurr)| -v3440(VarCurr).
% 94.22/93.62  0 [] v3439(VarCurr)| -v3415(VarCurr).
% 94.22/93.62  0 [] -v3440(VarCurr)|v3441(VarCurr)|v879(VarCurr).
% 94.22/93.62  0 [] v3440(VarCurr)| -v3441(VarCurr).
% 94.22/93.62  0 [] v3440(VarCurr)| -v879(VarCurr).
% 94.22/93.62  0 [] -v3441(VarCurr)|v85(VarCurr)|v3410(VarCurr).
% 94.22/93.62  0 [] v3441(VarCurr)| -v85(VarCurr).
% 94.22/93.62  0 [] v3441(VarCurr)| -v3410(VarCurr).
% 94.22/93.62  0 [] v3435(VarCurr)|v33(VarCurr).
% 94.22/93.62  0 [] -v3435(VarCurr)| -v33(VarCurr).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)| -v3426(VarNext)|v3427(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3426(VarNext)| -v3427(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)| -v3427(VarNext)|v3428(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)| -v3427(VarNext)|v1240(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3427(VarNext)| -v3428(VarNext)| -v1240(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)|v3428(VarNext)|v1247(VarNext).
% 94.22/93.62  0 [] -nextState(VarCurr,VarNext)| -v3428(VarNext)| -v1247(VarNext).
% 94.22/93.62  0 [] -v3420(VarCurr)|v31(VarCurr,bitIndex8).
% 94.22/93.62  0 [] -v3420(VarCurr)|v3422(VarCurr).
% 94.22/93.62  0 [] v3420(VarCurr)| -v31(VarCurr,bitIndex8)| -v3422(VarCurr).
% 94.22/93.62  0 [] v3422(VarCurr)|v3423(VarCurr).
% 94.22/93.62  0 [] -v3422(VarCurr)| -v3423(VarCurr).
% 94.22/93.62  0 [] -v3423(VarCurr)|v36(VarCurr,bitIndex8)|v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] v3423(VarCurr)| -v36(VarCurr,bitIndex8).
% 94.22/93.62  0 [] v3423(VarCurr)| -v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v3415(VarCurr)|v31(VarCurr,bitIndex5).
% 94.22/93.62  0 [] -v3415(VarCurr)|v3417(VarCurr).
% 94.22/93.62  0 [] v3415(VarCurr)| -v31(VarCurr,bitIndex5)| -v3417(VarCurr).
% 94.22/93.62  0 [] v3417(VarCurr)|v3418(VarCurr).
% 94.22/93.62  0 [] -v3417(VarCurr)| -v3418(VarCurr).
% 94.22/93.62  0 [] -v3418(VarCurr)|v36(VarCurr,bitIndex5)|v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] v3418(VarCurr)| -v36(VarCurr,bitIndex5).
% 94.22/93.62  0 [] v3418(VarCurr)| -v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v3410(VarCurr)|v31(VarCurr,bitIndex2).
% 94.22/93.62  0 [] -v3410(VarCurr)|v3412(VarCurr).
% 94.22/93.62  0 [] v3410(VarCurr)| -v31(VarCurr,bitIndex2)| -v3412(VarCurr).
% 94.22/93.62  0 [] v3412(VarCurr)|v3413(VarCurr).
% 94.22/93.62  0 [] -v3412(VarCurr)| -v3413(VarCurr).
% 94.22/93.62  0 [] -v3413(VarCurr)|v36(VarCurr,bitIndex2)|v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] v3413(VarCurr)| -v36(VarCurr,bitIndex2).
% 94.22/93.62  0 [] v3413(VarCurr)| -v36(VarCurr,bitIndex9).
% 94.22/93.62  0 [] -v85(VarCurr)|v36(VarCurr,bitIndex3).
% 94.22/93.62  0 [] v85(VarCurr)| -v36(VarCurr,bitIndex3).
% 94.22/93.62  0 [] v3398(VarCurr)| -v36(VarCurr,bitIndex3)|$F.
% 94.22/93.62  0 [] v3398(VarCurr)|v36(VarCurr,bitIndex3)| -$F.
% 94.22/93.62  0 [] -v3398(VarCurr)| -v36(VarCurr,bitIndex3)|$T.
% 94.22/93.62  0 [] -v3398(VarCurr)|v36(VarCurr,bitIndex3)| -$T.
% 94.22/93.62  0 [] -v3398(VarCurr)|v3399(VarCurr)|v3407(VarCurr).
% 94.22/93.62  0 [] v3398(VarCurr)| -v3399(VarCurr).
% 94.22/93.62  0 [] v3398(VarCurr)| -v3407(VarCurr).
% 94.22/93.62  0 [] -v3407(VarCurr)|v3408(VarCurr).
% 94.22/93.62  0 [] -v3407(VarCurr)|v3348(VarCurr).
% 94.22/93.62  0 [] v3407(VarCurr)| -v3408(VarCurr)| -v3348(VarCurr).
% 94.22/93.62  0 [] v3408(VarCurr)|v38(VarCurr).
% 94.22/93.62  0 [] -v3408(VarCurr)| -v38(VarCurr).
% 94.22/93.62  0 [] -v3399(VarCurr)|v3400(VarCurr)|v3405(VarCurr).
% 94.22/93.62  0 [] v3399(VarCurr)| -v3400(VarCurr).
% 94.22/93.62  0 [] v3399(VarCurr)| -v3405(VarCurr).
% 94.22/93.63  0 [] -v3405(VarCurr)|v3406(VarCurr).
% 94.22/93.63  0 [] -v3405(VarCurr)|v1360(VarCurr).
% 94.22/93.63  0 [] v3405(VarCurr)| -v3406(VarCurr)| -v1360(VarCurr).
% 94.22/93.63  0 [] -v3406(VarCurr)|v3346(VarCurr).
% 94.22/93.63  0 [] -v3406(VarCurr)|v1682(VarCurr).
% 94.22/93.63  0 [] v3406(VarCurr)| -v3346(VarCurr)| -v1682(VarCurr).
% 94.22/93.63  0 [] -v3400(VarCurr)|v3401(VarCurr)|v3403(VarCurr).
% 94.22/93.63  0 [] v3400(VarCurr)| -v3401(VarCurr).
% 94.22/93.63  0 [] v3400(VarCurr)| -v3403(VarCurr).
% 94.22/93.63  0 [] -v3403(VarCurr)|v3404(VarCurr).
% 94.22/93.63  0 [] -v3403(VarCurr)|v1355(VarCurr).
% 94.22/93.63  0 [] v3403(VarCurr)| -v3404(VarCurr)| -v1355(VarCurr).
% 94.22/93.63  0 [] -v3404(VarCurr)|v3346(VarCurr).
% 94.22/93.63  0 [] -v3404(VarCurr)|v1682(VarCurr).
% 94.22/93.63  0 [] v3404(VarCurr)| -v3346(VarCurr)| -v1682(VarCurr).
% 94.22/93.63  0 [] -v3401(VarCurr)|v3402(VarCurr).
% 94.22/93.63  0 [] -v3401(VarCurr)|v1348(VarCurr).
% 94.22/93.63  0 [] v3401(VarCurr)| -v3402(VarCurr)| -v1348(VarCurr).
% 94.22/93.63  0 [] -v3402(VarCurr)|v3346(VarCurr).
% 94.22/93.63  0 [] -v3402(VarCurr)|v1682(VarCurr).
% 94.22/93.63  0 [] v3402(VarCurr)| -v3346(VarCurr)| -v1682(VarCurr).
% 94.22/93.63  0 [] -v87(VarCurr)|v89(VarCurr).
% 94.22/93.63  0 [] v87(VarCurr)| -v89(VarCurr).
% 94.22/93.63  0 [] -v89(VarCurr)|v91(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v89(VarCurr)| -v91(VarCurr,bitIndex0).
% 94.22/93.63  0 [] -v91(VarCurr,bitIndex0)|v898(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v91(VarCurr,bitIndex0)| -v898(VarCurr,bitIndex0).
% 94.22/93.63  0 [] -v892(VarCurr,bitIndex0)|v896(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v892(VarCurr,bitIndex0)| -v896(VarCurr,bitIndex0).
% 94.22/93.63  0 [] -v885(VarCurr,bitIndex0)|v889(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v885(VarCurr,bitIndex0)| -v889(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v93(VarCurr)|v3396(VarCurr).
% 94.22/93.63  0 [] -v93(VarCurr)| -v3396(VarCurr).
% 94.22/93.63  0 [] -v3396(VarCurr)|v3358(VarCurr)|v95(VarCurr,bitIndex2).
% 94.22/93.63  0 [] v3396(VarCurr)| -v3358(VarCurr).
% 94.22/93.63  0 [] v3396(VarCurr)| -v95(VarCurr,bitIndex2).
% 94.22/93.63  0 [] -range_2_0(B)| -v95(VarCurr,B)|v97(VarCurr,B).
% 94.22/93.63  0 [] -range_2_0(B)| -v95(VarCurr,B)|v3309(VarCurr,B).
% 94.22/93.63  0 [] -range_2_0(B)|v95(VarCurr,B)| -v97(VarCurr,B)| -v3309(VarCurr,B).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3371(VarNext)| -range_2_0(B)| -v3309(VarNext,B)|v3309(VarCurr,B).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3371(VarNext)| -range_2_0(B)|v3309(VarNext,B)| -v3309(VarCurr,B).
% 94.22/93.63  0 [] -v3371(VarNext)| -range_2_0(B)| -v3309(VarNext,B)|v3390(VarNext,B).
% 94.22/93.63  0 [] -v3371(VarNext)| -range_2_0(B)|v3309(VarNext,B)| -v3390(VarNext,B).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -range_2_0(B)| -v3390(VarNext,B)|v3388(VarCurr,B).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -range_2_0(B)|v3390(VarNext,B)| -v3388(VarCurr,B).
% 94.22/93.63  0 [] v3382(VarCurr)| -range_2_0(B)| -v3388(VarCurr,B)|v3391(VarCurr,B).
% 94.22/93.63  0 [] v3382(VarCurr)| -range_2_0(B)|v3388(VarCurr,B)| -v3391(VarCurr,B).
% 94.22/93.63  0 [] -v3382(VarCurr)| -range_2_0(B)| -v3388(VarCurr,B)|$T.
% 94.22/93.63  0 [] -v3382(VarCurr)| -range_2_0(B)|v3388(VarCurr,B)| -$T.
% 94.22/93.63  0 [] v3313(VarCurr)| -range_2_0(B)| -v3391(VarCurr,B)|v887(VarCurr,B).
% 94.22/93.63  0 [] v3313(VarCurr)| -range_2_0(B)|v3391(VarCurr,B)| -v887(VarCurr,B).
% 94.22/93.63  0 [] -v3313(VarCurr)| -range_2_0(B)| -v3391(VarCurr,B)|v894(VarCurr,B).
% 94.22/93.63  0 [] -v3313(VarCurr)| -range_2_0(B)|v3391(VarCurr,B)| -v894(VarCurr,B).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3371(VarNext)|v3372(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3371(VarNext)|v3381(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3371(VarNext)| -v3372(VarNext)| -v3381(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3381(VarNext)|v3379(VarCurr).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3381(VarNext)| -v3379(VarCurr).
% 94.22/93.63  0 [] -v3379(VarCurr)|v3382(VarCurr)|v3383(VarCurr).
% 94.22/93.63  0 [] v3379(VarCurr)| -v3382(VarCurr).
% 94.22/93.63  0 [] v3379(VarCurr)| -v3383(VarCurr).
% 94.22/93.63  0 [] -v3383(VarCurr)|v3384(VarCurr).
% 94.22/93.63  0 [] -v3383(VarCurr)|v3387(VarCurr).
% 94.22/93.63  0 [] v3383(VarCurr)| -v3384(VarCurr)| -v3387(VarCurr).
% 94.22/93.63  0 [] v3387(VarCurr)|v3382(VarCurr).
% 94.22/93.63  0 [] -v3387(VarCurr)| -v3382(VarCurr).
% 94.22/93.63  0 [] -v3384(VarCurr)|v3313(VarCurr)|v3385(VarCurr).
% 94.22/93.63  0 [] v3384(VarCurr)| -v3313(VarCurr).
% 94.22/93.63  0 [] v3384(VarCurr)| -v3385(VarCurr).
% 94.22/93.63  0 [] -v3385(VarCurr)|v3361(VarCurr).
% 94.22/93.63  0 [] -v3385(VarCurr)|v3386(VarCurr).
% 94.22/93.63  0 [] v3385(VarCurr)| -v3361(VarCurr)| -v3386(VarCurr).
% 94.22/93.63  0 [] v3386(VarCurr)|v3313(VarCurr).
% 94.22/93.63  0 [] -v3386(VarCurr)| -v3313(VarCurr).
% 94.22/93.63  0 [] v3382(VarCurr)|v3311(VarCurr).
% 94.22/93.63  0 [] -v3382(VarCurr)| -v3311(VarCurr).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3372(VarNext)|v3373(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3372(VarNext)|v3368(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3372(VarNext)| -v3373(VarNext)| -v3368(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3373(VarNext)|v3375(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3373(VarNext)| -v3375(VarNext).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)| -v3375(VarNext)|v3368(VarCurr).
% 94.22/93.63  0 [] -nextState(VarCurr,VarNext)|v3375(VarNext)| -v3368(VarCurr).
% 94.22/93.63  0 [] -range_2_0(B)| -v3309(constB0,B)|$T.
% 94.22/93.63  0 [] -range_2_0(B)|v3309(constB0,B)| -$T.
% 94.22/93.63  0 [] -v3368(VarCurr)|v288(VarCurr).
% 94.22/93.63  0 [] v3368(VarCurr)| -v288(VarCurr).
% 94.22/93.63  0 [] -v3361(VarCurr)|v3363(VarCurr).
% 94.22/93.63  0 [] -v3361(VarCurr)|v3366(VarCurr).
% 94.22/93.63  0 [] v3361(VarCurr)| -v3363(VarCurr)| -v3366(VarCurr).
% 94.22/93.63  0 [] v3366(VarCurr)|v3315(VarCurr).
% 94.22/93.63  0 [] -v3366(VarCurr)| -v3315(VarCurr).
% 94.22/93.63  0 [] -v3363(VarCurr)|v3365(VarCurr)|v97(VarCurr,bitIndex2).
% 94.22/93.63  0 [] v3363(VarCurr)| -v3365(VarCurr).
% 94.22/93.63  0 [] v3363(VarCurr)| -v97(VarCurr,bitIndex2).
% 94.22/93.63  0 [] -v3365(VarCurr)|v97(VarCurr,bitIndex0)|v97(VarCurr,bitIndex1).
% 94.22/93.63  0 [] v3365(VarCurr)| -v97(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v3365(VarCurr)| -v97(VarCurr,bitIndex1).
% 94.22/93.63  0 [] -v3313(VarCurr)|v3356(VarCurr).
% 94.22/93.63  0 [] -v3313(VarCurr)|v3359(VarCurr).
% 94.22/93.63  0 [] v3313(VarCurr)| -v3356(VarCurr)| -v3359(VarCurr).
% 94.22/93.63  0 [] v3359(VarCurr)|v3315(VarCurr).
% 94.22/93.63  0 [] -v3359(VarCurr)| -v3315(VarCurr).
% 94.22/93.63  0 [] -v3356(VarCurr)|v3358(VarCurr)|v95(VarCurr,bitIndex2).
% 94.22/93.63  0 [] v3356(VarCurr)| -v3358(VarCurr).
% 94.22/93.63  0 [] v3356(VarCurr)| -v95(VarCurr,bitIndex2).
% 94.22/93.63  0 [] -v3358(VarCurr)|v95(VarCurr,bitIndex0)|v95(VarCurr,bitIndex1).
% 94.22/93.63  0 [] v3358(VarCurr)| -v95(VarCurr,bitIndex0).
% 94.22/93.63  0 [] v3358(VarCurr)| -v95(VarCurr,bitIndex1).
% 94.22/93.63  0 [] -v3315(VarCurr)|v3317(VarCurr).
% 94.22/93.63  0 [] v3315(VarCurr)| -v3317(VarCurr).
% 94.22/93.63  0 [] -v3317(VarCurr)|v3319(VarCurr).
% 94.22/93.63  0 [] v3317(VarCurr)| -v3319(VarCurr).
% 94.22/93.63  0 [] -v3319(VarCurr)|v3350(VarCurr)|v38(VarCurr).
% 94.22/93.63  0 [] v3319(VarCurr)| -v3350(VarCurr).
% 94.22/93.63  0 [] v3319(VarCurr)| -v38(VarCurr).
% 94.22/93.63  0 [] -v3350(VarCurr)|v3351(VarCurr)|v36(VarCurr,bitIndex11).
% 94.22/93.63  0 [] v3350(VarCurr)| -v3351(VarCurr).
% 94.22/93.63  0 [] v3350(VarCurr)| -v36(VarCurr,bitIndex11).
% 94.22/93.63  0 [] -v3351(VarCurr)|v3352(VarCurr)|v36(VarCurr,bitIndex10).
% 94.22/93.63  0 [] v3351(VarCurr)| -v3352(VarCurr).
% 94.22/93.63  0 [] v3351(VarCurr)| -v36(VarCurr,bitIndex10).
% 94.22/93.63  0 [] -v3352(VarCurr)|v3353(VarCurr)|v36(VarCurr,bitIndex9).
% 94.22/93.63  0 [] v3352(VarCurr)| -v3353(VarCurr).
% 94.22/93.63  0 [] v3352(VarCurr)| -v36(VarCurr,bitIndex9).
% 94.22/93.63  0 [] -v3353(VarCurr)|v3354(VarCurr)|v36(VarCurr,bitIndex8).
% 94.22/93.63  0 [] v3353(VarCurr)| -v3354(VarCurr).
% 94.22/93.63  0 [] v3353(VarCurr)| -v36(VarCurr,bitIndex8).
% 94.22/93.63  0 [] -v3354(VarCurr)|v36(VarCurr,bitIndex2)|v36(VarCurr,bitIndex5).
% 94.22/93.63  0 [] v3354(VarCurr)| -v36(VarCurr,bitIndex2).
% 94.22/93.63  0 [] v3354(VarCurr)| -v36(VarCurr,bitIndex5).
% 94.22/93.63  0 [] v3331(VarCurr)| -v36(VarCurr,bitIndex10)|$F.
% 94.22/93.63  0 [] v3331(VarCurr)|v36(VarCurr,bitIndex10)| -$F.
% 94.22/93.63  0 [] -v3331(VarCurr)| -v36(VarCurr,bitIndex10)|$T.
% 94.22/93.63  0 [] -v3331(VarCurr)|v36(VarCurr,bitIndex10)| -$T.
% 94.22/93.63  0 [] -v3331(VarCurr)|v3332(VarCurr)|v3347(VarCurr).
% 94.22/93.63  0 [] v3331(VarCurr)| -v3332(VarCurr).
% 94.22/93.63  0 [] v3331(VarCurr)| -v3347(VarCurr).
% 94.22/93.63  0 [] -v3347(VarCurr)|v38(VarCurr).
% 94.22/93.63  0 [] -v3347(VarCurr)|v3348(VarCurr).
% 94.22/93.63  0 [] v3347(VarCurr)| -v38(VarCurr)| -v3348(VarCurr).
% 94.22/93.63  0 [] -v3348(VarCurr)| -$T|v31(VarCurr,bitIndex10).
% 94.22/93.63  0 [] -v3348(VarCurr)|$T| -v31(VarCurr,bitIndex10).
% 94.22/93.63  0 [] v3348(VarCurr)|$T|v31(VarCurr,bitIndex10).
% 94.22/93.63  0 [] v3348(VarCurr)| -$T| -v31(VarCurr,bitIndex10).
% 94.22/93.63  0 [] -v3332(VarCurr)|v3333(VarCurr)|v3343(VarCurr).
% 94.22/93.63  0 [] v3332(VarCurr)| -v3333(VarCurr).
% 94.22/93.63  0 [] v3332(VarCurr)| -v3343(VarCurr).
% 94.22/93.63  0 [] -v3343(VarCurr)|v3344(VarCurr).
% 94.22/93.63  0 [] -v3343(VarCurr)|v1323(VarCurr).
% 94.22/93.63  0 [] v3343(VarCurr)| -v3344(VarCurr)| -v1323(VarCurr).
% 94.22/93.63  0 [] -v3344(VarCurr)|v3346(VarCurr).
% 94.22/93.63  0 [] -v3344(VarCurr)|v1682(VarCurr).
% 94.22/93.63  0 [] v3344(VarCurr)| -v3346(VarCurr)| -v1682(VarCurr).
% 94.22/93.63  0 [] -v3346(VarCurr)|v1678(VarCurr).
% 94.22/93.63  0 [] -v3346(VarCurr)|v1162(VarCurr).
% 94.22/93.63  0 [] v3346(VarCurr)| -v1678(VarCurr)| -v1162(VarCurr).
% 94.22/93.63  0 [] -v3333(VarCurr)|v3334(VarCurr)|v3341(VarCurr).
% 94.22/93.63  0 [] v3333(VarCurr)| -v3334(VarCurr).
% 94.22/93.63  0 [] v3333(VarCurr)| -v3341(VarCurr).
% 94.22/93.63  0 [] -v3341(VarCurr)|v3342(VarCurr).
% 94.22/93.63  0 [] -v3341(VarCurr)|v1300(VarCurr).
% 94.22/93.63  0 [] v3341(VarCurr)| -v3342(VarCurr)| -v1300(VarCurr).
% 94.22/93.63  0 [] -v3342(VarCurr)|v3338(VarCurr).
% 94.22/93.63  0 [] -v3342(VarCurr)|v1682(VarCurr).
% 94.22/93.63  0 [] v3342(VarCurr)| -v3338(VarCurr)| -v1682(VarCurr).
% 94.22/93.63  0 [] -v3334(VarCurr)|v3335(VarCurr)|v3339(VarCurr).
% 94.22/93.64  0 [] v3334(VarCurr)| -v3335(VarCurr).
% 94.22/93.64  0 [] v3334(VarCurr)| -v3339(VarCurr).
% 94.22/93.64  0 [] -v3339(VarCurr)|v3340(VarCurr).
% 94.22/93.64  0 [] -v3339(VarCurr)|v1278(VarCurr).
% 94.22/93.64  0 [] v3339(VarCurr)| -v3340(VarCurr)| -v1278(VarCurr).
% 94.22/93.64  0 [] -v3340(VarCurr)|v3338(VarCurr).
% 94.22/93.64  0 [] -v3340(VarCurr)|v1682(VarCurr).
% 94.22/93.64  0 [] v3340(VarCurr)| -v3338(VarCurr)| -v1682(VarCurr).
% 94.22/93.64  0 [] -v3335(VarCurr)|v3336(VarCurr).
% 94.22/93.64  0 [] -v3335(VarCurr)|v1238(VarCurr).
% 94.22/93.64  0 [] v3335(VarCurr)| -v3336(VarCurr)| -v1238(VarCurr).
% 94.22/93.64  0 [] -v3336(VarCurr)|v3338(VarCurr).
% 94.22/93.64  0 [] -v3336(VarCurr)|v1682(VarCurr).
% 94.22/93.64  0 [] v3336(VarCurr)| -v3338(VarCurr)| -v1682(VarCurr).
% 94.22/93.64  0 [] -v3338(VarCurr)|v1690(VarCurr).
% 94.22/93.64  0 [] -v3338(VarCurr)|v1162(VarCurr).
% 94.22/93.64  0 [] v3338(VarCurr)| -v1690(VarCurr)| -v1162(VarCurr).
% 94.22/93.64  0 [] -v31(VarNext,bitIndex10)|v3323(VarNext,bitIndex9).
% 94.22/93.64  0 [] v31(VarNext,bitIndex10)| -v3323(VarNext,bitIndex9).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3323(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)|v3323(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.22/93.64  0 [] -v3324(VarNext)| -range_10_0(B)| -v3323(VarNext,B)|v1253(VarNext,B).
% 94.22/93.64  0 [] -v3324(VarNext)| -range_10_0(B)|v3323(VarNext,B)| -v1253(VarNext,B).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3324(VarNext)|v3325(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3324(VarNext)| -v3325(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3325(VarNext)|v3327(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3325(VarNext)|v1240(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3325(VarNext)| -v3327(VarNext)| -v1240(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3327(VarNext)|v1247(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3327(VarNext)| -v1247(VarNext).
% 94.22/93.64  0 [] -v3311(VarCurr)|v12(VarCurr).
% 94.22/93.64  0 [] v3311(VarCurr)| -v12(VarCurr).
% 94.22/93.64  0 [] -v97(VarCurr,bitIndex0)|v3301(VarCurr).
% 94.22/93.64  0 [] v97(VarCurr,bitIndex0)| -v3301(VarCurr).
% 94.22/93.64  0 [] -v97(VarCurr,bitIndex1)|v308(VarCurr).
% 94.22/93.64  0 [] v97(VarCurr,bitIndex1)| -v308(VarCurr).
% 94.22/93.64  0 [] -v97(VarCurr,bitIndex2)|v99(VarCurr).
% 94.22/93.64  0 [] v97(VarCurr,bitIndex2)| -v99(VarCurr).
% 94.22/93.64  0 [] -v3301(VarCurr)|v3303(VarCurr).
% 94.22/93.64  0 [] v3301(VarCurr)| -v3303(VarCurr).
% 94.22/93.64  0 [] -v3303(VarCurr)|v3305(VarCurr).
% 94.22/93.64  0 [] -v3303(VarCurr)|v3306(VarCurr).
% 94.22/93.64  0 [] v3303(VarCurr)| -v3305(VarCurr)| -v3306(VarCurr).
% 94.22/93.64  0 [] -v3306(VarCurr)|v1162(VarCurr)|v907(VarCurr).
% 94.22/93.64  0 [] v3306(VarCurr)| -v1162(VarCurr).
% 94.22/93.64  0 [] v3306(VarCurr)| -v907(VarCurr).
% 94.22/93.64  0 [] v3305(VarCurr)|v1031(VarCurr).
% 94.22/93.64  0 [] -v3305(VarCurr)| -v1031(VarCurr).
% 94.22/93.64  0 [] -v308(VarCurr)|v310(VarCurr).
% 94.22/93.64  0 [] v308(VarCurr)| -v310(VarCurr).
% 94.22/93.64  0 [] v310(VarCurr)|v312(VarCurr).
% 94.22/93.64  0 [] -v310(VarCurr)| -v312(VarCurr).
% 94.22/93.64  0 [] -v312(VarCurr)|v314(VarCurr).
% 94.22/93.64  0 [] v312(VarCurr)| -v314(VarCurr).
% 94.22/93.64  0 [] -v314(VarCurr)|v316(VarCurr)|v3201(VarCurr).
% 94.22/93.64  0 [] v314(VarCurr)| -v316(VarCurr).
% 94.22/93.64  0 [] v314(VarCurr)| -v3201(VarCurr).
% 94.22/93.64  0 [] -v3201(VarCurr)|v3203(VarCurr).
% 94.22/93.64  0 [] v3201(VarCurr)| -v3203(VarCurr).
% 94.22/93.64  0 [] -v3203(VarCurr)| -v3205(VarCurr,bitIndex4)|$F.
% 94.22/93.64  0 [] -v3203(VarCurr)|v3205(VarCurr,bitIndex4)| -$F.
% 94.22/93.64  0 [] -v3203(VarCurr)| -v3205(VarCurr,bitIndex3)|$F.
% 94.22/93.64  0 [] -v3203(VarCurr)|v3205(VarCurr,bitIndex3)| -$F.
% 94.22/93.64  0 [] -v3203(VarCurr)| -v3205(VarCurr,bitIndex2)|$F.
% 94.22/93.64  0 [] -v3203(VarCurr)|v3205(VarCurr,bitIndex2)| -$F.
% 94.22/93.64  0 [] -v3203(VarCurr)| -v3205(VarCurr,bitIndex1)|$F.
% 94.22/93.64  0 [] -v3203(VarCurr)|v3205(VarCurr,bitIndex1)| -$F.
% 94.22/93.64  0 [] -v3203(VarCurr)| -v3205(VarCurr,bitIndex0)|$F.
% 94.22/93.64  0 [] -v3203(VarCurr)|v3205(VarCurr,bitIndex0)| -$F.
% 94.22/93.64  0 [] v3203(VarCurr)|v3205(VarCurr,bitIndex4)|$F|v3205(VarCurr,bitIndex3)|v3205(VarCurr,bitIndex2)|v3205(VarCurr,bitIndex1)|v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] v3203(VarCurr)| -v3205(VarCurr,bitIndex4)| -$F| -v3205(VarCurr,bitIndex3)| -v3205(VarCurr,bitIndex2)| -v3205(VarCurr,bitIndex1)| -v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3285(VarNext)| -range_4_0(B)| -v3205(VarNext,B)|v3205(VarCurr,B).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3285(VarNext)| -range_4_0(B)|v3205(VarNext,B)| -v3205(VarCurr,B).
% 94.22/93.64  0 [] -v3285(VarNext)| -range_4_0(B)| -v3205(VarNext,B)|v3293(VarNext,B).
% 94.22/93.64  0 [] -v3285(VarNext)| -range_4_0(B)|v3205(VarNext,B)| -v3293(VarNext,B).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v3293(VarNext,B)|v3291(VarCurr,B).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v3293(VarNext,B)| -v3291(VarCurr,B).
% 94.22/93.64  0 [] v3294(VarCurr)| -range_4_0(B)| -v3291(VarCurr,B)|v3207(VarCurr,B).
% 94.22/93.64  0 [] v3294(VarCurr)| -range_4_0(B)|v3291(VarCurr,B)| -v3207(VarCurr,B).
% 94.22/93.64  0 [] -v3294(VarCurr)| -range_4_0(B)| -v3291(VarCurr,B)|$F.
% 94.22/93.64  0 [] -v3294(VarCurr)| -range_4_0(B)|v3291(VarCurr,B)| -$F.
% 94.22/93.64  0 [] v3294(VarCurr)|v754(VarCurr).
% 94.22/93.64  0 [] -v3294(VarCurr)| -v754(VarCurr).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3285(VarNext)|v3286(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3285(VarNext)| -v3286(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3286(VarNext)|v3287(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3286(VarNext)|v751(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3286(VarNext)| -v3287(VarNext)| -v751(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)|v3287(VarNext)|v823(VarNext).
% 94.22/93.64  0 [] -nextState(VarCurr,VarNext)| -v3287(VarNext)| -v823(VarNext).
% 94.22/93.64  0 [] v3209(VarCurr)|v3211(VarCurr)|v3252(VarCurr)| -range_4_0(B)| -v3207(VarCurr,B)|v3205(VarCurr,B).
% 94.22/93.64  0 [] v3209(VarCurr)|v3211(VarCurr)|v3252(VarCurr)| -range_4_0(B)|v3207(VarCurr,B)| -v3205(VarCurr,B).
% 94.22/93.64  0 [] -v3252(VarCurr)| -range_4_0(B)| -v3207(VarCurr,B)|v3254(VarCurr,B).
% 94.22/93.64  0 [] -v3252(VarCurr)| -range_4_0(B)|v3207(VarCurr,B)| -v3254(VarCurr,B).
% 94.22/93.64  0 [] -v3211(VarCurr)| -range_4_0(B)| -v3207(VarCurr,B)|v3213(VarCurr,B).
% 94.22/93.64  0 [] -v3211(VarCurr)| -range_4_0(B)|v3207(VarCurr,B)| -v3213(VarCurr,B).
% 94.22/93.64  0 [] -v3209(VarCurr)| -range_4_0(B)| -v3207(VarCurr,B)|v3205(VarCurr,B).
% 94.22/93.64  0 [] -v3209(VarCurr)| -range_4_0(B)|v3207(VarCurr,B)| -v3205(VarCurr,B).
% 94.22/93.64  0 [] -v3281(VarCurr)| -v3282(VarCurr,bitIndex1)|$T.
% 94.22/93.64  0 [] -v3281(VarCurr)|v3282(VarCurr,bitIndex1)| -$T.
% 94.22/93.64  0 [] -v3281(VarCurr)| -v3282(VarCurr,bitIndex0)|$T.
% 94.22/93.64  0 [] -v3281(VarCurr)|v3282(VarCurr,bitIndex0)| -$T.
% 94.22/93.64  0 [] v3281(VarCurr)|v3282(VarCurr,bitIndex1)|$T|v3282(VarCurr,bitIndex0).
% 94.22/93.64  0 [] v3281(VarCurr)| -v3282(VarCurr,bitIndex1)| -$T| -v3282(VarCurr,bitIndex0).
% 94.22/93.64  0 [] -v3282(VarCurr,bitIndex0)|v873(VarCurr).
% 94.22/93.64  0 [] v3282(VarCurr,bitIndex0)| -v873(VarCurr).
% 94.22/93.64  0 [] -v3282(VarCurr,bitIndex1)|v783(VarCurr).
% 94.22/93.64  0 [] v3282(VarCurr,bitIndex1)| -v783(VarCurr).
% 94.22/93.64  0 [] v3255(VarCurr)| -range_4_0(B)| -v3254(VarCurr,B)|v3256(VarCurr,B).
% 94.22/93.64  0 [] v3255(VarCurr)| -range_4_0(B)|v3254(VarCurr,B)| -v3256(VarCurr,B).
% 94.22/93.64  0 [] -v3255(VarCurr)| -range_4_0(B)| -v3254(VarCurr,B)|b10000(B).
% 94.22/93.64  0 [] -v3255(VarCurr)| -range_4_0(B)|v3254(VarCurr,B)| -b10000(B).
% 94.22/93.64  0 [] -v3256(VarCurr,bitIndex0)|v3278(VarCurr).
% 94.22/93.64  0 [] v3256(VarCurr,bitIndex0)| -v3278(VarCurr).
% 94.22/93.64  0 [] -v3256(VarCurr,bitIndex1)|v3276(VarCurr).
% 94.22/93.64  0 [] v3256(VarCurr,bitIndex1)| -v3276(VarCurr).
% 94.22/93.64  0 [] -v3256(VarCurr,bitIndex2)|v3271(VarCurr).
% 94.22/93.64  0 [] v3256(VarCurr,bitIndex2)| -v3271(VarCurr).
% 94.22/93.64  0 [] -v3256(VarCurr,bitIndex3)|v3266(VarCurr).
% 94.22/93.64  0 [] v3256(VarCurr,bitIndex3)| -v3266(VarCurr).
% 94.22/93.64  0 [] -v3256(VarCurr,bitIndex4)|v3258(VarCurr).
% 94.22/93.64  0 [] v3256(VarCurr,bitIndex4)| -v3258(VarCurr).
% 94.22/93.64  0 [] -v3276(VarCurr)|v3277(VarCurr).
% 94.22/93.64  0 [] -v3276(VarCurr)|v3280(VarCurr).
% 94.22/93.64  0 [] v3276(VarCurr)| -v3277(VarCurr)| -v3280(VarCurr).
% 94.22/93.64  0 [] -v3280(VarCurr)|v3205(VarCurr,bitIndex0)|v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] v3280(VarCurr)| -v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] v3280(VarCurr)| -v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] -v3277(VarCurr)|v3278(VarCurr)|v3279(VarCurr).
% 94.22/93.64  0 [] v3277(VarCurr)| -v3278(VarCurr).
% 94.22/93.64  0 [] v3277(VarCurr)| -v3279(VarCurr).
% 94.22/93.64  0 [] v3279(VarCurr)|v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] -v3279(VarCurr)| -v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] v3278(VarCurr)|v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] -v3278(VarCurr)| -v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] -v3271(VarCurr)|v3272(VarCurr).
% 94.22/93.64  0 [] -v3271(VarCurr)|v3275(VarCurr).
% 94.22/93.64  0 [] v3271(VarCurr)| -v3272(VarCurr)| -v3275(VarCurr).
% 94.22/93.64  0 [] -v3275(VarCurr)|v3263(VarCurr)|v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] v3275(VarCurr)| -v3263(VarCurr).
% 94.22/93.64  0 [] v3275(VarCurr)| -v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] -v3272(VarCurr)|v3273(VarCurr)|v3274(VarCurr).
% 94.22/93.64  0 [] v3272(VarCurr)| -v3273(VarCurr).
% 94.22/93.64  0 [] v3272(VarCurr)| -v3274(VarCurr).
% 94.22/93.64  0 [] v3274(VarCurr)|v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] -v3274(VarCurr)| -v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] v3273(VarCurr)|v3263(VarCurr).
% 94.22/93.64  0 [] -v3273(VarCurr)| -v3263(VarCurr).
% 94.22/93.64  0 [] -v3266(VarCurr)|v3267(VarCurr).
% 94.22/93.64  0 [] -v3266(VarCurr)|v3270(VarCurr).
% 94.22/93.64  0 [] v3266(VarCurr)| -v3267(VarCurr)| -v3270(VarCurr).
% 94.22/93.64  0 [] -v3270(VarCurr)|v3262(VarCurr)|v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] v3270(VarCurr)| -v3262(VarCurr).
% 94.22/93.64  0 [] v3270(VarCurr)| -v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] -v3267(VarCurr)|v3268(VarCurr)|v3269(VarCurr).
% 94.22/93.64  0 [] v3267(VarCurr)| -v3268(VarCurr).
% 94.22/93.64  0 [] v3267(VarCurr)| -v3269(VarCurr).
% 94.22/93.64  0 [] v3269(VarCurr)|v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] -v3269(VarCurr)| -v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] v3268(VarCurr)|v3262(VarCurr).
% 94.22/93.64  0 [] -v3268(VarCurr)| -v3262(VarCurr).
% 94.22/93.64  0 [] -v3258(VarCurr)|v3259(VarCurr).
% 94.22/93.64  0 [] -v3258(VarCurr)|v3265(VarCurr).
% 94.22/93.64  0 [] v3258(VarCurr)| -v3259(VarCurr)| -v3265(VarCurr).
% 94.22/93.64  0 [] -v3265(VarCurr)|v3261(VarCurr)|v3205(VarCurr,bitIndex4).
% 94.22/93.64  0 [] v3265(VarCurr)| -v3261(VarCurr).
% 94.22/93.64  0 [] v3265(VarCurr)| -v3205(VarCurr,bitIndex4).
% 94.22/93.64  0 [] -v3259(VarCurr)|v3260(VarCurr)|v3264(VarCurr).
% 94.22/93.64  0 [] v3259(VarCurr)| -v3260(VarCurr).
% 94.22/93.64  0 [] v3259(VarCurr)| -v3264(VarCurr).
% 94.22/93.64  0 [] v3264(VarCurr)|v3205(VarCurr,bitIndex4).
% 94.22/93.64  0 [] -v3264(VarCurr)| -v3205(VarCurr,bitIndex4).
% 94.22/93.64  0 [] v3260(VarCurr)|v3261(VarCurr).
% 94.22/93.64  0 [] -v3260(VarCurr)| -v3261(VarCurr).
% 94.22/93.64  0 [] -v3261(VarCurr)|v3262(VarCurr).
% 94.22/93.64  0 [] -v3261(VarCurr)|v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] v3261(VarCurr)| -v3262(VarCurr)| -v3205(VarCurr,bitIndex3).
% 94.22/93.64  0 [] -v3262(VarCurr)|v3263(VarCurr).
% 94.22/93.64  0 [] -v3262(VarCurr)|v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] v3262(VarCurr)| -v3263(VarCurr)| -v3205(VarCurr,bitIndex2).
% 94.22/93.64  0 [] -v3263(VarCurr)|v3205(VarCurr,bitIndex0).
% 94.22/93.64  0 [] -v3263(VarCurr)|v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] v3263(VarCurr)| -v3205(VarCurr,bitIndex0)| -v3205(VarCurr,bitIndex1).
% 94.22/93.64  0 [] -v3255(VarCurr)| -v3205(VarCurr,bitIndex4)|$T.
% 94.22/93.64  0 [] -v3255(VarCurr)|v3205(VarCurr,bitIndex4)| -$T.
% 94.22/93.64  0 [] -v3255(VarCurr)| -v3205(VarCurr,bitIndex3)|$F.
% 94.22/93.64  0 [] -v3255(VarCurr)|v3205(VarCurr,bitIndex3)| -$F.
% 94.22/93.64  0 [] -v3255(VarCurr)| -v3205(VarCurr,bitIndex2)|$F.
% 94.22/93.64  0 [] -v3255(VarCurr)|v3205(VarCurr,bitIndex2)| -$F.
% 94.22/93.64  0 [] -v3255(VarCurr)| -v3205(VarCurr,bitIndex1)|$F.
% 94.22/93.64  0 [] -v3255(VarCurr)|v3205(VarCurr,bitIndex1)| -$F.
% 94.22/93.64  0 [] -v3255(VarCurr)| -v3205(VarCurr,bitIndex0)|$F.
% 94.22/93.64  0 [] -v3255(VarCurr)|v3205(VarCurr,bitIndex0)| -$F.
% 94.22/93.64  0 [] v3255(VarCurr)|v3205(VarCurr,bitIndex4)|$T|v3205(VarCurr,bitIndex3)|$F|v3205(VarCurr,bitIndex2)|v3205(VarCurr,bitIndex1)|v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3255(VarCurr)|v3205(VarCurr,bitIndex4)|$T| -v3205(VarCurr,bitIndex3)| -$F| -v3205(VarCurr,bitIndex2)| -v3205(VarCurr,bitIndex1)| -v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3255(VarCurr)| -v3205(VarCurr,bitIndex4)| -$T|v3205(VarCurr,bitIndex3)|$F|v3205(VarCurr,bitIndex2)|v3205(VarCurr,bitIndex1)|v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3255(VarCurr)| -v3205(VarCurr,bitIndex4)| -$T| -v3205(VarCurr,bitIndex3)| -$F| -v3205(VarCurr,bitIndex2)| -v3205(VarCurr,bitIndex1)| -v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3252(VarCurr)| -v3253(VarCurr,bitIndex1)|$T.
% 94.22/93.65  0 [] -v3252(VarCurr)|v3253(VarCurr,bitIndex1)| -$T.
% 94.22/93.65  0 [] -v3252(VarCurr)| -v3253(VarCurr,bitIndex0)|$F.
% 94.22/93.65  0 [] -v3252(VarCurr)|v3253(VarCurr,bitIndex0)| -$F.
% 94.22/93.65  0 [] v3252(VarCurr)|v3253(VarCurr,bitIndex1)|$T|v3253(VarCurr,bitIndex0)|$F.
% 94.22/93.65  0 [] v3252(VarCurr)|v3253(VarCurr,bitIndex1)|$T| -v3253(VarCurr,bitIndex0)| -$F.
% 94.22/93.65  0 [] v3252(VarCurr)| -v3253(VarCurr,bitIndex1)| -$T|v3253(VarCurr,bitIndex0)|$F.
% 94.22/93.65  0 [] v3252(VarCurr)| -v3253(VarCurr,bitIndex1)| -$T| -v3253(VarCurr,bitIndex0)| -$F.
% 94.22/93.65  0 [] -v3253(VarCurr,bitIndex0)|v873(VarCurr).
% 94.22/93.65  0 [] v3253(VarCurr,bitIndex0)| -v873(VarCurr).
% 94.22/93.65  0 [] -v3253(VarCurr,bitIndex1)|v783(VarCurr).
% 94.22/93.65  0 [] v3253(VarCurr,bitIndex1)| -v783(VarCurr).
% 94.22/93.65  0 [] v3214(VarCurr)| -range_31_0(B)| -v3213(VarCurr,B)|v3215(VarCurr,B).
% 94.22/93.65  0 [] v3214(VarCurr)| -range_31_0(B)|v3213(VarCurr,B)| -v3215(VarCurr,B).
% 94.22/93.65  0 [] -v3214(VarCurr)| -range_31_0(B)| -v3213(VarCurr,B)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)| -range_31_0(B)|v3213(VarCurr,B)| -$F.
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex6)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex6)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex7)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex7)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex8)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex8)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex9)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex9)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex10)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex10)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex11)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex11)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex12)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex12)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex13)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex13)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex14)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex14)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex15)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex15)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex16)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex16)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex17)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex17)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex18)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex18)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex19)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex19)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex20)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex20)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex21)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex21)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex22)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex22)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex23)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex23)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex24)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex24)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex25)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex25)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex26)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex26)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex27)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex27)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex28)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex28)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex29)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex29)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex30)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex30)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3215(VarCurr,bitIndex31)|v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3215(VarCurr,bitIndex31)| -v3216(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -range_5_0(B)| -v3215(VarCurr,B)|v3216(VarCurr,B).
% 94.22/93.65  0 [] -range_5_0(B)|v3215(VarCurr,B)| -v3216(VarCurr,B).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex0)|v3250(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex0)| -v3250(VarCurr).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex1)|v3248(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex1)| -v3248(VarCurr).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex2)|v3244(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex2)| -v3244(VarCurr).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex3)|v3240(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex3)| -v3240(VarCurr).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex4)|v3236(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex4)| -v3236(VarCurr).
% 94.22/93.65  0 [] -v3216(VarCurr,bitIndex5)|v3218(VarCurr).
% 94.22/93.65  0 [] v3216(VarCurr,bitIndex5)| -v3218(VarCurr).
% 94.22/93.65  0 [] -v3248(VarCurr)|v3249(VarCurr).
% 94.22/93.65  0 [] -v3248(VarCurr)|v3251(VarCurr).
% 94.22/93.65  0 [] v3248(VarCurr)| -v3249(VarCurr)| -v3251(VarCurr).
% 94.22/93.65  0 [] -v3251(VarCurr)|v3222(VarCurr,bitIndex0)|v3230(VarCurr).
% 94.22/93.65  0 [] v3251(VarCurr)| -v3222(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3251(VarCurr)| -v3230(VarCurr).
% 94.22/93.65  0 [] -v3249(VarCurr)|v3250(VarCurr)|v3222(VarCurr,bitIndex1).
% 94.22/93.65  0 [] v3249(VarCurr)| -v3250(VarCurr).
% 94.22/93.65  0 [] v3249(VarCurr)| -v3222(VarCurr,bitIndex1).
% 94.22/93.65  0 [] v3250(VarCurr)|v3222(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3250(VarCurr)| -v3222(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3244(VarCurr)|v3245(VarCurr).
% 94.22/93.65  0 [] -v3244(VarCurr)|v3247(VarCurr).
% 94.22/93.65  0 [] v3244(VarCurr)| -v3245(VarCurr)| -v3247(VarCurr).
% 94.22/93.65  0 [] -v3247(VarCurr)|v3228(VarCurr)|v3231(VarCurr).
% 94.22/93.65  0 [] v3247(VarCurr)| -v3228(VarCurr).
% 94.22/93.65  0 [] v3247(VarCurr)| -v3231(VarCurr).
% 94.22/93.65  0 [] -v3245(VarCurr)|v3246(VarCurr)|v3222(VarCurr,bitIndex2).
% 94.22/93.65  0 [] v3245(VarCurr)| -v3246(VarCurr).
% 94.22/93.65  0 [] v3245(VarCurr)| -v3222(VarCurr,bitIndex2).
% 94.22/93.65  0 [] v3246(VarCurr)|v3228(VarCurr).
% 94.22/93.65  0 [] -v3246(VarCurr)| -v3228(VarCurr).
% 94.22/93.65  0 [] -v3240(VarCurr)|v3241(VarCurr).
% 94.22/93.65  0 [] -v3240(VarCurr)|v3243(VarCurr).
% 94.22/93.65  0 [] v3240(VarCurr)| -v3241(VarCurr)| -v3243(VarCurr).
% 94.22/93.65  0 [] -v3243(VarCurr)|v3226(VarCurr)|v3232(VarCurr).
% 94.22/93.65  0 [] v3243(VarCurr)| -v3226(VarCurr).
% 94.22/93.65  0 [] v3243(VarCurr)| -v3232(VarCurr).
% 94.22/93.65  0 [] -v3241(VarCurr)|v3242(VarCurr)|v3222(VarCurr,bitIndex3).
% 94.22/93.65  0 [] v3241(VarCurr)| -v3242(VarCurr).
% 94.22/93.65  0 [] v3241(VarCurr)| -v3222(VarCurr,bitIndex3).
% 94.22/93.65  0 [] v3242(VarCurr)|v3226(VarCurr).
% 94.22/93.65  0 [] -v3242(VarCurr)| -v3226(VarCurr).
% 94.22/93.65  0 [] -v3236(VarCurr)|v3237(VarCurr).
% 94.22/93.65  0 [] -v3236(VarCurr)|v3239(VarCurr).
% 94.22/93.65  0 [] v3236(VarCurr)| -v3237(VarCurr)| -v3239(VarCurr).
% 94.22/93.65  0 [] -v3239(VarCurr)|v3224(VarCurr)|v3233(VarCurr).
% 94.22/93.65  0 [] v3239(VarCurr)| -v3224(VarCurr).
% 94.22/93.65  0 [] v3239(VarCurr)| -v3233(VarCurr).
% 94.22/93.65  0 [] -v3237(VarCurr)|v3238(VarCurr)|v3222(VarCurr,bitIndex4).
% 94.22/93.65  0 [] v3237(VarCurr)| -v3238(VarCurr).
% 94.22/93.65  0 [] v3237(VarCurr)| -v3222(VarCurr,bitIndex4).
% 94.22/93.65  0 [] v3238(VarCurr)|v3224(VarCurr).
% 94.22/93.65  0 [] -v3238(VarCurr)| -v3224(VarCurr).
% 94.22/93.65  0 [] -v3218(VarCurr)|v3219(VarCurr).
% 94.22/93.65  0 [] -v3218(VarCurr)|v3234(VarCurr).
% 94.22/93.65  0 [] v3218(VarCurr)| -v3219(VarCurr)| -v3234(VarCurr).
% 94.22/93.65  0 [] -v3234(VarCurr)|v3221(VarCurr)|v3235(VarCurr).
% 94.22/93.65  0 [] v3234(VarCurr)| -v3221(VarCurr).
% 94.22/93.65  0 [] v3234(VarCurr)| -v3235(VarCurr).
% 94.22/93.65  0 [] v3235(VarCurr)|v3222(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3235(VarCurr)| -v3222(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -v3219(VarCurr)|v3220(VarCurr)|v3222(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3219(VarCurr)| -v3220(VarCurr).
% 94.22/93.65  0 [] v3219(VarCurr)| -v3222(VarCurr,bitIndex5).
% 94.22/93.65  0 [] v3220(VarCurr)|v3221(VarCurr).
% 94.22/93.65  0 [] -v3220(VarCurr)| -v3221(VarCurr).
% 94.22/93.65  0 [] -v3221(VarCurr)|v3222(VarCurr,bitIndex4)|v3223(VarCurr).
% 94.22/93.65  0 [] v3221(VarCurr)| -v3222(VarCurr,bitIndex4).
% 94.22/93.65  0 [] v3221(VarCurr)| -v3223(VarCurr).
% 94.22/93.65  0 [] -v3223(VarCurr)|v3224(VarCurr).
% 94.22/93.65  0 [] -v3223(VarCurr)|v3233(VarCurr).
% 94.22/93.65  0 [] v3223(VarCurr)| -v3224(VarCurr)| -v3233(VarCurr).
% 94.22/93.65  0 [] v3233(VarCurr)|v3222(VarCurr,bitIndex4).
% 94.22/93.65  0 [] -v3233(VarCurr)| -v3222(VarCurr,bitIndex4).
% 94.22/93.65  0 [] -v3224(VarCurr)|v3222(VarCurr,bitIndex3)|v3225(VarCurr).
% 94.22/93.65  0 [] v3224(VarCurr)| -v3222(VarCurr,bitIndex3).
% 94.22/93.65  0 [] v3224(VarCurr)| -v3225(VarCurr).
% 94.22/93.65  0 [] -v3225(VarCurr)|v3226(VarCurr).
% 94.22/93.65  0 [] -v3225(VarCurr)|v3232(VarCurr).
% 94.22/93.65  0 [] v3225(VarCurr)| -v3226(VarCurr)| -v3232(VarCurr).
% 94.22/93.65  0 [] v3232(VarCurr)|v3222(VarCurr,bitIndex3).
% 94.22/93.65  0 [] -v3232(VarCurr)| -v3222(VarCurr,bitIndex3).
% 94.22/93.65  0 [] -v3226(VarCurr)|v3222(VarCurr,bitIndex2)|v3227(VarCurr).
% 94.22/93.65  0 [] v3226(VarCurr)| -v3222(VarCurr,bitIndex2).
% 94.22/93.65  0 [] v3226(VarCurr)| -v3227(VarCurr).
% 94.22/93.65  0 [] -v3227(VarCurr)|v3228(VarCurr).
% 94.22/93.65  0 [] -v3227(VarCurr)|v3231(VarCurr).
% 94.22/93.65  0 [] v3227(VarCurr)| -v3228(VarCurr)| -v3231(VarCurr).
% 94.22/93.65  0 [] v3231(VarCurr)|v3222(VarCurr,bitIndex2).
% 94.22/93.65  0 [] -v3231(VarCurr)| -v3222(VarCurr,bitIndex2).
% 94.22/93.65  0 [] -v3228(VarCurr)|v3222(VarCurr,bitIndex1)|v3229(VarCurr).
% 94.22/93.65  0 [] v3228(VarCurr)| -v3222(VarCurr,bitIndex1).
% 94.22/93.65  0 [] v3228(VarCurr)| -v3229(VarCurr).
% 94.22/93.65  0 [] -v3229(VarCurr)|v3222(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3229(VarCurr)|v3230(VarCurr).
% 94.22/93.65  0 [] v3229(VarCurr)| -v3222(VarCurr,bitIndex0)| -v3230(VarCurr).
% 94.22/93.65  0 [] v3230(VarCurr)|v3222(VarCurr,bitIndex1).
% 94.22/93.65  0 [] -v3230(VarCurr)| -v3222(VarCurr,bitIndex1).
% 94.22/93.65  0 [] -v3222(VarCurr,bitIndex5).
% 94.22/93.65  0 [] -range_4_0(B)| -v3222(VarCurr,B)|v3205(VarCurr,B).
% 94.22/93.65  0 [] -range_4_0(B)|v3222(VarCurr,B)| -v3205(VarCurr,B).
% 94.22/93.65  0 [] -v3214(VarCurr)| -v3205(VarCurr,bitIndex4)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)|v3205(VarCurr,bitIndex4)| -$F.
% 94.22/93.65  0 [] -v3214(VarCurr)| -v3205(VarCurr,bitIndex3)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)|v3205(VarCurr,bitIndex3)| -$F.
% 94.22/93.65  0 [] -v3214(VarCurr)| -v3205(VarCurr,bitIndex2)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)|v3205(VarCurr,bitIndex2)| -$F.
% 94.22/93.65  0 [] -v3214(VarCurr)| -v3205(VarCurr,bitIndex1)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)|v3205(VarCurr,bitIndex1)| -$F.
% 94.22/93.65  0 [] -v3214(VarCurr)| -v3205(VarCurr,bitIndex0)|$F.
% 94.22/93.65  0 [] -v3214(VarCurr)|v3205(VarCurr,bitIndex0)| -$F.
% 94.22/93.65  0 [] v3214(VarCurr)|v3205(VarCurr,bitIndex4)|$F|v3205(VarCurr,bitIndex3)|v3205(VarCurr,bitIndex2)|v3205(VarCurr,bitIndex1)|v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3214(VarCurr)| -v3205(VarCurr,bitIndex4)| -$F| -v3205(VarCurr,bitIndex3)| -v3205(VarCurr,bitIndex2)| -v3205(VarCurr,bitIndex1)| -v3205(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3211(VarCurr)| -v3212(VarCurr,bitIndex1)|$F.
% 94.22/93.65  0 [] -v3211(VarCurr)|v3212(VarCurr,bitIndex1)| -$F.
% 94.22/93.65  0 [] -v3211(VarCurr)| -v3212(VarCurr,bitIndex0)|$T.
% 94.22/93.65  0 [] -v3211(VarCurr)|v3212(VarCurr,bitIndex0)| -$T.
% 94.22/93.65  0 [] v3211(VarCurr)|v3212(VarCurr,bitIndex1)|$F|v3212(VarCurr,bitIndex0)|$T.
% 94.22/93.65  0 [] v3211(VarCurr)|v3212(VarCurr,bitIndex1)|$F| -v3212(VarCurr,bitIndex0)| -$T.
% 94.22/93.65  0 [] v3211(VarCurr)| -v3212(VarCurr,bitIndex1)| -$F|v3212(VarCurr,bitIndex0)|$T.
% 94.22/93.65  0 [] v3211(VarCurr)| -v3212(VarCurr,bitIndex1)| -$F| -v3212(VarCurr,bitIndex0)| -$T.
% 94.22/93.65  0 [] -v3212(VarCurr,bitIndex0)|v873(VarCurr).
% 94.22/93.65  0 [] v3212(VarCurr,bitIndex0)| -v873(VarCurr).
% 94.22/93.65  0 [] -v3212(VarCurr,bitIndex1)|v783(VarCurr).
% 94.22/93.65  0 [] v3212(VarCurr,bitIndex1)| -v783(VarCurr).
% 94.22/93.65  0 [] -v3205(constB0,bitIndex4).
% 94.22/93.65  0 [] -v3205(constB0,bitIndex3).
% 94.22/93.65  0 [] -v3205(constB0,bitIndex2).
% 94.22/93.65  0 [] -v3205(constB0,bitIndex1).
% 94.22/93.65  0 [] v3205(constB0,bitIndex0).
% 94.22/93.65  0 [] -v3209(VarCurr)| -v3210(VarCurr,bitIndex1)|$F.
% 94.22/93.65  0 [] -v3209(VarCurr)|v3210(VarCurr,bitIndex1)| -$F.
% 94.22/93.65  0 [] -v3209(VarCurr)| -v3210(VarCurr,bitIndex0)|$F.
% 94.22/93.65  0 [] -v3209(VarCurr)|v3210(VarCurr,bitIndex0)| -$F.
% 94.22/93.65  0 [] v3209(VarCurr)|v3210(VarCurr,bitIndex1)|$F|v3210(VarCurr,bitIndex0).
% 94.22/93.65  0 [] v3209(VarCurr)| -v3210(VarCurr,bitIndex1)| -$F| -v3210(VarCurr,bitIndex0).
% 94.22/93.65  0 [] -v3210(VarCurr,bitIndex0)|v873(VarCurr).
% 94.22/93.65  0 [] v3210(VarCurr,bitIndex0)| -v873(VarCurr).
% 94.22/93.65  0 [] -v3210(VarCurr,bitIndex1)|v783(VarCurr).
% 94.22/93.65  0 [] v3210(VarCurr,bitIndex1)| -v783(VarCurr).
% 94.22/93.65  0 [] -v316(VarCurr)|v3195(VarCurr)|v3199(VarCurr).
% 94.22/93.65  0 [] v316(VarCurr)| -v3195(VarCurr).
% 94.22/93.65  0 [] v316(VarCurr)| -v3199(VarCurr).
% 94.22/93.65  0 [] -v3199(VarCurr)|v3103(VarCurr).
% 94.22/93.65  0 [] -v3199(VarCurr)|v3109(VarCurr).
% 94.22/93.65  0 [] v3199(VarCurr)| -v3103(VarCurr)| -v3109(VarCurr).
% 94.22/93.65  0 [] -v3195(VarCurr)|v3196(VarCurr)|v2259(VarCurr).
% 94.22/93.65  0 [] v3195(VarCurr)| -v3196(VarCurr).
% 94.22/93.65  0 [] v3195(VarCurr)| -v2259(VarCurr).
% 94.22/93.65  0 [] -v3196(VarCurr)|v3197(VarCurr).
% 94.22/93.65  0 [] -v3196(VarCurr)|v3198(VarCurr).
% 94.22/93.65  0 [] v3196(VarCurr)| -v3197(VarCurr)| -v3198(VarCurr).
% 94.22/93.65  0 [] v3198(VarCurr)|v1908(VarCurr).
% 94.29/93.66  0 [] -v3198(VarCurr)| -v1908(VarCurr).
% 94.29/93.66  0 [] -v3197(VarCurr)|v318(VarCurr).
% 94.29/93.66  0 [] -v3197(VarCurr)|v664(VarCurr).
% 94.29/93.66  0 [] v3197(VarCurr)| -v318(VarCurr)| -v664(VarCurr).
% 94.29/93.66  0 [] -v3109(VarCurr)|v3111(VarCurr).
% 94.29/93.66  0 [] v3109(VarCurr)| -v3111(VarCurr).
% 94.29/93.66  0 [] -v3111(VarCurr)|v3113(VarCurr).
% 94.29/93.66  0 [] v3111(VarCurr)| -v3113(VarCurr).
% 94.29/93.66  0 [] -v3113(VarCurr)|v3115(VarCurr).
% 94.29/93.66  0 [] v3113(VarCurr)| -v3115(VarCurr).
% 94.29/93.66  0 [] -v3115(VarCurr)|v3117(VarCurr).
% 94.29/93.66  0 [] v3115(VarCurr)| -v3117(VarCurr).
% 94.29/93.66  0 [] -v3117(VarCurr)|v1918(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v3117(VarCurr)| -v1918(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1918(VarCurr,bitIndex1)|v1920(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v1918(VarCurr,bitIndex1)| -v1920(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1920(VarCurr,bitIndex1)|v1922(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v1920(VarCurr,bitIndex1)| -v1922(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1922(VarCurr,bitIndex1)|v1924(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v1922(VarCurr,bitIndex1)| -v1924(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1924(VarCurr,bitIndex1)|v1926(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v1924(VarCurr,bitIndex1)| -v1926(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1926(VarCurr,bitIndex1)|v1928(VarCurr,bitIndex1).
% 94.29/93.66  0 [] v1926(VarCurr,bitIndex1)| -v1928(VarCurr,bitIndex1).
% 94.29/93.66  0 [] -v1928(VarCurr,bitIndex1)|v3119(VarCurr).
% 94.29/93.66  0 [] v1928(VarCurr,bitIndex1)| -v3119(VarCurr).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)|v3150(VarNext)| -v3119(VarNext)|v3119(VarCurr).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)|v3150(VarNext)|v3119(VarNext)| -v3119(VarCurr).
% 94.29/93.66  0 [] -v3150(VarNext)| -v3119(VarNext)|v3185(VarNext).
% 94.29/93.66  0 [] -v3150(VarNext)|v3119(VarNext)| -v3185(VarNext).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)| -v3185(VarNext)|v3183(VarCurr).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)|v3185(VarNext)| -v3183(VarCurr).
% 94.29/93.66  0 [] v3121(VarCurr)| -v3183(VarCurr)|v3186(VarCurr).
% 94.29/93.66  0 [] v3121(VarCurr)|v3183(VarCurr)| -v3186(VarCurr).
% 94.29/93.66  0 [] -v3121(VarCurr)| -v3183(VarCurr)|v3123(VarCurr).
% 94.29/93.66  0 [] -v3121(VarCurr)|v3183(VarCurr)| -v3123(VarCurr).
% 94.29/93.66  0 [] v3163(VarCurr)| -v3186(VarCurr)|v3145(VarCurr).
% 94.29/93.66  0 [] v3163(VarCurr)|v3186(VarCurr)| -v3145(VarCurr).
% 94.29/93.66  0 [] -v3163(VarCurr)| -v3186(VarCurr)|v3187(VarCurr).
% 94.29/93.66  0 [] -v3163(VarCurr)|v3186(VarCurr)| -v3187(VarCurr).
% 94.29/93.66  0 [] v3166(VarCurr)|v3168(VarCurr)| -v3187(VarCurr)|v3191(VarCurr).
% 94.29/93.66  0 [] v3166(VarCurr)|v3168(VarCurr)|v3187(VarCurr)| -v3191(VarCurr).
% 94.29/93.66  0 [] -v3168(VarCurr)| -v3187(VarCurr)|v3190(VarCurr).
% 94.29/93.66  0 [] -v3168(VarCurr)|v3187(VarCurr)| -v3190(VarCurr).
% 94.29/93.66  0 [] -v3166(VarCurr)| -v3187(VarCurr)|v3188(VarCurr).
% 94.29/93.66  0 [] -v3166(VarCurr)|v3187(VarCurr)| -v3188(VarCurr).
% 94.29/93.66  0 [] v3176(VarCurr)| -v3191(VarCurr)|v3145(VarCurr).
% 94.29/93.66  0 [] v3176(VarCurr)|v3191(VarCurr)| -v3145(VarCurr).
% 94.29/93.66  0 [] -v3176(VarCurr)| -v3191(VarCurr)|$T.
% 94.29/93.66  0 [] -v3176(VarCurr)|v3191(VarCurr)| -$T.
% 94.29/93.66  0 [] v3170(VarCurr)| -v3190(VarCurr)|v3145(VarCurr).
% 94.29/93.66  0 [] v3170(VarCurr)|v3190(VarCurr)| -v3145(VarCurr).
% 94.29/93.66  0 [] -v3170(VarCurr)| -v3190(VarCurr)|$F.
% 94.29/93.66  0 [] -v3170(VarCurr)|v3190(VarCurr)| -$F.
% 94.29/93.66  0 [] v3189(VarCurr)| -v3188(VarCurr)|$F.
% 94.29/93.66  0 [] v3189(VarCurr)|v3188(VarCurr)| -$F.
% 94.29/93.66  0 [] -v3189(VarCurr)| -v3188(VarCurr)|$T.
% 94.29/93.66  0 [] -v3189(VarCurr)|v3188(VarCurr)| -$T.
% 94.29/93.66  0 [] -v3189(VarCurr)| -v3131(VarCurr)|$T.
% 94.29/93.66  0 [] -v3189(VarCurr)|v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] v3189(VarCurr)|v3131(VarCurr)|$T.
% 94.29/93.66  0 [] v3189(VarCurr)| -v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)| -v3150(VarNext)|v3151(VarNext).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)| -v3150(VarNext)|v3160(VarNext).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)|v3150(VarNext)| -v3151(VarNext)| -v3160(VarNext).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)| -v3160(VarNext)|v3158(VarCurr).
% 94.29/93.66  0 [] -nextState(VarCurr,VarNext)|v3160(VarNext)| -v3158(VarCurr).
% 94.29/93.66  0 [] -v3158(VarCurr)|v3121(VarCurr)|v3161(VarCurr).
% 94.29/93.66  0 [] v3158(VarCurr)| -v3121(VarCurr).
% 94.29/93.66  0 [] v3158(VarCurr)| -v3161(VarCurr).
% 94.29/93.66  0 [] -v3161(VarCurr)|v3162(VarCurr).
% 94.29/93.66  0 [] -v3161(VarCurr)|v3182(VarCurr).
% 94.29/93.66  0 [] v3161(VarCurr)| -v3162(VarCurr)| -v3182(VarCurr).
% 94.29/93.66  0 [] v3182(VarCurr)|v3121(VarCurr).
% 94.29/93.66  0 [] -v3182(VarCurr)| -v3121(VarCurr).
% 94.29/93.66  0 [] -v3162(VarCurr)|v3163(VarCurr)|v3180(VarCurr).
% 94.29/93.66  0 [] v3162(VarCurr)| -v3163(VarCurr).
% 94.29/93.66  0 [] v3162(VarCurr)| -v3180(VarCurr).
% 94.29/93.66  0 [] -v3180(VarCurr)|v3141(VarCurr).
% 94.29/93.66  0 [] -v3180(VarCurr)|v3181(VarCurr).
% 94.29/93.66  0 [] v3180(VarCurr)| -v3141(VarCurr)| -v3181(VarCurr).
% 94.29/93.66  0 [] v3181(VarCurr)|v3143(VarCurr).
% 94.29/93.66  0 [] -v3181(VarCurr)| -v3143(VarCurr).
% 94.29/93.66  0 [] -v3163(VarCurr)|v3164(VarCurr).
% 94.29/93.66  0 [] -v3163(VarCurr)|v3143(VarCurr).
% 94.29/93.66  0 [] v3163(VarCurr)| -v3164(VarCurr)| -v3143(VarCurr).
% 94.29/93.66  0 [] -v3164(VarCurr)|v3165(VarCurr)|v3174(VarCurr).
% 94.29/93.66  0 [] v3164(VarCurr)| -v3165(VarCurr).
% 94.29/93.66  0 [] v3164(VarCurr)| -v3174(VarCurr).
% 94.29/93.66  0 [] -v3174(VarCurr)|v3175(VarCurr).
% 94.29/93.66  0 [] -v3174(VarCurr)|v3179(VarCurr).
% 94.29/93.66  0 [] v3174(VarCurr)| -v3175(VarCurr)| -v3179(VarCurr).
% 94.29/93.66  0 [] -v3179(VarCurr)| -v3167(VarCurr,bitIndex2)|$F.
% 94.29/93.66  0 [] -v3179(VarCurr)|v3167(VarCurr,bitIndex2)| -$F.
% 94.29/93.66  0 [] -v3179(VarCurr)| -v3167(VarCurr,bitIndex1)|$F.
% 94.29/93.66  0 [] -v3179(VarCurr)|v3167(VarCurr,bitIndex1)| -$F.
% 94.29/93.66  0 [] -v3179(VarCurr)| -v3167(VarCurr,bitIndex0)|$T.
% 94.29/93.66  0 [] -v3179(VarCurr)|v3167(VarCurr,bitIndex0)| -$T.
% 94.29/93.66  0 [] v3179(VarCurr)|v3167(VarCurr,bitIndex2)|$F|v3167(VarCurr,bitIndex1)|v3167(VarCurr,bitIndex0)|$T.
% 94.29/93.66  0 [] v3179(VarCurr)|v3167(VarCurr,bitIndex2)|$F|v3167(VarCurr,bitIndex1)| -v3167(VarCurr,bitIndex0)| -$T.
% 94.29/93.66  0 [] v3179(VarCurr)| -v3167(VarCurr,bitIndex2)| -$F| -v3167(VarCurr,bitIndex1)|v3167(VarCurr,bitIndex0)|$T.
% 94.29/93.66  0 [] v3179(VarCurr)| -v3167(VarCurr,bitIndex2)| -$F| -v3167(VarCurr,bitIndex1)| -v3167(VarCurr,bitIndex0)| -$T.
% 94.29/93.66  0 [] -v3175(VarCurr)|v3176(VarCurr)|v3177(VarCurr).
% 94.29/93.66  0 [] v3175(VarCurr)| -v3176(VarCurr).
% 94.29/93.66  0 [] v3175(VarCurr)| -v3177(VarCurr).
% 94.29/93.66  0 [] -v3177(VarCurr)|v3141(VarCurr).
% 94.29/93.66  0 [] -v3177(VarCurr)|v3178(VarCurr).
% 94.29/93.66  0 [] v3177(VarCurr)| -v3141(VarCurr)| -v3178(VarCurr).
% 94.29/93.66  0 [] v3178(VarCurr)|v3176(VarCurr).
% 94.29/93.66  0 [] -v3178(VarCurr)| -v3176(VarCurr).
% 94.29/93.66  0 [] -v3176(VarCurr)| -v3131(VarCurr)|$T.
% 94.29/93.66  0 [] -v3176(VarCurr)|v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] v3176(VarCurr)|v3131(VarCurr)|$T.
% 94.29/93.66  0 [] v3176(VarCurr)| -v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] -v3165(VarCurr)|v3166(VarCurr)|v3168(VarCurr).
% 94.29/93.66  0 [] v3165(VarCurr)| -v3166(VarCurr).
% 94.29/93.66  0 [] v3165(VarCurr)| -v3168(VarCurr).
% 94.29/93.66  0 [] -v3168(VarCurr)|v3169(VarCurr).
% 94.29/93.66  0 [] -v3168(VarCurr)|v3173(VarCurr).
% 94.29/93.66  0 [] v3168(VarCurr)| -v3169(VarCurr)| -v3173(VarCurr).
% 94.29/93.66  0 [] -v3173(VarCurr)| -v3167(VarCurr,bitIndex2)|$F.
% 94.29/93.66  0 [] -v3173(VarCurr)|v3167(VarCurr,bitIndex2)| -$F.
% 94.29/93.66  0 [] -v3173(VarCurr)| -v3167(VarCurr,bitIndex1)|$T.
% 94.29/93.66  0 [] -v3173(VarCurr)|v3167(VarCurr,bitIndex1)| -$T.
% 94.29/93.66  0 [] -v3173(VarCurr)| -v3167(VarCurr,bitIndex0)|$F.
% 94.29/93.66  0 [] -v3173(VarCurr)|v3167(VarCurr,bitIndex0)| -$F.
% 94.29/93.66  0 [] v3173(VarCurr)|v3167(VarCurr,bitIndex2)|$F|v3167(VarCurr,bitIndex1)|$T|v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3173(VarCurr)|v3167(VarCurr,bitIndex2)|$F| -v3167(VarCurr,bitIndex1)| -$T|v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3173(VarCurr)| -v3167(VarCurr,bitIndex2)| -$F|v3167(VarCurr,bitIndex1)|$T| -v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3173(VarCurr)| -v3167(VarCurr,bitIndex2)| -$F| -v3167(VarCurr,bitIndex1)| -$T| -v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] -v3169(VarCurr)|v3170(VarCurr)|v3171(VarCurr).
% 94.29/93.66  0 [] v3169(VarCurr)| -v3170(VarCurr).
% 94.29/93.66  0 [] v3169(VarCurr)| -v3171(VarCurr).
% 94.29/93.66  0 [] -v3171(VarCurr)|v3141(VarCurr).
% 94.29/93.66  0 [] -v3171(VarCurr)|v3172(VarCurr).
% 94.29/93.66  0 [] v3171(VarCurr)| -v3141(VarCurr)| -v3172(VarCurr).
% 94.29/93.66  0 [] v3172(VarCurr)|v3170(VarCurr).
% 94.29/93.66  0 [] -v3172(VarCurr)| -v3170(VarCurr).
% 94.29/93.66  0 [] -v3170(VarCurr)| -v3131(VarCurr)|$T.
% 94.29/93.66  0 [] -v3170(VarCurr)|v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] v3170(VarCurr)|v3131(VarCurr)|$T.
% 94.29/93.66  0 [] v3170(VarCurr)| -v3131(VarCurr)| -$T.
% 94.29/93.66  0 [] -v3166(VarCurr)| -v3167(VarCurr,bitIndex2)|$T.
% 94.29/93.66  0 [] -v3166(VarCurr)|v3167(VarCurr,bitIndex2)| -$T.
% 94.29/93.66  0 [] -v3166(VarCurr)| -v3167(VarCurr,bitIndex1)|$F.
% 94.29/93.66  0 [] -v3166(VarCurr)|v3167(VarCurr,bitIndex1)| -$F.
% 94.29/93.66  0 [] -v3166(VarCurr)| -v3167(VarCurr,bitIndex0)|$F.
% 94.29/93.66  0 [] -v3166(VarCurr)|v3167(VarCurr,bitIndex0)| -$F.
% 94.29/93.66  0 [] v3166(VarCurr)|v3167(VarCurr,bitIndex2)|$T|v3167(VarCurr,bitIndex1)|$F|v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3166(VarCurr)|v3167(VarCurr,bitIndex2)|$T| -v3167(VarCurr,bitIndex1)| -$F| -v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3166(VarCurr)| -v3167(VarCurr,bitIndex2)| -$T|v3167(VarCurr,bitIndex1)|$F|v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] v3166(VarCurr)| -v3167(VarCurr,bitIndex2)| -$T| -v3167(VarCurr,bitIndex1)| -$F| -v3167(VarCurr,bitIndex0).
% 94.29/93.66  0 [] -v3167(VarCurr,bitIndex0)|v3129(VarCurr).
% 94.29/93.66  0 [] v3167(VarCurr,bitIndex0)| -v3129(VarCurr).
% 94.29/93.66  0 [] -v3167(VarCurr,bitIndex1)|v3127(VarCurr).
% 94.29/93.66  0 [] v3167(VarCurr,bitIndex1)| -v3127(VarCurr).
% 94.29/93.67  0 [] -v3167(VarCurr,bitIndex2)|v3125(VarCurr).
% 94.29/93.67  0 [] v3167(VarCurr,bitIndex2)| -v3125(VarCurr).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3151(VarNext)|v3152(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3151(VarNext)|v3147(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3151(VarNext)| -v3152(VarNext)| -v3147(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3152(VarNext)|v3154(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3152(VarNext)| -v3154(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3154(VarNext)|v3147(VarCurr).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3154(VarNext)| -v3147(VarCurr).
% 94.29/93.67  0 [] -v3147(VarCurr)|v2207(VarCurr).
% 94.29/93.67  0 [] v3147(VarCurr)| -v2207(VarCurr).
% 94.29/93.67  0 [] -v3145(VarCurr)|$F.
% 94.29/93.67  0 [] v3145(VarCurr)| -$F.
% 94.29/93.67  0 [] -v3143(VarCurr)|v2045(VarCurr).
% 94.29/93.67  0 [] v3143(VarCurr)| -v2045(VarCurr).
% 94.29/93.67  0 [] -v3141(VarCurr)|$F.
% 94.29/93.67  0 [] v3141(VarCurr)| -$F.
% 94.29/93.67  0 [] -v3131(VarCurr)|v1966(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v3131(VarCurr)| -v1966(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1966(VarCurr,bitIndex1)|v1968(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1966(VarCurr,bitIndex1)| -v1968(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1968(VarCurr,bitIndex1)|v1970(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1968(VarCurr,bitIndex1)| -v1970(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1970(VarCurr,bitIndex1)|v1972(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1970(VarCurr,bitIndex1)| -v1972(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1972(VarCurr,bitIndex1)|v1974(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1972(VarCurr,bitIndex1)| -v1974(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1974(VarCurr,bitIndex1)|v1976(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1974(VarCurr,bitIndex1)| -v1976(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1976(VarCurr,bitIndex1)|v1978(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1976(VarCurr,bitIndex1)| -v1978(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1978(VarCurr,bitIndex1)|v1980(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1978(VarCurr,bitIndex1)| -v1980(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1980(VarCurr,bitIndex1)|v1982(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1980(VarCurr,bitIndex1)| -v1982(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1982(VarNext,bitIndex1)|v3133(VarNext,bitIndex1).
% 94.29/93.67  0 [] v1982(VarNext,bitIndex1)| -v3133(VarNext,bitIndex1).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3134(VarNext)| -range_63_0(B)| -v3133(VarNext,B)|v1982(VarCurr,B).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3134(VarNext)| -range_63_0(B)|v3133(VarNext,B)| -v1982(VarCurr,B).
% 94.29/93.67  0 [] -v3134(VarNext)| -range_63_0(B)| -v3133(VarNext,B)|v2034(VarNext,B).
% 94.29/93.67  0 [] -v3134(VarNext)| -range_63_0(B)|v3133(VarNext,B)| -v2034(VarNext,B).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3134(VarNext)|v3135(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3134(VarNext)| -v3135(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3135(VarNext)|v3137(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3135(VarNext)|v2013(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3135(VarNext)| -v3137(VarNext)| -v2013(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)|v3137(VarNext)|v2028(VarNext).
% 94.29/93.67  0 [] -nextState(VarCurr,VarNext)| -v3137(VarNext)| -v2028(VarNext).
% 94.29/93.67  0 [] -v1987(VarCurr,bitIndex1)|v1989(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1987(VarCurr,bitIndex1)| -v1989(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1989(VarCurr,bitIndex1)|v1991(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1989(VarCurr,bitIndex1)| -v1991(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1991(VarCurr,bitIndex1)|v1993(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1991(VarCurr,bitIndex1)| -v1993(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1993(VarCurr,bitIndex1)|v1995(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1993(VarCurr,bitIndex1)| -v1995(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1995(VarCurr,bitIndex1)|v1997(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1995(VarCurr,bitIndex1)| -v1997(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1997(VarCurr,bitIndex1)|v1999(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1997(VarCurr,bitIndex1)| -v1999(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v1999(VarCurr,bitIndex1)|v2001(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v1999(VarCurr,bitIndex1)| -v2001(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v2001(VarCurr,bitIndex1)|v2003(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v2001(VarCurr,bitIndex1)| -v2003(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v2003(VarCurr,bitIndex1)|v2005(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v2003(VarCurr,bitIndex1)| -v2005(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v2005(VarCurr,bitIndex1)|v2007(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v2005(VarCurr,bitIndex1)| -v2007(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v2007(VarCurr,bitIndex1)|v2009(VarCurr,bitIndex1).
% 94.29/93.67  0 [] v2007(VarCurr,bitIndex1)| -v2009(VarCurr,bitIndex1).
% 94.29/93.67  0 [] -v3129(VarCurr)|$F.
% 94.29/93.67  0 [] v3129(VarCurr)| -$F.
% 94.31/93.67  0 [] -v3127(VarCurr)|$F.
% 94.31/93.67  0 [] v3127(VarCurr)| -$F.
% 94.31/93.67  0 [] -v3125(VarCurr)|$T.
% 94.31/93.67  0 [] v3125(VarCurr)| -$T.
% 94.31/93.67  0 [] -v3123(VarCurr)|$F.
% 94.31/93.67  0 [] v3123(VarCurr)| -$F.
% 94.31/93.67  0 [] -v3121(VarCurr)|v1934(VarCurr).
% 94.31/93.67  0 [] v3121(VarCurr)| -v1934(VarCurr).
% 94.31/93.67  0 [] -v3103(VarCurr)|v3105(VarCurr).
% 94.31/93.67  0 [] v3103(VarCurr)| -v3105(VarCurr).
% 94.31/93.67  0 [] -v3105(VarCurr)|v3107(VarCurr).
% 94.31/93.67  0 [] v3105(VarCurr)| -v3107(VarCurr).
% 94.31/93.67  0 [] -v2259(VarCurr)|v3094(VarCurr).
% 94.31/93.67  0 [] -v2259(VarCurr)|v1908(VarCurr).
% 94.31/93.67  0 [] v2259(VarCurr)| -v3094(VarCurr)| -v1908(VarCurr).
% 94.31/93.67  0 [] -v3094(VarCurr)|v3095(VarCurr)|v3098(VarCurr).
% 94.31/93.67  0 [] v3094(VarCurr)| -v3095(VarCurr).
% 94.31/93.67  0 [] v3094(VarCurr)| -v3098(VarCurr).
% 94.31/93.67  0 [] -v3098(VarCurr)|v3099(VarCurr).
% 94.31/93.67  0 [] -v3098(VarCurr)|v3100(VarCurr).
% 94.31/93.67  0 [] v3098(VarCurr)| -v3099(VarCurr)| -v3100(VarCurr).
% 94.31/93.67  0 [] -v3100(VarCurr)| -v3101(VarCurr,bitIndex4)|$T.
% 94.31/93.67  0 [] -v3100(VarCurr)|v3101(VarCurr,bitIndex4)| -$T.
% 94.31/93.67  0 [] -v3100(VarCurr)| -v3101(VarCurr,bitIndex3)|$T.
% 94.31/93.67  0 [] -v3100(VarCurr)|v3101(VarCurr,bitIndex3)| -$T.
% 94.31/93.67  0 [] -v3100(VarCurr)| -v3101(VarCurr,bitIndex2)|$T.
% 94.31/93.67  0 [] -v3100(VarCurr)|v3101(VarCurr,bitIndex2)| -$T.
% 94.31/93.67  0 [] -v3100(VarCurr)| -v3101(VarCurr,bitIndex1)|$T.
% 94.31/93.67  0 [] -v3100(VarCurr)|v3101(VarCurr,bitIndex1)| -$T.
% 94.31/93.67  0 [] -v3100(VarCurr)| -v3101(VarCurr,bitIndex0)|$T.
% 94.31/93.67  0 [] -v3100(VarCurr)|v3101(VarCurr,bitIndex0)| -$T.
% 94.31/93.67  0 [] v3100(VarCurr)|v3101(VarCurr,bitIndex4)|$T|v3101(VarCurr,bitIndex3)|v3101(VarCurr,bitIndex2)|v3101(VarCurr,bitIndex1)|v3101(VarCurr,bitIndex0).
% 94.31/93.67  0 [] v3100(VarCurr)| -v3101(VarCurr,bitIndex4)| -$T| -v3101(VarCurr,bitIndex3)| -v3101(VarCurr,bitIndex2)| -v3101(VarCurr,bitIndex1)| -v3101(VarCurr,bitIndex0).
% 94.31/93.67  0 [] -v3101(VarCurr,bitIndex0)|v3054(VarCurr).
% 94.31/93.67  0 [] v3101(VarCurr,bitIndex0)| -v3054(VarCurr).
% 94.31/93.67  0 [] -v3101(VarCurr,bitIndex1)|v3049(VarCurr).
% 94.31/93.67  0 [] v3101(VarCurr,bitIndex1)| -v3049(VarCurr).
% 94.31/93.67  0 [] -v3101(VarCurr,bitIndex2)|v3044(VarCurr).
% 94.31/93.67  0 [] v3101(VarCurr,bitIndex2)| -v3044(VarCurr).
% 94.31/93.67  0 [] -v3101(VarCurr,bitIndex3)|v3039(VarCurr).
% 94.31/93.67  0 [] v3101(VarCurr,bitIndex3)| -v3039(VarCurr).
% 94.31/93.67  0 [] -v3101(VarCurr,bitIndex4)|v3012(VarCurr).
% 94.31/93.67  0 [] v3101(VarCurr,bitIndex4)| -v3012(VarCurr).
% 94.31/93.67  0 [] -v3099(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.67  0 [] -v3099(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.67  0 [] -v3099(VarCurr)| -v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.67  0 [] -v3099(VarCurr)|v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.67  0 [] v3099(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.67  0 [] v3099(VarCurr)|v2261(VarCurr,bitIndex1)|$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.67  0 [] v3099(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.67  0 [] v3099(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.67  0 [] -v3095(VarCurr)|v3096(VarCurr)|v3097(VarCurr).
% 94.31/93.67  0 [] v3095(VarCurr)| -v3096(VarCurr).
% 94.31/93.67  0 [] v3095(VarCurr)| -v3097(VarCurr).
% 94.31/93.67  0 [] -v3097(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.67  0 [] -v3097(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.67  0 [] -v3097(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.67  0 [] -v3097(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.67  0 [] v3097(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0).
% 94.31/93.67  0 [] v3097(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0).
% 94.31/93.67  0 [] -v3096(VarCurr)| -v2261(VarCurr,bitIndex1)|$F.
% 94.31/93.67  0 [] -v3096(VarCurr)|v2261(VarCurr,bitIndex1)| -$F.
% 94.31/93.67  0 [] -v3096(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.67  0 [] -v3096(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.67  0 [] v3096(VarCurr)|v2261(VarCurr,bitIndex1)|$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.67  0 [] v3096(VarCurr)|v2261(VarCurr,bitIndex1)|$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.67  0 [] v3096(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.67  0 [] v3096(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.67  0 [] -nextState(VarCurr,VarNext)|v3079(VarNext)| -range_1_0(B)| -v2261(VarNext,B)|v2261(VarCurr,B).
% 94.31/93.67  0 [] -nextState(VarCurr,VarNext)|v3079(VarNext)| -range_1_0(B)|v2261(VarNext,B)| -v2261(VarCurr,B).
% 94.31/93.67  0 [] -v3079(VarNext)| -range_1_0(B)| -v2261(VarNext,B)|v3087(VarNext,B).
% 94.31/93.67  0 [] -v3079(VarNext)| -range_1_0(B)|v2261(VarNext,B)| -v3087(VarNext,B).
% 94.31/93.67  0 [] -nextState(VarCurr,VarNext)| -range_1_0(B)| -v3087(VarNext,B)|v3085(VarCurr,B).
% 94.31/93.67  0 [] -nextState(VarCurr,VarNext)| -range_1_0(B)|v3087(VarNext,B)| -v3085(VarCurr,B).
% 94.31/93.68  0 [] v3088(VarCurr)| -range_1_0(B)| -v3085(VarCurr,B)|v2263(VarCurr,B).
% 94.31/93.68  0 [] v3088(VarCurr)| -range_1_0(B)|v3085(VarCurr,B)| -v2263(VarCurr,B).
% 94.31/93.68  0 [] -v3088(VarCurr)| -range_1_0(B)| -v3085(VarCurr,B)|$F.
% 94.31/93.68  0 [] -v3088(VarCurr)| -range_1_0(B)|v3085(VarCurr,B)| -$F.
% 94.31/93.68  0 [] -v3088(VarCurr)|v3089(VarCurr)|v3090(VarCurr).
% 94.31/93.68  0 [] v3088(VarCurr)| -v3089(VarCurr).
% 94.31/93.68  0 [] v3088(VarCurr)| -v3090(VarCurr).
% 94.31/93.68  0 [] v3090(VarCurr)|v1908(VarCurr).
% 94.31/93.68  0 [] -v3090(VarCurr)| -v1908(VarCurr).
% 94.31/93.68  0 [] v3089(VarCurr)|v12(VarCurr).
% 94.31/93.68  0 [] -v3089(VarCurr)| -v12(VarCurr).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)| -v3079(VarNext)|v3080(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)|v3079(VarNext)| -v3080(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)| -v3080(VarNext)|v3081(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)| -v3080(VarNext)|v288(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)|v3080(VarNext)| -v3081(VarNext)| -v288(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)|v3081(VarNext)|v1891(VarNext).
% 94.31/93.68  0 [] -nextState(VarCurr,VarNext)| -v3081(VarNext)| -v1891(VarNext).
% 94.31/93.68  0 [] v2988(VarCurr)|v2992(VarCurr)|v3004(VarCurr)| -range_1_0(B)| -v2263(VarCurr,B)|v3058(VarCurr,B).
% 94.31/93.68  0 [] v2988(VarCurr)|v2992(VarCurr)|v3004(VarCurr)| -range_1_0(B)|v2263(VarCurr,B)| -v3058(VarCurr,B).
% 94.31/93.68  0 [] -v3004(VarCurr)| -range_1_0(B)| -v2263(VarCurr,B)|v3005(VarCurr,B).
% 94.31/93.68  0 [] -v3004(VarCurr)| -range_1_0(B)|v2263(VarCurr,B)| -v3005(VarCurr,B).
% 94.31/93.68  0 [] -v2992(VarCurr)| -range_1_0(B)| -v2263(VarCurr,B)|v2993(VarCurr,B).
% 94.31/93.68  0 [] -v2992(VarCurr)| -range_1_0(B)|v2263(VarCurr,B)| -v2993(VarCurr,B).
% 94.31/93.68  0 [] -v2988(VarCurr)| -range_1_0(B)| -v2263(VarCurr,B)|v2989(VarCurr,B).
% 94.31/93.68  0 [] -v2988(VarCurr)| -range_1_0(B)|v2263(VarCurr,B)| -v2989(VarCurr,B).
% 94.31/93.68  0 [] v741(VarCurr)| -range_1_0(B)| -v3058(VarCurr,B)|v3059(VarCurr,B).
% 94.31/93.68  0 [] v741(VarCurr)| -range_1_0(B)|v3058(VarCurr,B)| -v3059(VarCurr,B).
% 94.31/93.68  0 [] -v741(VarCurr)| -range_1_0(B)| -v3058(VarCurr,B)|b01(B).
% 94.31/93.68  0 [] -v741(VarCurr)| -range_1_0(B)|v3058(VarCurr,B)| -b01(B).
% 94.31/93.68  0 [] v3060(VarCurr)| -range_1_0(B)| -v3059(VarCurr,B)|v3061(VarCurr,B).
% 94.31/93.68  0 [] v3060(VarCurr)| -range_1_0(B)|v3059(VarCurr,B)| -v3061(VarCurr,B).
% 94.31/93.68  0 [] -v3060(VarCurr)| -range_1_0(B)| -v3059(VarCurr,B)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)| -range_1_0(B)|v3059(VarCurr,B)| -$F.
% 94.31/93.68  0 [] v3062(VarCurr)| -range_1_0(B)| -v3061(VarCurr,B)|$T.
% 94.31/93.68  0 [] v3062(VarCurr)| -range_1_0(B)|v3061(VarCurr,B)| -$T.
% 94.31/93.68  0 [] -v3062(VarCurr)| -range_1_0(B)| -v3061(VarCurr,B)|b10(B).
% 94.31/93.68  0 [] -v3062(VarCurr)| -range_1_0(B)|v3061(VarCurr,B)| -b10(B).
% 94.31/93.68  0 [] -v3062(VarCurr)|v3064(VarCurr)|v3066(VarCurr).
% 94.31/93.68  0 [] v3062(VarCurr)| -v3064(VarCurr).
% 94.31/93.68  0 [] v3062(VarCurr)| -v3066(VarCurr).
% 94.31/93.68  0 [] -v3066(VarCurr)|v3067(VarCurr).
% 94.31/93.68  0 [] -v3066(VarCurr)|v3065(VarCurr,bitIndex4).
% 94.31/93.68  0 [] v3066(VarCurr)| -v3067(VarCurr)| -v3065(VarCurr,bitIndex4).
% 94.31/93.68  0 [] -v3067(VarCurr)|v3068(VarCurr)|v3069(VarCurr).
% 94.31/93.68  0 [] v3067(VarCurr)| -v3068(VarCurr).
% 94.31/93.68  0 [] v3067(VarCurr)| -v3069(VarCurr).
% 94.31/93.68  0 [] -v3069(VarCurr)|v3070(VarCurr).
% 94.31/93.68  0 [] -v3069(VarCurr)|v3065(VarCurr,bitIndex3).
% 94.31/93.68  0 [] v3069(VarCurr)| -v3070(VarCurr)| -v3065(VarCurr,bitIndex3).
% 94.31/93.68  0 [] -v3070(VarCurr)|v3071(VarCurr)|v3072(VarCurr).
% 94.31/93.68  0 [] v3070(VarCurr)| -v3071(VarCurr).
% 94.31/93.68  0 [] v3070(VarCurr)| -v3072(VarCurr).
% 94.31/93.68  0 [] -v3072(VarCurr)|v3073(VarCurr).
% 94.31/93.68  0 [] -v3072(VarCurr)|v3065(VarCurr,bitIndex2).
% 94.31/93.68  0 [] v3072(VarCurr)| -v3073(VarCurr)| -v3065(VarCurr,bitIndex2).
% 94.31/93.68  0 [] -v3073(VarCurr)|v3074(VarCurr)|v3075(VarCurr).
% 94.31/93.68  0 [] v3073(VarCurr)| -v3074(VarCurr).
% 94.31/93.68  0 [] v3073(VarCurr)| -v3075(VarCurr).
% 94.31/93.68  0 [] -v3075(VarCurr)|v3076(VarCurr).
% 94.31/93.68  0 [] -v3075(VarCurr)|v3065(VarCurr,bitIndex1).
% 94.31/93.68  0 [] v3075(VarCurr)| -v3076(VarCurr)| -v3065(VarCurr,bitIndex1).
% 94.31/93.68  0 [] v3076(VarCurr)|v3065(VarCurr,bitIndex0).
% 94.31/93.68  0 [] -v3076(VarCurr)| -v3065(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3074(VarCurr)|v3065(VarCurr,bitIndex1).
% 94.31/93.68  0 [] -v3074(VarCurr)| -v3065(VarCurr,bitIndex1).
% 94.31/93.68  0 [] v3071(VarCurr)|v3065(VarCurr,bitIndex2).
% 94.31/93.68  0 [] -v3071(VarCurr)| -v3065(VarCurr,bitIndex2).
% 94.31/93.68  0 [] v3068(VarCurr)|v3065(VarCurr,bitIndex3).
% 94.31/93.68  0 [] -v3068(VarCurr)| -v3065(VarCurr,bitIndex3).
% 94.31/93.68  0 [] v3064(VarCurr)|v3065(VarCurr,bitIndex4).
% 94.31/93.68  0 [] -v3064(VarCurr)| -v3065(VarCurr,bitIndex4).
% 94.31/93.68  0 [] -v3065(VarCurr,bitIndex0)|v3054(VarCurr).
% 94.31/93.68  0 [] v3065(VarCurr,bitIndex0)| -v3054(VarCurr).
% 94.31/93.68  0 [] -v3065(VarCurr,bitIndex1)|v3049(VarCurr).
% 94.31/93.68  0 [] v3065(VarCurr,bitIndex1)| -v3049(VarCurr).
% 94.31/93.68  0 [] -v3065(VarCurr,bitIndex2)|v3044(VarCurr).
% 94.31/93.68  0 [] v3065(VarCurr,bitIndex2)| -v3044(VarCurr).
% 94.31/93.68  0 [] -v3065(VarCurr,bitIndex3)|v3039(VarCurr).
% 94.31/93.68  0 [] v3065(VarCurr,bitIndex3)| -v3039(VarCurr).
% 94.31/93.68  0 [] -v3065(VarCurr,bitIndex4)|v3012(VarCurr).
% 94.31/93.68  0 [] v3065(VarCurr,bitIndex4)| -v3012(VarCurr).
% 94.31/93.68  0 [] -v3060(VarCurr)| -v2290(VarCurr,bitIndex4)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)|v2290(VarCurr,bitIndex4)| -$F.
% 94.31/93.68  0 [] -v3060(VarCurr)| -v2290(VarCurr,bitIndex3)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)|v2290(VarCurr,bitIndex3)| -$F.
% 94.31/93.68  0 [] -v3060(VarCurr)| -v2290(VarCurr,bitIndex2)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)|v2290(VarCurr,bitIndex2)| -$F.
% 94.31/93.68  0 [] -v3060(VarCurr)| -v2290(VarCurr,bitIndex1)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)|v2290(VarCurr,bitIndex1)| -$F.
% 94.31/93.68  0 [] -v3060(VarCurr)| -v2290(VarCurr,bitIndex0)|$F.
% 94.31/93.68  0 [] -v3060(VarCurr)|v2290(VarCurr,bitIndex0)| -$F.
% 94.31/93.68  0 [] v3060(VarCurr)|v2290(VarCurr,bitIndex4)|$F|v2290(VarCurr,bitIndex3)|v2290(VarCurr,bitIndex2)|v2290(VarCurr,bitIndex1)|v2290(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3060(VarCurr)| -v2290(VarCurr,bitIndex4)| -$F| -v2290(VarCurr,bitIndex3)| -v2290(VarCurr,bitIndex2)| -v2290(VarCurr,bitIndex1)| -v2290(VarCurr,bitIndex0).
% 94.31/93.68  0 [] -v3057(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.68  0 [] -v3057(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.68  0 [] -v3057(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.68  0 [] -v3057(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.68  0 [] v3057(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3057(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v741(VarCurr)| -range_1_0(B)| -v3005(VarCurr,B)|v3006(VarCurr,B).
% 94.31/93.68  0 [] v741(VarCurr)| -range_1_0(B)|v3005(VarCurr,B)| -v3006(VarCurr,B).
% 94.31/93.68  0 [] -v741(VarCurr)| -range_1_0(B)| -v3005(VarCurr,B)|b01(B).
% 94.31/93.68  0 [] -v741(VarCurr)| -range_1_0(B)|v3005(VarCurr,B)| -b01(B).
% 94.31/93.68  0 [] v3007(VarCurr)| -range_1_0(B)| -v3006(VarCurr,B)|v3008(VarCurr,B).
% 94.31/93.68  0 [] v3007(VarCurr)| -range_1_0(B)|v3006(VarCurr,B)| -v3008(VarCurr,B).
% 94.31/93.68  0 [] -v3007(VarCurr)| -range_1_0(B)| -v3006(VarCurr,B)|$F.
% 94.31/93.68  0 [] -v3007(VarCurr)| -range_1_0(B)|v3006(VarCurr,B)| -$F.
% 94.31/93.68  0 [] v3009(VarCurr)| -range_1_0(B)| -v3008(VarCurr,B)|b10(B).
% 94.31/93.68  0 [] v3009(VarCurr)| -range_1_0(B)|v3008(VarCurr,B)| -b10(B).
% 94.31/93.68  0 [] -v3009(VarCurr)| -range_1_0(B)| -v3008(VarCurr,B)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -range_1_0(B)|v3008(VarCurr,B)| -$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -v3010(VarCurr,bitIndex4)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)|v3010(VarCurr,bitIndex4)| -$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -v3010(VarCurr,bitIndex3)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)|v3010(VarCurr,bitIndex3)| -$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -v3010(VarCurr,bitIndex2)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)|v3010(VarCurr,bitIndex2)| -$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -v3010(VarCurr,bitIndex1)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)|v3010(VarCurr,bitIndex1)| -$T.
% 94.31/93.68  0 [] -v3009(VarCurr)| -v3010(VarCurr,bitIndex0)|$T.
% 94.31/93.68  0 [] -v3009(VarCurr)|v3010(VarCurr,bitIndex0)| -$T.
% 94.31/93.68  0 [] v3009(VarCurr)|v3010(VarCurr,bitIndex4)|$T|v3010(VarCurr,bitIndex3)|v3010(VarCurr,bitIndex2)|v3010(VarCurr,bitIndex1)|v3010(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3009(VarCurr)| -v3010(VarCurr,bitIndex4)| -$T| -v3010(VarCurr,bitIndex3)| -v3010(VarCurr,bitIndex2)| -v3010(VarCurr,bitIndex1)| -v3010(VarCurr,bitIndex0).
% 94.31/93.68  0 [] b11111(bitIndex4).
% 94.31/93.68  0 [] b11111(bitIndex3).
% 94.31/93.68  0 [] b11111(bitIndex2).
% 94.31/93.68  0 [] b11111(bitIndex1).
% 94.31/93.68  0 [] b11111(bitIndex0).
% 94.31/93.68  0 [] -v3010(VarCurr,bitIndex0)|v3054(VarCurr).
% 94.31/93.68  0 [] v3010(VarCurr,bitIndex0)| -v3054(VarCurr).
% 94.31/93.68  0 [] -v3010(VarCurr,bitIndex1)|v3049(VarCurr).
% 94.31/93.68  0 [] v3010(VarCurr,bitIndex1)| -v3049(VarCurr).
% 94.31/93.68  0 [] -v3010(VarCurr,bitIndex2)|v3044(VarCurr).
% 94.31/93.68  0 [] v3010(VarCurr,bitIndex2)| -v3044(VarCurr).
% 94.31/93.68  0 [] -v3010(VarCurr,bitIndex3)|v3039(VarCurr).
% 94.31/93.68  0 [] v3010(VarCurr,bitIndex3)| -v3039(VarCurr).
% 94.31/93.68  0 [] -v3010(VarCurr,bitIndex4)|v3012(VarCurr).
% 94.31/93.68  0 [] v3010(VarCurr,bitIndex4)| -v3012(VarCurr).
% 94.31/93.68  0 [] -v3054(VarCurr)|v3055(VarCurr).
% 94.31/93.68  0 [] -v3054(VarCurr)|v3056(VarCurr).
% 94.31/93.68  0 [] v3054(VarCurr)| -v3055(VarCurr)| -v3056(VarCurr).
% 94.31/93.68  0 [] -v3056(VarCurr)|v2290(VarCurr,bitIndex0)|v2927(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3056(VarCurr)| -v2290(VarCurr,bitIndex0).
% 94.31/93.68  0 [] v3056(VarCurr)| -v2927(VarCurr,bitIndex0).
% 94.31/93.68  0 [] -v3055(VarCurr)|v2898(VarCurr)|v2981(VarCurr).
% 94.31/93.68  0 [] v3055(VarCurr)| -v2898(VarCurr).
% 94.31/93.68  0 [] v3055(VarCurr)| -v2981(VarCurr).
% 94.31/93.68  0 [] -v3049(VarCurr)|v3050(VarCurr).
% 94.31/93.68  0 [] -v3049(VarCurr)|v3053(VarCurr).
% 94.31/93.68  0 [] v3049(VarCurr)| -v3050(VarCurr)| -v3053(VarCurr).
% 94.31/93.68  0 [] -v3053(VarCurr)|v3021(VarCurr)|v3022(VarCurr).
% 94.31/93.68  0 [] v3053(VarCurr)| -v3021(VarCurr).
% 94.31/93.68  0 [] v3053(VarCurr)| -v3022(VarCurr).
% 94.31/93.68  0 [] -v3050(VarCurr)|v3051(VarCurr)|v3052(VarCurr).
% 94.31/93.68  0 [] v3050(VarCurr)| -v3051(VarCurr).
% 94.31/93.68  0 [] v3050(VarCurr)| -v3052(VarCurr).
% 94.31/93.68  0 [] v3052(VarCurr)|v3022(VarCurr).
% 94.31/93.68  0 [] -v3052(VarCurr)| -v3022(VarCurr).
% 94.31/93.68  0 [] v3051(VarCurr)|v3021(VarCurr).
% 94.31/93.68  0 [] -v3051(VarCurr)| -v3021(VarCurr).
% 94.31/93.68  0 [] -v3044(VarCurr)|v3045(VarCurr).
% 94.31/93.68  0 [] -v3044(VarCurr)|v3048(VarCurr).
% 94.31/93.68  0 [] v3044(VarCurr)| -v3045(VarCurr)| -v3048(VarCurr).
% 94.31/93.68  0 [] -v3048(VarCurr)|v3019(VarCurr)|v3026(VarCurr).
% 94.31/93.68  0 [] v3048(VarCurr)| -v3019(VarCurr).
% 94.31/93.68  0 [] v3048(VarCurr)| -v3026(VarCurr).
% 94.31/93.68  0 [] -v3045(VarCurr)|v3046(VarCurr)|v3047(VarCurr).
% 94.31/93.68  0 [] v3045(VarCurr)| -v3046(VarCurr).
% 94.31/93.68  0 [] v3045(VarCurr)| -v3047(VarCurr).
% 94.31/93.68  0 [] v3047(VarCurr)|v3026(VarCurr).
% 94.31/93.68  0 [] -v3047(VarCurr)| -v3026(VarCurr).
% 94.31/93.68  0 [] v3046(VarCurr)|v3019(VarCurr).
% 94.31/93.68  0 [] -v3046(VarCurr)| -v3019(VarCurr).
% 94.31/93.68  0 [] -v3039(VarCurr)|v3040(VarCurr).
% 94.31/93.68  0 [] -v3039(VarCurr)|v3043(VarCurr).
% 94.31/93.68  0 [] v3039(VarCurr)| -v3040(VarCurr)| -v3043(VarCurr).
% 94.31/93.68  0 [] -v3043(VarCurr)|v3017(VarCurr)|v3030(VarCurr).
% 94.31/93.68  0 [] v3043(VarCurr)| -v3017(VarCurr).
% 94.31/93.68  0 [] v3043(VarCurr)| -v3030(VarCurr).
% 94.31/93.68  0 [] -v3040(VarCurr)|v3041(VarCurr)|v3042(VarCurr).
% 94.31/93.68  0 [] v3040(VarCurr)| -v3041(VarCurr).
% 94.31/93.68  0 [] v3040(VarCurr)| -v3042(VarCurr).
% 94.31/93.68  0 [] v3042(VarCurr)|v3030(VarCurr).
% 94.31/93.68  0 [] -v3042(VarCurr)| -v3030(VarCurr).
% 94.31/93.68  0 [] v3041(VarCurr)|v3017(VarCurr).
% 94.31/93.68  0 [] -v3041(VarCurr)| -v3017(VarCurr).
% 94.31/93.68  0 [] -v3012(VarCurr)|v3013(VarCurr).
% 94.31/93.68  0 [] -v3012(VarCurr)|v3038(VarCurr).
% 94.31/93.68  0 [] v3012(VarCurr)| -v3013(VarCurr)| -v3038(VarCurr).
% 94.31/93.68  0 [] -v3038(VarCurr)|v3015(VarCurr)|v3035(VarCurr).
% 94.31/93.68  0 [] v3038(VarCurr)| -v3015(VarCurr).
% 94.31/93.68  0 [] v3038(VarCurr)| -v3035(VarCurr).
% 94.31/93.68  0 [] -v3013(VarCurr)|v3014(VarCurr)|v3034(VarCurr).
% 94.31/93.68  0 [] v3013(VarCurr)| -v3014(VarCurr).
% 94.31/93.68  0 [] v3013(VarCurr)| -v3034(VarCurr).
% 94.31/93.68  0 [] v3034(VarCurr)|v3035(VarCurr).
% 94.31/93.68  0 [] -v3034(VarCurr)| -v3035(VarCurr).
% 94.31/93.68  0 [] -v3035(VarCurr)|v3036(VarCurr).
% 94.31/93.68  0 [] -v3035(VarCurr)|v3037(VarCurr).
% 94.31/93.68  0 [] v3035(VarCurr)| -v3036(VarCurr)| -v3037(VarCurr).
% 94.31/93.68  0 [] -v3037(VarCurr)|v2290(VarCurr,bitIndex4)|v2927(VarCurr,bitIndex4).
% 94.31/93.68  0 [] v3037(VarCurr)| -v2290(VarCurr,bitIndex4).
% 94.31/93.68  0 [] v3037(VarCurr)| -v2927(VarCurr,bitIndex4).
% 94.31/93.68  0 [] -v3036(VarCurr)|v2884(VarCurr)|v2967(VarCurr).
% 94.31/93.68  0 [] v3036(VarCurr)| -v2884(VarCurr).
% 94.31/93.68  0 [] v3036(VarCurr)| -v2967(VarCurr).
% 94.31/93.68  0 [] v3014(VarCurr)|v3015(VarCurr).
% 94.31/93.68  0 [] -v3014(VarCurr)| -v3015(VarCurr).
% 94.31/93.68  0 [] -v3015(VarCurr)|v3016(VarCurr)|v3033(VarCurr).
% 94.31/93.68  0 [] v3015(VarCurr)| -v3016(VarCurr).
% 94.31/93.68  0 [] v3015(VarCurr)| -v3033(VarCurr).
% 94.31/93.68  0 [] -v3033(VarCurr)|v2290(VarCurr,bitIndex3).
% 94.31/93.68  0 [] -v3033(VarCurr)|v2927(VarCurr,bitIndex3).
% 94.31/93.68  0 [] v3033(VarCurr)| -v2290(VarCurr,bitIndex3)| -v2927(VarCurr,bitIndex3).
% 94.31/93.68  0 [] -v3016(VarCurr)|v3017(VarCurr).
% 94.31/93.68  0 [] -v3016(VarCurr)|v3030(VarCurr).
% 94.31/93.68  0 [] v3016(VarCurr)| -v3017(VarCurr)| -v3030(VarCurr).
% 94.31/93.68  0 [] -v3030(VarCurr)|v3031(VarCurr).
% 94.31/93.68  0 [] -v3030(VarCurr)|v3032(VarCurr).
% 94.31/93.68  0 [] v3030(VarCurr)| -v3031(VarCurr)| -v3032(VarCurr).
% 94.31/93.68  0 [] -v3032(VarCurr)|v2290(VarCurr,bitIndex3)|v2927(VarCurr,bitIndex3).
% 94.31/93.68  0 [] v3032(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.68  0 [] v3032(VarCurr)| -v2927(VarCurr,bitIndex3).
% 94.31/93.68  0 [] -v3031(VarCurr)|v2889(VarCurr)|v2972(VarCurr).
% 94.31/93.68  0 [] v3031(VarCurr)| -v2889(VarCurr).
% 94.31/93.68  0 [] v3031(VarCurr)| -v2972(VarCurr).
% 94.31/93.68  0 [] -v3017(VarCurr)|v3018(VarCurr)|v3029(VarCurr).
% 94.31/93.68  0 [] v3017(VarCurr)| -v3018(VarCurr).
% 94.31/93.68  0 [] v3017(VarCurr)| -v3029(VarCurr).
% 94.31/93.68  0 [] -v3029(VarCurr)|v2290(VarCurr,bitIndex2).
% 94.31/93.68  0 [] -v3029(VarCurr)|v2927(VarCurr,bitIndex2).
% 94.31/93.68  0 [] v3029(VarCurr)| -v2290(VarCurr,bitIndex2)| -v2927(VarCurr,bitIndex2).
% 94.31/93.68  0 [] -v3018(VarCurr)|v3019(VarCurr).
% 94.31/93.68  0 [] -v3018(VarCurr)|v3026(VarCurr).
% 94.31/93.68  0 [] v3018(VarCurr)| -v3019(VarCurr)| -v3026(VarCurr).
% 94.31/93.68  0 [] -v3026(VarCurr)|v3027(VarCurr).
% 94.31/93.68  0 [] -v3026(VarCurr)|v3028(VarCurr).
% 94.31/93.68  0 [] v3026(VarCurr)| -v3027(VarCurr)| -v3028(VarCurr).
% 94.31/93.69  0 [] -v3028(VarCurr)|v2290(VarCurr,bitIndex2)|v2927(VarCurr,bitIndex2).
% 94.31/93.69  0 [] v3028(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.69  0 [] v3028(VarCurr)| -v2927(VarCurr,bitIndex2).
% 94.31/93.69  0 [] -v3027(VarCurr)|v2894(VarCurr)|v2977(VarCurr).
% 94.31/93.69  0 [] v3027(VarCurr)| -v2894(VarCurr).
% 94.31/93.69  0 [] v3027(VarCurr)| -v2977(VarCurr).
% 94.31/93.69  0 [] -v3019(VarCurr)|v3020(VarCurr)|v3025(VarCurr).
% 94.31/93.69  0 [] v3019(VarCurr)| -v3020(VarCurr).
% 94.31/93.69  0 [] v3019(VarCurr)| -v3025(VarCurr).
% 94.31/93.69  0 [] -v3025(VarCurr)|v2290(VarCurr,bitIndex1).
% 94.31/93.69  0 [] -v3025(VarCurr)|v2927(VarCurr,bitIndex1).
% 94.31/93.69  0 [] v3025(VarCurr)| -v2290(VarCurr,bitIndex1)| -v2927(VarCurr,bitIndex1).
% 94.31/93.69  0 [] -v3020(VarCurr)|v3021(VarCurr).
% 94.31/93.69  0 [] -v3020(VarCurr)|v3022(VarCurr).
% 94.31/93.69  0 [] v3020(VarCurr)| -v3021(VarCurr)| -v3022(VarCurr).
% 94.31/93.69  0 [] -v3022(VarCurr)|v3023(VarCurr).
% 94.31/93.69  0 [] -v3022(VarCurr)|v3024(VarCurr).
% 94.31/93.69  0 [] v3022(VarCurr)| -v3023(VarCurr)| -v3024(VarCurr).
% 94.31/93.69  0 [] -v3024(VarCurr)|v2290(VarCurr,bitIndex1)|v2927(VarCurr,bitIndex1).
% 94.31/93.69  0 [] v3024(VarCurr)| -v2290(VarCurr,bitIndex1).
% 94.31/93.69  0 [] v3024(VarCurr)| -v2927(VarCurr,bitIndex1).
% 94.31/93.69  0 [] -v3023(VarCurr)|v2899(VarCurr)|v2982(VarCurr).
% 94.31/93.69  0 [] v3023(VarCurr)| -v2899(VarCurr).
% 94.31/93.69  0 [] v3023(VarCurr)| -v2982(VarCurr).
% 94.31/93.69  0 [] -v3021(VarCurr)|v2290(VarCurr,bitIndex0).
% 94.31/93.69  0 [] -v3021(VarCurr)|v2927(VarCurr,bitIndex0).
% 94.31/93.69  0 [] v3021(VarCurr)| -v2290(VarCurr,bitIndex0)| -v2927(VarCurr,bitIndex0).
% 94.31/93.69  0 [] -v3007(VarCurr)| -v2290(VarCurr,bitIndex4)|$F.
% 94.31/93.69  0 [] -v3007(VarCurr)|v2290(VarCurr,bitIndex4)| -$F.
% 94.31/93.69  0 [] -v3007(VarCurr)| -v2290(VarCurr,bitIndex3)|$F.
% 94.31/93.69  0 [] -v3007(VarCurr)|v2290(VarCurr,bitIndex3)| -$F.
% 94.31/93.69  0 [] -v3007(VarCurr)| -v2290(VarCurr,bitIndex2)|$F.
% 94.31/93.69  0 [] -v3007(VarCurr)|v2290(VarCurr,bitIndex2)| -$F.
% 94.31/93.69  0 [] -v3007(VarCurr)| -v2290(VarCurr,bitIndex1)|$F.
% 94.31/93.69  0 [] -v3007(VarCurr)|v2290(VarCurr,bitIndex1)| -$F.
% 94.31/93.69  0 [] -v3007(VarCurr)| -v2290(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] -v3007(VarCurr)|v2290(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v3007(VarCurr)|v2290(VarCurr,bitIndex4)|$F|v2290(VarCurr,bitIndex3)|v2290(VarCurr,bitIndex2)|v2290(VarCurr,bitIndex1)|v2290(VarCurr,bitIndex0).
% 94.31/93.69  0 [] v3007(VarCurr)| -v2290(VarCurr,bitIndex4)| -$F| -v2290(VarCurr,bitIndex3)| -v2290(VarCurr,bitIndex2)| -v2290(VarCurr,bitIndex1)| -v2290(VarCurr,bitIndex0).
% 94.31/93.69  0 [] -v3004(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.69  0 [] -v3004(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.69  0 [] -v3004(VarCurr)| -v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] -v3004(VarCurr)|v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v3004(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] v3004(VarCurr)|v2261(VarCurr,bitIndex1)|$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v3004(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] v3004(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v2994(VarCurr)| -range_1_0(B)| -v2993(VarCurr,B)|v2996(VarCurr,B).
% 94.31/93.69  0 [] v2994(VarCurr)| -range_1_0(B)|v2993(VarCurr,B)| -v2996(VarCurr,B).
% 94.31/93.69  0 [] -v2994(VarCurr)| -range_1_0(B)| -v2993(VarCurr,B)|$F.
% 94.31/93.69  0 [] -v2994(VarCurr)| -range_1_0(B)|v2993(VarCurr,B)| -$F.
% 94.31/93.69  0 [] v2997(VarCurr)| -range_1_0(B)| -v2996(VarCurr,B)|b01(B).
% 94.31/93.69  0 [] v2997(VarCurr)| -range_1_0(B)|v2996(VarCurr,B)| -b01(B).
% 94.31/93.69  0 [] -v2997(VarCurr)| -range_1_0(B)| -v2996(VarCurr,B)|b10(B).
% 94.31/93.69  0 [] -v2997(VarCurr)| -range_1_0(B)|v2996(VarCurr,B)| -b10(B).
% 94.31/93.69  0 [] -v2997(VarCurr)|v320(VarCurr).
% 94.31/93.69  0 [] -v2997(VarCurr)|v2998(VarCurr).
% 94.31/93.69  0 [] v2997(VarCurr)| -v320(VarCurr)| -v2998(VarCurr).
% 94.31/93.69  0 [] v2998(VarCurr)|v3000(VarCurr).
% 94.31/93.69  0 [] -v2998(VarCurr)| -v3000(VarCurr).
% 94.31/93.69  0 [] -v3000(VarCurr)|v3001(VarCurr).
% 94.31/93.69  0 [] -v3000(VarCurr)|v2884(VarCurr).
% 94.31/93.69  0 [] v3000(VarCurr)| -v3001(VarCurr)| -v2884(VarCurr).
% 94.31/93.69  0 [] -v3001(VarCurr)|v3002(VarCurr).
% 94.31/93.69  0 [] -v3001(VarCurr)|v2889(VarCurr).
% 94.31/93.69  0 [] v3001(VarCurr)| -v3002(VarCurr)| -v2889(VarCurr).
% 94.31/93.69  0 [] -v3002(VarCurr)|v3003(VarCurr).
% 94.31/93.69  0 [] -v3002(VarCurr)|v2894(VarCurr).
% 94.31/93.69  0 [] v3002(VarCurr)| -v3003(VarCurr)| -v2894(VarCurr).
% 94.31/93.69  0 [] -v3003(VarCurr)|v2898(VarCurr).
% 94.31/93.69  0 [] -v3003(VarCurr)|v2899(VarCurr).
% 94.31/93.69  0 [] v3003(VarCurr)| -v2898(VarCurr)| -v2899(VarCurr).
% 94.31/93.69  0 [] -v2994(VarCurr)|v320(VarCurr).
% 94.31/93.69  0 [] -v2994(VarCurr)|v2995(VarCurr).
% 94.31/93.69  0 [] v2994(VarCurr)| -v320(VarCurr)| -v2995(VarCurr).
% 94.31/93.69  0 [] -v2995(VarCurr)| -v2290(VarCurr,bitIndex4)|$F.
% 94.31/93.69  0 [] -v2995(VarCurr)|v2290(VarCurr,bitIndex4)| -$F.
% 94.31/93.69  0 [] -v2995(VarCurr)| -v2290(VarCurr,bitIndex3)|$F.
% 94.31/93.69  0 [] -v2995(VarCurr)|v2290(VarCurr,bitIndex3)| -$F.
% 94.31/93.69  0 [] -v2995(VarCurr)| -v2290(VarCurr,bitIndex2)|$F.
% 94.31/93.69  0 [] -v2995(VarCurr)|v2290(VarCurr,bitIndex2)| -$F.
% 94.31/93.69  0 [] -v2995(VarCurr)| -v2290(VarCurr,bitIndex1)|$F.
% 94.31/93.69  0 [] -v2995(VarCurr)|v2290(VarCurr,bitIndex1)| -$F.
% 94.31/93.69  0 [] -v2995(VarCurr)| -v2290(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] -v2995(VarCurr)|v2290(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v2995(VarCurr)|v2290(VarCurr,bitIndex4)|$F|v2290(VarCurr,bitIndex3)|v2290(VarCurr,bitIndex2)|v2290(VarCurr,bitIndex1)|v2290(VarCurr,bitIndex0).
% 94.31/93.69  0 [] v2995(VarCurr)| -v2290(VarCurr,bitIndex4)| -$F| -v2290(VarCurr,bitIndex3)| -v2290(VarCurr,bitIndex2)| -v2290(VarCurr,bitIndex1)| -v2290(VarCurr,bitIndex0).
% 94.31/93.69  0 [] -v2992(VarCurr)| -v2261(VarCurr,bitIndex1)|$F.
% 94.31/93.69  0 [] -v2992(VarCurr)|v2261(VarCurr,bitIndex1)| -$F.
% 94.31/93.69  0 [] -v2992(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.69  0 [] -v2992(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.69  0 [] v2992(VarCurr)|v2261(VarCurr,bitIndex1)|$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.69  0 [] v2992(VarCurr)|v2261(VarCurr,bitIndex1)|$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.69  0 [] v2992(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.69  0 [] v2992(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.69  0 [] v2265(VarCurr)| -range_1_0(B)| -v2989(VarCurr,B)|v2990(VarCurr,B).
% 94.31/93.69  0 [] v2265(VarCurr)| -range_1_0(B)|v2989(VarCurr,B)| -v2990(VarCurr,B).
% 94.31/93.69  0 [] -v2265(VarCurr)| -range_1_0(B)| -v2989(VarCurr,B)|$F.
% 94.31/93.69  0 [] -v2265(VarCurr)| -range_1_0(B)|v2989(VarCurr,B)| -$F.
% 94.31/93.69  0 [] v741(VarCurr)| -range_1_0(B)| -v2990(VarCurr,B)|v2991(VarCurr,B).
% 94.31/93.69  0 [] v741(VarCurr)| -range_1_0(B)|v2990(VarCurr,B)| -v2991(VarCurr,B).
% 94.31/93.69  0 [] -v741(VarCurr)| -range_1_0(B)| -v2990(VarCurr,B)|b01(B).
% 94.31/93.69  0 [] -v741(VarCurr)| -range_1_0(B)|v2990(VarCurr,B)| -b01(B).
% 94.31/93.69  0 [] v2275(VarCurr)| -range_1_0(B)| -v2991(VarCurr,B)|$F.
% 94.31/93.69  0 [] v2275(VarCurr)| -range_1_0(B)|v2991(VarCurr,B)| -$F.
% 94.31/93.69  0 [] -v2275(VarCurr)| -range_1_0(B)| -v2991(VarCurr,B)|b10(B).
% 94.31/93.69  0 [] -v2275(VarCurr)| -range_1_0(B)|v2991(VarCurr,B)| -b10(B).
% 94.31/93.69  0 [] -v2988(VarCurr)| -v2261(VarCurr,bitIndex1)|$F.
% 94.31/93.69  0 [] -v2988(VarCurr)|v2261(VarCurr,bitIndex1)| -$F.
% 94.31/93.69  0 [] -v2988(VarCurr)| -v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.69  0 [] -v2988(VarCurr)|v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.69  0 [] v2988(VarCurr)|v2261(VarCurr,bitIndex1)|$F|v2261(VarCurr,bitIndex0).
% 94.31/93.69  0 [] v2988(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F| -v2261(VarCurr,bitIndex0).
% 94.31/93.69  0 [] -nextState(VarCurr,VarNext)|v2940(VarNext)| -range_4_0(B)| -v2927(VarNext,B)|v2927(VarCurr,B).
% 94.31/93.69  0 [] -nextState(VarCurr,VarNext)|v2940(VarNext)| -range_4_0(B)|v2927(VarNext,B)| -v2927(VarCurr,B).
% 94.31/93.69  0 [] -v2940(VarNext)| -range_4_0(B)| -v2927(VarNext,B)|v2957(VarNext,B).
% 94.31/93.69  0 [] -v2940(VarNext)| -range_4_0(B)|v2927(VarNext,B)| -v2957(VarNext,B).
% 94.31/93.69  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v2957(VarNext,B)|v2955(VarCurr,B).
% 94.31/93.69  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v2957(VarNext,B)| -v2955(VarCurr,B).
% 94.31/93.69  0 [] v2952(VarCurr)| -range_4_0(B)| -v2955(VarCurr,B)|v2958(VarCurr,B).
% 94.31/93.69  0 [] v2952(VarCurr)| -range_4_0(B)|v2955(VarCurr,B)| -v2958(VarCurr,B).
% 94.31/93.69  0 [] -v2952(VarCurr)| -range_4_0(B)| -v2955(VarCurr,B)|$F.
% 94.31/93.69  0 [] -v2952(VarCurr)| -range_4_0(B)|v2955(VarCurr,B)| -$F.
% 94.31/93.69  0 [] v2929(VarCurr)| -range_4_0(B)| -v2958(VarCurr,B)|v2959(VarCurr,B).
% 94.31/93.69  0 [] v2929(VarCurr)| -range_4_0(B)|v2958(VarCurr,B)| -v2959(VarCurr,B).
% 94.31/93.69  0 [] -v2929(VarCurr)| -range_4_0(B)| -v2958(VarCurr,B)|$F.
% 94.31/93.69  0 [] -v2929(VarCurr)| -range_4_0(B)|v2958(VarCurr,B)| -$F.
% 94.31/93.69  0 [] -v2959(VarCurr,bitIndex0)|v2981(VarCurr).
% 94.31/93.69  0 [] v2959(VarCurr,bitIndex0)| -v2981(VarCurr).
% 94.31/93.69  0 [] -v2959(VarCurr,bitIndex1)|v2979(VarCurr).
% 94.31/93.69  0 [] v2959(VarCurr,bitIndex1)| -v2979(VarCurr).
% 94.31/93.69  0 [] -v2959(VarCurr,bitIndex2)|v2974(VarCurr).
% 94.31/93.69  0 [] v2959(VarCurr,bitIndex2)| -v2974(VarCurr).
% 94.31/93.69  0 [] -v2959(VarCurr,bitIndex3)|v2969(VarCurr).
% 94.31/93.69  0 [] v2959(VarCurr,bitIndex3)| -v2969(VarCurr).
% 94.31/93.69  0 [] -v2959(VarCurr,bitIndex4)|v2961(VarCurr).
% 94.31/93.69  0 [] v2959(VarCurr,bitIndex4)| -v2961(VarCurr).
% 94.31/93.69  0 [] -v2979(VarCurr)|v2980(VarCurr).
% 94.31/93.69  0 [] -v2979(VarCurr)|v2983(VarCurr).
% 94.31/93.69  0 [] v2979(VarCurr)| -v2980(VarCurr)| -v2983(VarCurr).
% 94.31/93.70  0 [] -v2983(VarCurr)|v2927(VarCurr,bitIndex0)|v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] v2983(VarCurr)| -v2927(VarCurr,bitIndex0).
% 94.31/93.70  0 [] v2983(VarCurr)| -v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] -v2980(VarCurr)|v2981(VarCurr)|v2982(VarCurr).
% 94.31/93.70  0 [] v2980(VarCurr)| -v2981(VarCurr).
% 94.31/93.70  0 [] v2980(VarCurr)| -v2982(VarCurr).
% 94.31/93.70  0 [] v2982(VarCurr)|v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] -v2982(VarCurr)| -v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] v2981(VarCurr)|v2927(VarCurr,bitIndex0).
% 94.31/93.70  0 [] -v2981(VarCurr)| -v2927(VarCurr,bitIndex0).
% 94.31/93.70  0 [] -v2974(VarCurr)|v2975(VarCurr).
% 94.31/93.70  0 [] -v2974(VarCurr)|v2978(VarCurr).
% 94.31/93.70  0 [] v2974(VarCurr)| -v2975(VarCurr)| -v2978(VarCurr).
% 94.31/93.70  0 [] -v2978(VarCurr)|v2966(VarCurr)|v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] v2978(VarCurr)| -v2966(VarCurr).
% 94.31/93.70  0 [] v2978(VarCurr)| -v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] -v2975(VarCurr)|v2976(VarCurr)|v2977(VarCurr).
% 94.31/93.70  0 [] v2975(VarCurr)| -v2976(VarCurr).
% 94.31/93.70  0 [] v2975(VarCurr)| -v2977(VarCurr).
% 94.31/93.70  0 [] v2977(VarCurr)|v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] -v2977(VarCurr)| -v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] v2976(VarCurr)|v2966(VarCurr).
% 94.31/93.70  0 [] -v2976(VarCurr)| -v2966(VarCurr).
% 94.31/93.70  0 [] -v2969(VarCurr)|v2970(VarCurr).
% 94.31/93.70  0 [] -v2969(VarCurr)|v2973(VarCurr).
% 94.31/93.70  0 [] v2969(VarCurr)| -v2970(VarCurr)| -v2973(VarCurr).
% 94.31/93.70  0 [] -v2973(VarCurr)|v2965(VarCurr)|v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] v2973(VarCurr)| -v2965(VarCurr).
% 94.31/93.70  0 [] v2973(VarCurr)| -v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] -v2970(VarCurr)|v2971(VarCurr)|v2972(VarCurr).
% 94.31/93.70  0 [] v2970(VarCurr)| -v2971(VarCurr).
% 94.31/93.70  0 [] v2970(VarCurr)| -v2972(VarCurr).
% 94.31/93.70  0 [] v2972(VarCurr)|v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] -v2972(VarCurr)| -v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] v2971(VarCurr)|v2965(VarCurr).
% 94.31/93.70  0 [] -v2971(VarCurr)| -v2965(VarCurr).
% 94.31/93.70  0 [] -v2961(VarCurr)|v2962(VarCurr).
% 94.31/93.70  0 [] -v2961(VarCurr)|v2968(VarCurr).
% 94.31/93.70  0 [] v2961(VarCurr)| -v2962(VarCurr)| -v2968(VarCurr).
% 94.31/93.70  0 [] -v2968(VarCurr)|v2964(VarCurr)|v2927(VarCurr,bitIndex4).
% 94.31/93.70  0 [] v2968(VarCurr)| -v2964(VarCurr).
% 94.31/93.70  0 [] v2968(VarCurr)| -v2927(VarCurr,bitIndex4).
% 94.31/93.70  0 [] -v2962(VarCurr)|v2963(VarCurr)|v2967(VarCurr).
% 94.31/93.70  0 [] v2962(VarCurr)| -v2963(VarCurr).
% 94.31/93.70  0 [] v2962(VarCurr)| -v2967(VarCurr).
% 94.31/93.70  0 [] v2967(VarCurr)|v2927(VarCurr,bitIndex4).
% 94.31/93.70  0 [] -v2967(VarCurr)| -v2927(VarCurr,bitIndex4).
% 94.31/93.70  0 [] v2963(VarCurr)|v2964(VarCurr).
% 94.31/93.70  0 [] -v2963(VarCurr)| -v2964(VarCurr).
% 94.31/93.70  0 [] -v2964(VarCurr)|v2965(VarCurr).
% 94.31/93.70  0 [] -v2964(VarCurr)|v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] v2964(VarCurr)| -v2965(VarCurr)| -v2927(VarCurr,bitIndex3).
% 94.31/93.70  0 [] -v2965(VarCurr)|v2966(VarCurr).
% 94.31/93.70  0 [] -v2965(VarCurr)|v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] v2965(VarCurr)| -v2966(VarCurr)| -v2927(VarCurr,bitIndex2).
% 94.31/93.70  0 [] -v2966(VarCurr)|v2927(VarCurr,bitIndex0).
% 94.31/93.70  0 [] -v2966(VarCurr)|v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] v2966(VarCurr)| -v2927(VarCurr,bitIndex0)| -v2927(VarCurr,bitIndex1).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2940(VarNext)|v2941(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2940(VarNext)|v2948(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2940(VarNext)| -v2941(VarNext)| -v2948(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2948(VarNext)|v2946(VarCurr).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2948(VarNext)| -v2946(VarCurr).
% 94.31/93.70  0 [] -v2946(VarCurr)|v2949(VarCurr)|v2952(VarCurr).
% 94.31/93.70  0 [] v2946(VarCurr)| -v2949(VarCurr).
% 94.31/93.70  0 [] v2946(VarCurr)| -v2952(VarCurr).
% 94.31/93.70  0 [] -v2952(VarCurr)|v2953(VarCurr)|v2954(VarCurr).
% 94.31/93.70  0 [] v2952(VarCurr)| -v2953(VarCurr).
% 94.31/93.70  0 [] v2952(VarCurr)| -v2954(VarCurr).
% 94.31/93.70  0 [] v2954(VarCurr)|v1908(VarCurr).
% 94.31/93.70  0 [] -v2954(VarCurr)| -v1908(VarCurr).
% 94.31/93.70  0 [] v2953(VarCurr)|v12(VarCurr).
% 94.31/93.70  0 [] -v2953(VarCurr)| -v12(VarCurr).
% 94.31/93.70  0 [] -v2949(VarCurr)|v2950(VarCurr)|v2929(VarCurr).
% 94.31/93.70  0 [] v2949(VarCurr)| -v2950(VarCurr).
% 94.31/93.70  0 [] v2949(VarCurr)| -v2929(VarCurr).
% 94.31/93.70  0 [] -v2950(VarCurr)|v2265(VarCurr).
% 94.31/93.70  0 [] -v2950(VarCurr)|v2951(VarCurr).
% 94.31/93.70  0 [] v2950(VarCurr)| -v2265(VarCurr)| -v2951(VarCurr).
% 94.31/93.70  0 [] -v2951(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.70  0 [] -v2951(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.70  0 [] -v2951(VarCurr)| -v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] -v2951(VarCurr)|v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] v2951(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] v2951(VarCurr)|v2261(VarCurr,bitIndex1)|$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] v2951(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] v2951(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2941(VarNext)|v2942(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2941(VarNext)|v288(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2941(VarNext)| -v2942(VarNext)| -v288(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2942(VarNext)|v1891(VarNext).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -v2942(VarNext)| -v1891(VarNext).
% 94.31/93.70  0 [] -range_4_0(B)| -v2927(constB0,B)|$F.
% 94.31/93.70  0 [] -range_4_0(B)|v2927(constB0,B)| -$F.
% 94.31/93.70  0 [] -v2929(VarCurr)|v2931(VarCurr)|v2933(VarCurr).
% 94.31/93.70  0 [] v2929(VarCurr)| -v2931(VarCurr).
% 94.31/93.70  0 [] v2929(VarCurr)| -v2933(VarCurr).
% 94.31/93.70  0 [] -v2933(VarCurr)|v2934(VarCurr).
% 94.31/93.70  0 [] -v2933(VarCurr)|v2937(VarCurr).
% 94.31/93.70  0 [] v2933(VarCurr)| -v2934(VarCurr)| -v2937(VarCurr).
% 94.31/93.70  0 [] -v2937(VarCurr)| -v2290(VarCurr,bitIndex4)|$F.
% 94.31/93.70  0 [] -v2937(VarCurr)|v2290(VarCurr,bitIndex4)| -$F.
% 94.31/93.70  0 [] -v2937(VarCurr)| -v2290(VarCurr,bitIndex3)|$F.
% 94.31/93.70  0 [] -v2937(VarCurr)|v2290(VarCurr,bitIndex3)| -$F.
% 94.31/93.70  0 [] -v2937(VarCurr)| -v2290(VarCurr,bitIndex2)|$F.
% 94.31/93.70  0 [] -v2937(VarCurr)|v2290(VarCurr,bitIndex2)| -$F.
% 94.31/93.70  0 [] -v2937(VarCurr)| -v2290(VarCurr,bitIndex1)|$F.
% 94.31/93.70  0 [] -v2937(VarCurr)|v2290(VarCurr,bitIndex1)| -$F.
% 94.31/93.70  0 [] -v2937(VarCurr)| -v2290(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] -v2937(VarCurr)|v2290(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] v2937(VarCurr)|v2290(VarCurr,bitIndex4)|$F|v2290(VarCurr,bitIndex3)|v2290(VarCurr,bitIndex2)|v2290(VarCurr,bitIndex1)|v2290(VarCurr,bitIndex0).
% 94.31/93.70  0 [] v2937(VarCurr)| -v2290(VarCurr,bitIndex4)| -$F| -v2290(VarCurr,bitIndex3)| -v2290(VarCurr,bitIndex2)| -v2290(VarCurr,bitIndex1)| -v2290(VarCurr,bitIndex0).
% 94.31/93.70  0 [] -v2934(VarCurr)|v2935(VarCurr)|v2936(VarCurr).
% 94.31/93.70  0 [] v2934(VarCurr)| -v2935(VarCurr).
% 94.31/93.70  0 [] v2934(VarCurr)| -v2936(VarCurr).
% 94.31/93.70  0 [] -v2936(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.70  0 [] -v2936(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.70  0 [] -v2936(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.70  0 [] -v2936(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.70  0 [] v2936(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0).
% 94.31/93.70  0 [] v2936(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0).
% 94.31/93.70  0 [] -v2935(VarCurr)| -v2261(VarCurr,bitIndex1)|$T.
% 94.31/93.70  0 [] -v2935(VarCurr)|v2261(VarCurr,bitIndex1)| -$T.
% 94.31/93.70  0 [] -v2935(VarCurr)| -v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] -v2935(VarCurr)|v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] v2935(VarCurr)|v2261(VarCurr,bitIndex1)|$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] v2935(VarCurr)|v2261(VarCurr,bitIndex1)|$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] v2935(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T|v2261(VarCurr,bitIndex0)|$F.
% 94.31/93.70  0 [] v2935(VarCurr)| -v2261(VarCurr,bitIndex1)| -$T| -v2261(VarCurr,bitIndex0)| -$F.
% 94.31/93.70  0 [] -v2931(VarCurr)|v2932(VarCurr).
% 94.31/93.70  0 [] -v2931(VarCurr)|v320(VarCurr).
% 94.31/93.70  0 [] v2931(VarCurr)| -v2932(VarCurr)| -v320(VarCurr).
% 94.31/93.70  0 [] -v2932(VarCurr)| -v2261(VarCurr,bitIndex1)|$F.
% 94.31/93.70  0 [] -v2932(VarCurr)|v2261(VarCurr,bitIndex1)| -$F.
% 94.31/93.70  0 [] -v2932(VarCurr)| -v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.70  0 [] -v2932(VarCurr)|v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.70  0 [] v2932(VarCurr)|v2261(VarCurr,bitIndex1)|$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.70  0 [] v2932(VarCurr)|v2261(VarCurr,bitIndex1)|$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.70  0 [] v2932(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F|v2261(VarCurr,bitIndex0)|$T.
% 94.31/93.70  0 [] v2932(VarCurr)| -v2261(VarCurr,bitIndex1)| -$F| -v2261(VarCurr,bitIndex0)| -$T.
% 94.31/93.70  0 [] -range_1_0(B)| -v2261(constB0,B)|$F.
% 94.31/93.70  0 [] -range_1_0(B)|v2261(constB0,B)| -$F.
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2855(VarNext)| -range_4_0(B)| -v2290(VarNext,B)|v2290(VarCurr,B).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)|v2855(VarNext)| -range_4_0(B)|v2290(VarNext,B)| -v2290(VarCurr,B).
% 94.31/93.70  0 [] -v2855(VarNext)| -range_4_0(B)| -v2290(VarNext,B)|v2874(VarNext,B).
% 94.31/93.70  0 [] -v2855(VarNext)| -range_4_0(B)|v2290(VarNext,B)| -v2874(VarNext,B).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v2874(VarNext,B)|v2872(VarCurr,B).
% 94.31/93.70  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v2874(VarNext,B)| -v2872(VarCurr,B).
% 94.31/93.70  0 [] v2869(VarCurr)| -range_4_0(B)| -v2872(VarCurr,B)|v2875(VarCurr,B).
% 94.31/93.70  0 [] v2869(VarCurr)| -range_4_0(B)|v2872(VarCurr,B)| -v2875(VarCurr,B).
% 94.31/93.70  0 [] -v2869(VarCurr)| -range_4_0(B)| -v2872(VarCurr,B)|$F.
% 94.31/93.71  0 [] -v2869(VarCurr)| -range_4_0(B)|v2872(VarCurr,B)| -$F.
% 94.31/93.71  0 [] v2867(VarCurr)| -range_4_0(B)| -v2875(VarCurr,B)|v2901(VarCurr,B).
% 94.31/93.71  0 [] v2867(VarCurr)| -range_4_0(B)|v2875(VarCurr,B)| -v2901(VarCurr,B).
% 94.31/93.71  0 [] -v2867(VarCurr)| -range_4_0(B)| -v2875(VarCurr,B)|v2876(VarCurr,B).
% 94.31/93.71  0 [] -v2867(VarCurr)| -range_4_0(B)|v2875(VarCurr,B)| -v2876(VarCurr,B).
% 94.31/93.71  0 [] -v2901(VarCurr,bitIndex0)|v2898(VarCurr).
% 94.31/93.71  0 [] v2901(VarCurr,bitIndex0)| -v2898(VarCurr).
% 94.31/93.71  0 [] -v2901(VarCurr,bitIndex1)|v2921(VarCurr).
% 94.31/93.71  0 [] v2901(VarCurr,bitIndex1)| -v2921(VarCurr).
% 94.31/93.71  0 [] -v2901(VarCurr,bitIndex2)|v2917(VarCurr).
% 94.31/93.71  0 [] v2901(VarCurr,bitIndex2)| -v2917(VarCurr).
% 94.31/93.71  0 [] -v2901(VarCurr,bitIndex3)|v2913(VarCurr).
% 94.31/93.71  0 [] v2901(VarCurr,bitIndex3)| -v2913(VarCurr).
% 94.31/93.71  0 [] -v2901(VarCurr,bitIndex4)|v2903(VarCurr).
% 94.31/93.71  0 [] v2901(VarCurr,bitIndex4)| -v2903(VarCurr).
% 94.31/93.71  0 [] -v2921(VarCurr)|v2922(VarCurr).
% 94.31/93.71  0 [] -v2921(VarCurr)|v2923(VarCurr).
% 94.31/93.71  0 [] v2921(VarCurr)| -v2922(VarCurr)| -v2923(VarCurr).
% 94.31/93.71  0 [] -v2923(VarCurr)|v2290(VarCurr,bitIndex0)|v2899(VarCurr).
% 94.31/93.71  0 [] v2923(VarCurr)| -v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] v2923(VarCurr)| -v2899(VarCurr).
% 94.31/93.71  0 [] -v2922(VarCurr)|v2898(VarCurr)|v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] v2922(VarCurr)| -v2898(VarCurr).
% 94.31/93.71  0 [] v2922(VarCurr)| -v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] -v2917(VarCurr)|v2918(VarCurr).
% 94.31/93.71  0 [] -v2917(VarCurr)|v2920(VarCurr).
% 94.31/93.71  0 [] v2917(VarCurr)| -v2918(VarCurr)| -v2920(VarCurr).
% 94.31/93.71  0 [] -v2920(VarCurr)|v2894(VarCurr)|v2910(VarCurr).
% 94.31/93.71  0 [] v2920(VarCurr)| -v2894(VarCurr).
% 94.31/93.71  0 [] v2920(VarCurr)| -v2910(VarCurr).
% 94.31/93.71  0 [] -v2918(VarCurr)|v2290(VarCurr,bitIndex2)|v2919(VarCurr).
% 94.31/93.71  0 [] v2918(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] v2918(VarCurr)| -v2919(VarCurr).
% 94.31/93.71  0 [] v2919(VarCurr)|v2910(VarCurr).
% 94.31/93.71  0 [] -v2919(VarCurr)| -v2910(VarCurr).
% 94.31/93.71  0 [] -v2913(VarCurr)|v2914(VarCurr).
% 94.31/93.71  0 [] -v2913(VarCurr)|v2916(VarCurr).
% 94.31/93.71  0 [] v2913(VarCurr)| -v2914(VarCurr)| -v2916(VarCurr).
% 94.31/93.71  0 [] -v2916(VarCurr)|v2889(VarCurr)|v2908(VarCurr).
% 94.31/93.71  0 [] v2916(VarCurr)| -v2889(VarCurr).
% 94.31/93.71  0 [] v2916(VarCurr)| -v2908(VarCurr).
% 94.31/93.71  0 [] -v2914(VarCurr)|v2290(VarCurr,bitIndex3)|v2915(VarCurr).
% 94.31/93.71  0 [] v2914(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] v2914(VarCurr)| -v2915(VarCurr).
% 94.31/93.71  0 [] v2915(VarCurr)|v2908(VarCurr).
% 94.31/93.71  0 [] -v2915(VarCurr)| -v2908(VarCurr).
% 94.31/93.71  0 [] -v2903(VarCurr)|v2904(VarCurr).
% 94.31/93.71  0 [] -v2903(VarCurr)|v2912(VarCurr).
% 94.31/93.71  0 [] v2903(VarCurr)| -v2904(VarCurr)| -v2912(VarCurr).
% 94.31/93.71  0 [] -v2912(VarCurr)|v2884(VarCurr)|v2906(VarCurr).
% 94.31/93.71  0 [] v2912(VarCurr)| -v2884(VarCurr).
% 94.31/93.71  0 [] v2912(VarCurr)| -v2906(VarCurr).
% 94.31/93.71  0 [] -v2904(VarCurr)|v2290(VarCurr,bitIndex4)|v2905(VarCurr).
% 94.31/93.71  0 [] v2904(VarCurr)| -v2290(VarCurr,bitIndex4).
% 94.31/93.71  0 [] v2904(VarCurr)| -v2905(VarCurr).
% 94.31/93.71  0 [] v2905(VarCurr)|v2906(VarCurr).
% 94.31/93.71  0 [] -v2905(VarCurr)| -v2906(VarCurr).
% 94.31/93.71  0 [] -v2906(VarCurr)|v2290(VarCurr,bitIndex3)|v2907(VarCurr).
% 94.31/93.71  0 [] v2906(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] v2906(VarCurr)| -v2907(VarCurr).
% 94.31/93.71  0 [] -v2907(VarCurr)|v2889(VarCurr).
% 94.31/93.71  0 [] -v2907(VarCurr)|v2908(VarCurr).
% 94.31/93.71  0 [] v2907(VarCurr)| -v2889(VarCurr)| -v2908(VarCurr).
% 94.31/93.71  0 [] -v2908(VarCurr)|v2290(VarCurr,bitIndex2)|v2909(VarCurr).
% 94.31/93.71  0 [] v2908(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] v2908(VarCurr)| -v2909(VarCurr).
% 94.31/93.71  0 [] -v2909(VarCurr)|v2894(VarCurr).
% 94.31/93.71  0 [] -v2909(VarCurr)|v2910(VarCurr).
% 94.31/93.71  0 [] v2909(VarCurr)| -v2894(VarCurr)| -v2910(VarCurr).
% 94.31/93.71  0 [] -v2910(VarCurr)|v2290(VarCurr,bitIndex1)|v2911(VarCurr).
% 94.31/93.71  0 [] v2910(VarCurr)| -v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] v2910(VarCurr)| -v2911(VarCurr).
% 94.31/93.71  0 [] -v2911(VarCurr)|v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] -v2911(VarCurr)|v2899(VarCurr).
% 94.31/93.71  0 [] v2911(VarCurr)| -v2290(VarCurr,bitIndex0)| -v2899(VarCurr).
% 94.31/93.71  0 [] -v2876(VarCurr,bitIndex0)|v2898(VarCurr).
% 94.31/93.71  0 [] v2876(VarCurr,bitIndex0)| -v2898(VarCurr).
% 94.31/93.71  0 [] -v2876(VarCurr,bitIndex1)|v2896(VarCurr).
% 94.31/93.71  0 [] v2876(VarCurr,bitIndex1)| -v2896(VarCurr).
% 94.31/93.71  0 [] -v2876(VarCurr,bitIndex2)|v2891(VarCurr).
% 94.31/93.71  0 [] v2876(VarCurr,bitIndex2)| -v2891(VarCurr).
% 94.31/93.71  0 [] -v2876(VarCurr,bitIndex3)|v2886(VarCurr).
% 94.31/93.71  0 [] v2876(VarCurr,bitIndex3)| -v2886(VarCurr).
% 94.31/93.71  0 [] -v2876(VarCurr,bitIndex4)|v2878(VarCurr).
% 94.31/93.71  0 [] v2876(VarCurr,bitIndex4)| -v2878(VarCurr).
% 94.31/93.71  0 [] -v2896(VarCurr)|v2897(VarCurr).
% 94.31/93.71  0 [] -v2896(VarCurr)|v2900(VarCurr).
% 94.31/93.71  0 [] v2896(VarCurr)| -v2897(VarCurr)| -v2900(VarCurr).
% 94.31/93.71  0 [] -v2900(VarCurr)|v2290(VarCurr,bitIndex0)|v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] v2900(VarCurr)| -v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] v2900(VarCurr)| -v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] -v2897(VarCurr)|v2898(VarCurr)|v2899(VarCurr).
% 94.31/93.71  0 [] v2897(VarCurr)| -v2898(VarCurr).
% 94.31/93.71  0 [] v2897(VarCurr)| -v2899(VarCurr).
% 94.31/93.71  0 [] v2899(VarCurr)|v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] -v2899(VarCurr)| -v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] v2898(VarCurr)|v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] -v2898(VarCurr)| -v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] -v2891(VarCurr)|v2892(VarCurr).
% 94.31/93.71  0 [] -v2891(VarCurr)|v2895(VarCurr).
% 94.31/93.71  0 [] v2891(VarCurr)| -v2892(VarCurr)| -v2895(VarCurr).
% 94.31/93.71  0 [] -v2895(VarCurr)|v2883(VarCurr)|v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] v2895(VarCurr)| -v2883(VarCurr).
% 94.31/93.71  0 [] v2895(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] -v2892(VarCurr)|v2893(VarCurr)|v2894(VarCurr).
% 94.31/93.71  0 [] v2892(VarCurr)| -v2893(VarCurr).
% 94.31/93.71  0 [] v2892(VarCurr)| -v2894(VarCurr).
% 94.31/93.71  0 [] v2894(VarCurr)|v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] -v2894(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] v2893(VarCurr)|v2883(VarCurr).
% 94.31/93.71  0 [] -v2893(VarCurr)| -v2883(VarCurr).
% 94.31/93.71  0 [] -v2886(VarCurr)|v2887(VarCurr).
% 94.31/93.71  0 [] -v2886(VarCurr)|v2890(VarCurr).
% 94.31/93.71  0 [] v2886(VarCurr)| -v2887(VarCurr)| -v2890(VarCurr).
% 94.31/93.71  0 [] -v2890(VarCurr)|v2882(VarCurr)|v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] v2890(VarCurr)| -v2882(VarCurr).
% 94.31/93.71  0 [] v2890(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] -v2887(VarCurr)|v2888(VarCurr)|v2889(VarCurr).
% 94.31/93.71  0 [] v2887(VarCurr)| -v2888(VarCurr).
% 94.31/93.71  0 [] v2887(VarCurr)| -v2889(VarCurr).
% 94.31/93.71  0 [] v2889(VarCurr)|v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] -v2889(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] v2888(VarCurr)|v2882(VarCurr).
% 94.31/93.71  0 [] -v2888(VarCurr)| -v2882(VarCurr).
% 94.31/93.71  0 [] -v2878(VarCurr)|v2879(VarCurr).
% 94.31/93.71  0 [] -v2878(VarCurr)|v2885(VarCurr).
% 94.31/93.71  0 [] v2878(VarCurr)| -v2879(VarCurr)| -v2885(VarCurr).
% 94.31/93.71  0 [] -v2885(VarCurr)|v2881(VarCurr)|v2290(VarCurr,bitIndex4).
% 94.31/93.71  0 [] v2885(VarCurr)| -v2881(VarCurr).
% 94.31/93.71  0 [] v2885(VarCurr)| -v2290(VarCurr,bitIndex4).
% 94.31/93.71  0 [] -v2879(VarCurr)|v2880(VarCurr)|v2884(VarCurr).
% 94.31/93.71  0 [] v2879(VarCurr)| -v2880(VarCurr).
% 94.31/93.71  0 [] v2879(VarCurr)| -v2884(VarCurr).
% 94.31/93.71  0 [] v2884(VarCurr)|v2290(VarCurr,bitIndex4).
% 94.31/93.71  0 [] -v2884(VarCurr)| -v2290(VarCurr,bitIndex4).
% 94.31/93.71  0 [] v2880(VarCurr)|v2881(VarCurr).
% 94.31/93.71  0 [] -v2880(VarCurr)| -v2881(VarCurr).
% 94.31/93.71  0 [] -v2881(VarCurr)|v2882(VarCurr).
% 94.31/93.71  0 [] -v2881(VarCurr)|v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] v2881(VarCurr)| -v2882(VarCurr)| -v2290(VarCurr,bitIndex3).
% 94.31/93.71  0 [] -v2882(VarCurr)|v2883(VarCurr).
% 94.31/93.71  0 [] -v2882(VarCurr)|v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] v2882(VarCurr)| -v2883(VarCurr)| -v2290(VarCurr,bitIndex2).
% 94.31/93.71  0 [] -v2883(VarCurr)|v2290(VarCurr,bitIndex0).
% 94.31/93.71  0 [] -v2883(VarCurr)|v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] v2883(VarCurr)| -v2290(VarCurr,bitIndex0)| -v2290(VarCurr,bitIndex1).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)| -v2855(VarNext)|v2856(VarNext).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)| -v2855(VarNext)|v2863(VarNext).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)|v2855(VarNext)| -v2856(VarNext)| -v2863(VarNext).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)| -v2863(VarNext)|v2861(VarCurr).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)|v2863(VarNext)| -v2861(VarCurr).
% 94.31/93.71  0 [] -v2861(VarCurr)|v2864(VarCurr)|v2869(VarCurr).
% 94.31/93.71  0 [] v2861(VarCurr)| -v2864(VarCurr).
% 94.31/93.71  0 [] v2861(VarCurr)| -v2869(VarCurr).
% 94.31/93.71  0 [] -v2869(VarCurr)|v2870(VarCurr)|v2871(VarCurr).
% 94.31/93.71  0 [] v2869(VarCurr)| -v2870(VarCurr).
% 94.31/93.71  0 [] v2869(VarCurr)| -v2871(VarCurr).
% 94.31/93.71  0 [] v2871(VarCurr)|v1908(VarCurr).
% 94.31/93.71  0 [] -v2871(VarCurr)| -v1908(VarCurr).
% 94.31/93.71  0 [] v2870(VarCurr)|v12(VarCurr).
% 94.31/93.71  0 [] -v2870(VarCurr)| -v12(VarCurr).
% 94.31/93.71  0 [] -v2864(VarCurr)|v2865(VarCurr)|v2867(VarCurr).
% 94.31/93.71  0 [] v2864(VarCurr)| -v2865(VarCurr).
% 94.31/93.71  0 [] v2864(VarCurr)| -v2867(VarCurr).
% 94.31/93.71  0 [] -v2867(VarCurr)|v2275(VarCurr).
% 94.31/93.71  0 [] -v2867(VarCurr)|v2868(VarCurr).
% 94.31/93.71  0 [] v2867(VarCurr)| -v2275(VarCurr)| -v2868(VarCurr).
% 94.31/93.71  0 [] v2868(VarCurr)|v2292(VarCurr).
% 94.31/93.71  0 [] -v2868(VarCurr)| -v2292(VarCurr).
% 94.31/93.71  0 [] -v2865(VarCurr)|v2866(VarCurr).
% 94.31/93.71  0 [] -v2865(VarCurr)|v2292(VarCurr).
% 94.31/93.71  0 [] v2865(VarCurr)| -v2866(VarCurr)| -v2292(VarCurr).
% 94.31/93.71  0 [] v2866(VarCurr)|v2275(VarCurr).
% 94.31/93.71  0 [] -v2866(VarCurr)| -v2275(VarCurr).
% 94.31/93.71  0 [] -nextState(VarCurr,VarNext)| -v2856(VarNext)|v2857(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2856(VarNext)|v288(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2856(VarNext)| -v2857(VarNext)| -v288(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2857(VarNext)|v1891(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2857(VarNext)| -v1891(VarNext).
% 94.31/93.72  0 [] -range_4_0(B)| -v2290(constB0,B)|$F.
% 94.31/93.72  0 [] -range_4_0(B)|v2290(constB0,B)| -$F.
% 94.31/93.72  0 [] -v2292(VarCurr)|v2294(VarCurr).
% 94.31/93.72  0 [] -v2292(VarCurr)|v2852(VarCurr).
% 94.31/93.72  0 [] v2292(VarCurr)| -v2294(VarCurr)| -v2852(VarCurr).
% 94.31/93.72  0 [] -v2852(VarCurr)| -v2775(VarCurr)|$T.
% 94.31/93.72  0 [] -v2852(VarCurr)|v2775(VarCurr)| -$T.
% 94.31/93.72  0 [] v2852(VarCurr)|v2775(VarCurr)|$T.
% 94.31/93.72  0 [] v2852(VarCurr)| -v2775(VarCurr)| -$T.
% 94.31/93.72  0 [] -v2775(VarCurr)|v2777(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2775(VarCurr)| -v2777(VarCurr,bitIndex3).
% 94.31/93.72  0 [] -v2777(VarCurr,bitIndex3)|v2779(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2777(VarCurr,bitIndex3)| -v2779(VarCurr,bitIndex3).
% 94.31/93.72  0 [] -v2779(VarCurr,bitIndex3)|v2781(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2779(VarCurr,bitIndex3)| -v2781(VarCurr,bitIndex3).
% 94.31/93.72  0 [] -v2781(VarNext,bitIndex3)|v2836(VarNext,bitIndex3).
% 94.31/93.72  0 [] v2781(VarNext,bitIndex3)| -v2836(VarNext,bitIndex3).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2837(VarNext)| -range_3_0(B)| -v2836(VarNext,B)|v2781(VarCurr,B).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2837(VarNext)| -range_3_0(B)|v2836(VarNext,B)| -v2781(VarCurr,B).
% 94.31/93.72  0 [] -v2837(VarNext)| -range_3_0(B)| -v2836(VarNext,B)|v2847(VarNext,B).
% 94.31/93.72  0 [] -v2837(VarNext)| -range_3_0(B)|v2836(VarNext,B)| -v2847(VarNext,B).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v2847(VarNext,B)|v2845(VarCurr,B).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v2847(VarNext,B)| -v2845(VarCurr,B).
% 94.31/93.72  0 [] v2848(VarCurr)| -range_3_0(B)| -v2845(VarCurr,B)|v2785(VarCurr,B).
% 94.31/93.72  0 [] v2848(VarCurr)| -range_3_0(B)|v2845(VarCurr,B)| -v2785(VarCurr,B).
% 94.31/93.72  0 [] -v2848(VarCurr)| -range_3_0(B)| -v2845(VarCurr,B)|$F.
% 94.31/93.72  0 [] -v2848(VarCurr)| -range_3_0(B)|v2845(VarCurr,B)| -$F.
% 94.31/93.72  0 [] v2848(VarCurr)|v2783(VarCurr).
% 94.31/93.72  0 [] -v2848(VarCurr)| -v2783(VarCurr).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2837(VarNext)|v2838(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2837(VarNext)| -v2838(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2838(VarNext)|v2839(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2838(VarNext)|v2834(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2838(VarNext)| -v2839(VarNext)| -v2834(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2839(VarNext)|v2841(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2839(VarNext)| -v2841(VarNext).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)| -v2841(VarNext)|v2834(VarCurr).
% 94.31/93.72  0 [] -nextState(VarCurr,VarNext)|v2841(VarNext)| -v2834(VarCurr).
% 94.31/93.72  0 [] -v2834(VarCurr)|v195(VarCurr).
% 94.31/93.72  0 [] v2834(VarCurr)| -v195(VarCurr).
% 94.31/93.72  0 [] -v2785(VarCurr,bitIndex3)|v2832(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2785(VarCurr,bitIndex3)| -v2832(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2787(VarCurr)| -range_3_0(B)| -v2832(VarCurr,B)|v2793(VarCurr,B).
% 94.31/93.72  0 [] v2787(VarCurr)| -range_3_0(B)|v2832(VarCurr,B)| -v2793(VarCurr,B).
% 94.31/93.72  0 [] -v2787(VarCurr)| -range_3_0(B)| -v2832(VarCurr,B)|b0011(B).
% 94.31/93.72  0 [] -v2787(VarCurr)| -range_3_0(B)|v2832(VarCurr,B)| -b0011(B).
% 94.31/93.72  0 [] -v2793(VarCurr,bitIndex3)|v2804(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2793(VarCurr,bitIndex3)| -v2804(VarCurr,bitIndex3).
% 94.31/93.72  0 [] v2805(VarCurr)| -range_3_0(B)| -v2804(VarCurr,B)|$F.
% 94.31/93.72  0 [] v2805(VarCurr)| -range_3_0(B)|v2804(VarCurr,B)| -$F.
% 94.31/93.72  0 [] -v2805(VarCurr)| -range_3_0(B)| -v2804(VarCurr,B)|v2828(VarCurr,B).
% 94.31/93.72  0 [] -v2805(VarCurr)| -range_3_0(B)|v2804(VarCurr,B)| -v2828(VarCurr,B).
% 94.31/93.72  0 [] v2810(VarCurr)|v2812(VarCurr)|v2815(VarCurr)|v2822(VarCurr)|v2823(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|v2831(VarCurr,B).
% 94.31/93.72  0 [] v2810(VarCurr)|v2812(VarCurr)|v2815(VarCurr)|v2822(VarCurr)|v2823(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -v2831(VarCurr,B).
% 94.31/93.72  0 [] -v2823(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|v2830(VarCurr,B).
% 94.31/93.72  0 [] -v2823(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -v2830(VarCurr,B).
% 94.31/93.72  0 [] -v2822(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|b0100(B).
% 94.31/93.72  0 [] -v2822(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -b0100(B).
% 94.31/93.72  0 [] -v2815(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|$F.
% 94.31/93.72  0 [] -v2815(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -$F.
% 94.31/93.72  0 [] -v2812(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|v2829(VarCurr,B).
% 94.31/93.72  0 [] -v2812(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -v2829(VarCurr,B).
% 94.31/93.72  0 [] -v2810(VarCurr)| -range_3_0(B)| -v2828(VarCurr,B)|b0010(B).
% 94.31/93.72  0 [] -v2810(VarCurr)| -range_3_0(B)|v2828(VarCurr,B)| -b0010(B).
% 94.31/93.72  0 [] v2803(VarCurr)| -range_3_0(B)| -v2831(VarCurr,B)|b1001(B).
% 94.31/93.72  0 [] v2803(VarCurr)| -range_3_0(B)|v2831(VarCurr,B)| -b1001(B).
% 94.31/93.72  0 [] -v2803(VarCurr)| -range_3_0(B)| -v2831(VarCurr,B)|b1000(B).
% 94.31/93.72  0 [] -v2803(VarCurr)| -range_3_0(B)|v2831(VarCurr,B)| -b1000(B).
% 94.31/93.72  0 [] v2825(VarCurr)| -range_3_0(B)| -v2830(VarCurr,B)|b1010(B).
% 94.31/93.72  0 [] v2825(VarCurr)| -range_3_0(B)|v2830(VarCurr,B)| -b1010(B).
% 94.31/93.72  0 [] -v2825(VarCurr)| -range_3_0(B)| -v2830(VarCurr,B)|b1011(B).
% 94.31/93.72  0 [] -v2825(VarCurr)| -range_3_0(B)|v2830(VarCurr,B)| -b1011(B).
% 94.31/93.72  0 [] v2803(VarCurr)| -range_3_0(B)| -v2829(VarCurr,B)|$F.
% 94.31/93.72  0 [] v2803(VarCurr)| -range_3_0(B)|v2829(VarCurr,B)| -$F.
% 94.31/93.72  0 [] -v2803(VarCurr)| -range_3_0(B)| -v2829(VarCurr,B)|b0001(B).
% 94.31/93.72  0 [] -v2803(VarCurr)| -range_3_0(B)|v2829(VarCurr,B)| -b0001(B).
% 94.31/93.72  0 [] -v2805(VarCurr)|v2806(VarCurr)|v2827(VarCurr).
% 94.31/93.72  0 [] v2805(VarCurr)| -v2806(VarCurr).
% 94.31/93.72  0 [] v2805(VarCurr)| -v2827(VarCurr).
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex1)|$T.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex1)| -$T.
% 94.31/93.72  0 [] -v2827(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.31/93.72  0 [] -v2827(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.31/93.72  0 [] v2827(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.31/93.72  0 [] v2827(VarCurr)|v158(VarCurr,bitIndex6)|$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.31/93.72  0 [] v2827(VarCurr)| -v158(VarCurr,bitIndex6)| -$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.31/93.72  0 [] v2827(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.31/93.72  0 [] b1001010(bitIndex6).
% 94.31/93.72  0 [] -b1001010(bitIndex5).
% 94.31/93.72  0 [] -b1001010(bitIndex4).
% 94.31/93.72  0 [] b1001010(bitIndex3).
% 94.31/93.72  0 [] -b1001010(bitIndex2).
% 94.31/93.72  0 [] b1001010(bitIndex1).
% 94.31/93.72  0 [] -b1001010(bitIndex0).
% 94.31/93.72  0 [] -v2806(VarCurr)|v2807(VarCurr)|v2823(VarCurr).
% 94.31/93.72  0 [] v2806(VarCurr)| -v2807(VarCurr).
% 94.31/93.72  0 [] v2806(VarCurr)| -v2823(VarCurr).
% 94.31/93.72  0 [] -v2823(VarCurr)|v2824(VarCurr).
% 94.31/93.72  0 [] -v2823(VarCurr)|v2750(VarCurr).
% 94.31/93.72  0 [] v2823(VarCurr)| -v2824(VarCurr)| -v2750(VarCurr).
% 94.31/93.72  0 [] -v2824(VarCurr)|v2825(VarCurr)|v2826(VarCurr).
% 94.31/93.72  0 [] v2824(VarCurr)| -v2825(VarCurr).
% 94.31/93.72  0 [] v2824(VarCurr)| -v2826(VarCurr).
% 94.31/93.72  0 [] -v2826(VarCurr)| -v145(VarCurr,bitIndex2)|$T.
% 94.31/93.72  0 [] -v2826(VarCurr)|v145(VarCurr,bitIndex2)| -$T.
% 94.31/93.72  0 [] -v2826(VarCurr)| -v145(VarCurr,bitIndex1)|$T.
% 94.31/93.72  0 [] -v2826(VarCurr)|v145(VarCurr,bitIndex1)| -$T.
% 94.31/93.72  0 [] -v2826(VarCurr)| -v145(VarCurr,bitIndex0)|$T.
% 94.31/93.72  0 [] -v2826(VarCurr)|v145(VarCurr,bitIndex0)| -$T.
% 94.31/93.72  0 [] v2826(VarCurr)|v145(VarCurr,bitIndex2)|$T|v145(VarCurr,bitIndex1)|v145(VarCurr,bitIndex0).
% 94.31/93.72  0 [] v2826(VarCurr)| -v145(VarCurr,bitIndex2)| -$T| -v145(VarCurr,bitIndex1)| -v145(VarCurr,bitIndex0).
% 94.31/93.72  0 [] -v2825(VarCurr)| -v145(VarCurr,bitIndex2)|$F.
% 94.31/93.72  0 [] -v2825(VarCurr)|v145(VarCurr,bitIndex2)| -$F.
% 94.31/93.72  0 [] -v2825(VarCurr)| -v145(VarCurr,bitIndex1)|$F.
% 94.31/93.72  0 [] -v2825(VarCurr)|v145(VarCurr,bitIndex1)| -$F.
% 94.31/93.72  0 [] -v2825(VarCurr)| -v145(VarCurr,bitIndex0)|$T.
% 94.31/93.72  0 [] -v2825(VarCurr)|v145(VarCurr,bitIndex0)| -$T.
% 94.31/93.72  0 [] v2825(VarCurr)|v145(VarCurr,bitIndex2)|$F|v145(VarCurr,bitIndex1)|v145(VarCurr,bitIndex0)|$T.
% 94.37/93.73  0 [] v2825(VarCurr)|v145(VarCurr,bitIndex2)|$F|v145(VarCurr,bitIndex1)| -v145(VarCurr,bitIndex0)| -$T.
% 94.37/93.73  0 [] v2825(VarCurr)| -v145(VarCurr,bitIndex2)| -$F| -v145(VarCurr,bitIndex1)|v145(VarCurr,bitIndex0)|$T.
% 94.37/93.73  0 [] v2825(VarCurr)| -v145(VarCurr,bitIndex2)| -$F| -v145(VarCurr,bitIndex1)| -v145(VarCurr,bitIndex0)| -$T.
% 94.37/93.73  0 [] -v2807(VarCurr)|v2808(VarCurr)|v2822(VarCurr).
% 94.37/93.73  0 [] v2807(VarCurr)| -v2808(VarCurr).
% 94.37/93.73  0 [] v2807(VarCurr)| -v2822(VarCurr).
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex4)|$T.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex4)| -$T.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex1)|$T.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex1)| -$T.
% 94.37/93.73  0 [] -v2822(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2822(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2822(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|$F|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2822(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -$F|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2822(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|$F| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2822(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -$F| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1111010(bitIndex6).
% 94.37/93.73  0 [] b1111010(bitIndex5).
% 94.37/93.73  0 [] b1111010(bitIndex4).
% 94.37/93.73  0 [] b1111010(bitIndex3).
% 94.37/93.73  0 [] -b1111010(bitIndex2).
% 94.37/93.73  0 [] b1111010(bitIndex1).
% 94.37/93.73  0 [] -b1111010(bitIndex0).
% 94.37/93.73  0 [] -v2808(VarCurr)|v2809(VarCurr)|v2815(VarCurr).
% 94.37/93.73  0 [] v2808(VarCurr)| -v2809(VarCurr).
% 94.37/93.73  0 [] v2808(VarCurr)| -v2815(VarCurr).
% 94.37/93.73  0 [] -v2815(VarCurr)|v2816(VarCurr)|v2821(VarCurr).
% 94.37/93.73  0 [] v2815(VarCurr)| -v2816(VarCurr).
% 94.37/93.73  0 [] v2815(VarCurr)| -v2821(VarCurr).
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex4)|$T.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex4)| -$T.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2821(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2821(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2821(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$F|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2821(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$F| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2821(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$F|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2821(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$F| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1110000(bitIndex6).
% 94.37/93.73  0 [] b1110000(bitIndex5).
% 94.37/93.73  0 [] b1110000(bitIndex4).
% 94.37/93.73  0 [] -b1110000(bitIndex3).
% 94.37/93.73  0 [] -b1110000(bitIndex2).
% 94.37/93.73  0 [] -b1110000(bitIndex1).
% 94.37/93.73  0 [] -b1110000(bitIndex0).
% 94.37/93.73  0 [] -v2816(VarCurr)|v2817(VarCurr)|v2820(VarCurr).
% 94.37/93.73  0 [] v2816(VarCurr)| -v2817(VarCurr).
% 94.37/93.73  0 [] v2816(VarCurr)| -v2820(VarCurr).
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex4)|$T.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex4)| -$T.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2820(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2820(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2820(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2820(VarCurr)|v158(VarCurr,bitIndex6)|$T| -v158(VarCurr,bitIndex5)| -$F|v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2820(VarCurr)| -v158(VarCurr,bitIndex6)| -$T|v158(VarCurr,bitIndex5)|$F| -v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2820(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1010000(bitIndex6).
% 94.37/93.73  0 [] -b1010000(bitIndex5).
% 94.37/93.73  0 [] b1010000(bitIndex4).
% 94.37/93.73  0 [] -b1010000(bitIndex3).
% 94.37/93.73  0 [] -b1010000(bitIndex2).
% 94.37/93.73  0 [] -b1010000(bitIndex1).
% 94.37/93.73  0 [] -b1010000(bitIndex0).
% 94.37/93.73  0 [] -v2817(VarCurr)|v2818(VarCurr)|v2819(VarCurr).
% 94.37/93.73  0 [] v2817(VarCurr)| -v2818(VarCurr).
% 94.37/93.73  0 [] v2817(VarCurr)| -v2819(VarCurr).
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex4)|$T.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex4)| -$T.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2819(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2819(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2819(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|$F|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2819(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -$F| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2819(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|$F|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2819(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -$F| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1111000(bitIndex6).
% 94.37/93.73  0 [] b1111000(bitIndex5).
% 94.37/93.73  0 [] b1111000(bitIndex4).
% 94.37/93.73  0 [] b1111000(bitIndex3).
% 94.37/93.73  0 [] -b1111000(bitIndex2).
% 94.37/93.73  0 [] -b1111000(bitIndex1).
% 94.37/93.73  0 [] -b1111000(bitIndex0).
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex4)|$T.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex4)| -$T.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2818(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2818(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2818(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2818(VarCurr)|v158(VarCurr,bitIndex6)|$T| -v158(VarCurr,bitIndex5)| -$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2818(VarCurr)| -v158(VarCurr,bitIndex6)| -$T|v158(VarCurr,bitIndex5)|$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2818(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1011000(bitIndex6).
% 94.37/93.73  0 [] -b1011000(bitIndex5).
% 94.37/93.73  0 [] b1011000(bitIndex4).
% 94.37/93.73  0 [] b1011000(bitIndex3).
% 94.37/93.73  0 [] -b1011000(bitIndex2).
% 94.37/93.73  0 [] -b1011000(bitIndex1).
% 94.37/93.73  0 [] -b1011000(bitIndex0).
% 94.37/93.73  0 [] -v2809(VarCurr)|v2810(VarCurr)|v2812(VarCurr).
% 94.37/93.73  0 [] v2809(VarCurr)| -v2810(VarCurr).
% 94.37/93.73  0 [] v2809(VarCurr)| -v2812(VarCurr).
% 94.37/93.73  0 [] -v2812(VarCurr)|v2813(VarCurr)|v2814(VarCurr).
% 94.37/93.73  0 [] v2812(VarCurr)| -v2813(VarCurr).
% 94.37/93.73  0 [] v2812(VarCurr)| -v2814(VarCurr).
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2814(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2814(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2814(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|$F|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2814(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -$F| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2814(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|$F|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] v2814(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -$F| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.73  0 [] b1100000(bitIndex6).
% 94.37/93.73  0 [] b1100000(bitIndex5).
% 94.37/93.73  0 [] -b1100000(bitIndex4).
% 94.37/93.73  0 [] -b1100000(bitIndex3).
% 94.37/93.73  0 [] -b1100000(bitIndex2).
% 94.37/93.73  0 [] -b1100000(bitIndex1).
% 94.37/93.73  0 [] -b1100000(bitIndex0).
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex6)|$T.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex6)| -$T.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.37/93.73  0 [] -v2813(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.73  0 [] -v2813(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.73  0 [] v2813(VarCurr)|v158(VarCurr,bitIndex6)|$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2813(VarCurr)|v158(VarCurr,bitIndex6)|$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2813(VarCurr)| -v158(VarCurr,bitIndex6)| -$T|v158(VarCurr,bitIndex5)|$F|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2813(VarCurr)| -v158(VarCurr,bitIndex6)| -$T| -v158(VarCurr,bitIndex5)| -$F| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] b1000000(bitIndex6).
% 94.37/93.74  0 [] -b1000000(bitIndex5).
% 94.37/93.74  0 [] -b1000000(bitIndex4).
% 94.37/93.74  0 [] -b1000000(bitIndex3).
% 94.37/93.74  0 [] -b1000000(bitIndex2).
% 94.37/93.74  0 [] -b1000000(bitIndex1).
% 94.37/93.74  0 [] -b1000000(bitIndex0).
% 94.37/93.74  0 [] -v2810(VarCurr)|v2811(VarCurr).
% 94.37/93.74  0 [] -v2810(VarCurr)|v170(VarCurr).
% 94.37/93.74  0 [] v2810(VarCurr)| -v2811(VarCurr)| -v170(VarCurr).
% 94.37/93.74  0 [] v2811(VarCurr)|v145(VarCurr,bitIndex0).
% 94.37/93.74  0 [] -v2811(VarCurr)| -v145(VarCurr,bitIndex0).
% 94.37/93.74  0 [] -v2803(VarCurr)|v2424(VarCurr).
% 94.37/93.74  0 [] v2803(VarCurr)| -v2424(VarCurr).
% 94.37/93.74  0 [] -range_2_1(B)| -v145(VarCurr,B)|v147(VarCurr,B).
% 94.37/93.74  0 [] -range_2_1(B)|v145(VarCurr,B)| -v147(VarCurr,B).
% 94.37/93.74  0 [] -range_2_1(B)|bitIndex1=B|bitIndex2=B.
% 94.37/93.74  0 [] range_2_1(B)|bitIndex1!=B.
% 94.37/93.74  0 [] range_2_1(B)|bitIndex2!=B.
% 94.37/93.74  0 [] -v147(VarCurr,bitIndex2)|v149(VarCurr,bitIndex14).
% 94.37/93.74  0 [] v147(VarCurr,bitIndex2)| -v149(VarCurr,bitIndex14).
% 94.37/93.74  0 [] -v147(VarCurr,bitIndex1)|v149(VarCurr,bitIndex13).
% 94.37/93.74  0 [] v147(VarCurr,bitIndex1)| -v149(VarCurr,bitIndex13).
% 94.37/93.74  0 [] -range_14_13(B)| -v149(VarCurr,B)|v151(VarCurr,B).
% 94.37/93.74  0 [] -range_14_13(B)|v149(VarCurr,B)| -v151(VarCurr,B).
% 94.37/93.74  0 [] -range_14_13(B)| -v151(VarCurr,B)|v156(VarCurr,B).
% 94.37/93.74  0 [] -range_14_13(B)|v151(VarCurr,B)| -v156(VarCurr,B).
% 94.37/93.74  0 [] -range_14_13(B)|bitIndex13=B|bitIndex14=B.
% 94.37/93.74  0 [] range_14_13(B)|bitIndex13!=B.
% 94.37/93.74  0 [] range_14_13(B)|bitIndex14!=B.
% 94.37/93.74  0 [] -v2787(VarCurr)|v2789(VarCurr).
% 94.37/93.74  0 [] v2787(VarCurr)| -v2789(VarCurr).
% 94.37/93.74  0 [] -v2789(VarCurr)|v2791(VarCurr).
% 94.37/93.74  0 [] v2789(VarCurr)| -v2791(VarCurr).
% 94.37/93.74  0 [] -v2791(VarCurr)|v2412(VarCurr).
% 94.37/93.74  0 [] v2791(VarCurr)| -v2412(VarCurr).
% 94.37/93.74  0 [] -v2783(VarCurr)|v125(VarCurr).
% 94.37/93.74  0 [] v2783(VarCurr)| -v125(VarCurr).
% 94.37/93.74  0 [] -v2294(VarCurr)|v2296(VarCurr).
% 94.37/93.74  0 [] v2294(VarCurr)| -v2296(VarCurr).
% 94.37/93.74  0 [] -v2296(VarCurr)|v2298(VarCurr).
% 94.37/93.74  0 [] v2296(VarCurr)| -v2298(VarCurr).
% 94.37/93.74  0 [] -v2298(VarCurr)|v2300(VarCurr).
% 94.37/93.74  0 [] v2298(VarCurr)| -v2300(VarCurr).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2756(VarNext)| -v2300(VarNext)|v2300(VarCurr).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2756(VarNext)|v2300(VarNext)| -v2300(VarCurr).
% 94.37/93.74  0 [] -v2756(VarNext)| -v2300(VarNext)|v2764(VarNext).
% 94.37/93.74  0 [] -v2756(VarNext)|v2300(VarNext)| -v2764(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)| -v2764(VarNext)|v2762(VarCurr).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2764(VarNext)| -v2762(VarCurr).
% 94.37/93.74  0 [] v2765(VarCurr)| -v2762(VarCurr)|v2766(VarCurr).
% 94.37/93.74  0 [] v2765(VarCurr)|v2762(VarCurr)| -v2766(VarCurr).
% 94.37/93.74  0 [] -v2765(VarCurr)| -v2762(VarCurr)|$F.
% 94.37/93.74  0 [] -v2765(VarCurr)|v2762(VarCurr)| -$F.
% 94.37/93.74  0 [] -v2766(VarCurr)|v2767(VarCurr)|v2741(VarCurr).
% 94.37/93.74  0 [] v2766(VarCurr)| -v2767(VarCurr).
% 94.37/93.74  0 [] v2766(VarCurr)| -v2741(VarCurr).
% 94.37/93.74  0 [] -v2767(VarCurr)|v2768(VarCurr)|v2302(VarCurr,bitIndex12).
% 94.37/93.74  0 [] v2767(VarCurr)| -v2768(VarCurr).
% 94.37/93.74  0 [] v2767(VarCurr)| -v2302(VarCurr,bitIndex12).
% 94.37/93.74  0 [] -v2768(VarCurr)|v2769(VarCurr)|v2418(VarCurr).
% 94.37/93.74  0 [] v2768(VarCurr)| -v2769(VarCurr).
% 94.37/93.74  0 [] v2768(VarCurr)| -v2418(VarCurr).
% 94.37/93.74  0 [] -v2769(VarCurr)|v2770(VarCurr)|v2412(VarCurr).
% 94.37/93.74  0 [] v2769(VarCurr)| -v2770(VarCurr).
% 94.37/93.74  0 [] v2769(VarCurr)| -v2412(VarCurr).
% 94.37/93.74  0 [] -v2770(VarCurr)|v2771(VarCurr)|v2302(VarCurr,bitIndex9).
% 94.37/93.74  0 [] v2770(VarCurr)| -v2771(VarCurr).
% 94.37/93.74  0 [] v2770(VarCurr)| -v2302(VarCurr,bitIndex9).
% 94.37/93.74  0 [] -v2771(VarCurr)|v2302(VarCurr,bitIndex3)|v2302(VarCurr,bitIndex6).
% 94.37/93.74  0 [] v2771(VarCurr)| -v2302(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2771(VarCurr)| -v2302(VarCurr,bitIndex6).
% 94.37/93.74  0 [] v2765(VarCurr)|v123(VarCurr).
% 94.37/93.74  0 [] -v2765(VarCurr)| -v123(VarCurr).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)| -v2756(VarNext)|v2757(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2756(VarNext)| -v2757(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)| -v2757(VarNext)|v2758(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)| -v2757(VarNext)|v193(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2757(VarNext)| -v2758(VarNext)| -v193(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)|v2758(VarNext)|v204(VarNext).
% 94.37/93.74  0 [] -nextState(VarCurr,VarNext)| -v2758(VarNext)| -v204(VarNext).
% 94.37/93.74  0 [] -v2741(VarCurr)|v2752(VarCurr).
% 94.37/93.74  0 [] -v2741(VarCurr)|v2753(VarCurr).
% 94.37/93.74  0 [] v2741(VarCurr)| -v2752(VarCurr)| -v2753(VarCurr).
% 94.37/93.74  0 [] v2753(VarCurr)|v2322(VarCurr).
% 94.37/93.74  0 [] -v2753(VarCurr)| -v2322(VarCurr).
% 94.37/93.74  0 [] -v2752(VarCurr)|v2304(VarCurr).
% 94.37/93.74  0 [] -v2752(VarCurr)|v2743(VarCurr).
% 94.37/93.74  0 [] v2752(VarCurr)| -v2304(VarCurr)| -v2743(VarCurr).
% 94.37/93.74  0 [] -v2743(VarCurr)|v2745(VarCurr).
% 94.37/93.74  0 [] v2743(VarCurr)| -v2745(VarCurr).
% 94.37/93.74  0 [] -v2745(VarCurr)|v2747(VarCurr).
% 94.37/93.74  0 [] v2745(VarCurr)| -v2747(VarCurr).
% 94.37/93.74  0 [] v2750(VarCurr)| -v2747(VarCurr)|$F.
% 94.37/93.74  0 [] v2750(VarCurr)|v2747(VarCurr)| -$F.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v2747(VarCurr)|$T.
% 94.37/93.74  0 [] -v2750(VarCurr)|v2747(VarCurr)| -$T.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex1)|$T.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex1)| -$T.
% 94.37/93.74  0 [] -v2750(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.37/93.74  0 [] -v2750(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.37/93.74  0 [] v2750(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$T|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2750(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$T|v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2750(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$T| -v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] v2750(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$T| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.37/93.74  0 [] -b0001010(bitIndex6).
% 94.37/93.74  0 [] -b0001010(bitIndex5).
% 94.37/93.74  0 [] -b0001010(bitIndex4).
% 94.37/93.74  0 [] b0001010(bitIndex3).
% 94.37/93.74  0 [] -b0001010(bitIndex2).
% 94.37/93.74  0 [] b0001010(bitIndex1).
% 94.37/93.74  0 [] -b0001010(bitIndex0).
% 94.37/93.74  0 [] v2737(VarCurr)| -v2302(VarCurr,bitIndex12)|$F.
% 94.37/93.74  0 [] v2737(VarCurr)|v2302(VarCurr,bitIndex12)| -$F.
% 94.37/93.74  0 [] -v2737(VarCurr)| -v2302(VarCurr,bitIndex12)|$T.
% 94.37/93.74  0 [] -v2737(VarCurr)|v2302(VarCurr,bitIndex12)| -$T.
% 94.37/93.74  0 [] -v2737(VarCurr)|v2738(VarCurr).
% 94.37/93.74  0 [] -v2737(VarCurr)|v2739(VarCurr).
% 94.37/93.74  0 [] v2737(VarCurr)| -v2738(VarCurr)| -v2739(VarCurr).
% 94.37/93.74  0 [] -v2739(VarCurr)| -$T|v2397(VarCurr,bitIndex11).
% 94.37/93.74  0 [] -v2739(VarCurr)|$T| -v2397(VarCurr,bitIndex11).
% 94.37/93.74  0 [] v2739(VarCurr)|$T|v2397(VarCurr,bitIndex11).
% 94.37/93.74  0 [] v2739(VarCurr)| -$T| -v2397(VarCurr,bitIndex11).
% 94.37/93.74  0 [] -v2738(VarCurr)|v2365(VarCurr).
% 94.37/93.74  0 [] -v2738(VarCurr)|v2304(VarCurr).
% 94.37/93.74  0 [] v2738(VarCurr)| -v2365(VarCurr)| -v2304(VarCurr).
% 94.37/93.74  0 [] -v2418(VarCurr)|v2420(VarCurr)|v2732(VarCurr).
% 94.37/93.74  0 [] v2418(VarCurr)| -v2420(VarCurr).
% 94.37/93.74  0 [] v2418(VarCurr)| -v2732(VarCurr).
% 94.37/93.74  0 [] -v2732(VarCurr)|v2734(VarCurr).
% 94.37/93.74  0 [] -v2732(VarCurr)|v2426(VarCurr).
% 94.37/93.74  0 [] v2732(VarCurr)| -v2734(VarCurr)| -v2426(VarCurr).
% 94.37/93.74  0 [] v2734(VarCurr)|v2422(VarCurr).
% 94.37/93.74  0 [] -v2734(VarCurr)| -v2422(VarCurr).
% 94.37/93.74  0 [] -v2420(VarCurr)|v2730(VarCurr).
% 94.37/93.74  0 [] -v2420(VarCurr)|v2441(VarCurr).
% 94.37/93.74  0 [] v2420(VarCurr)| -v2730(VarCurr)| -v2441(VarCurr).
% 94.37/93.74  0 [] -v2730(VarCurr)|v2422(VarCurr).
% 94.37/93.74  0 [] -v2730(VarCurr)|v2426(VarCurr).
% 94.37/93.74  0 [] v2730(VarCurr)| -v2422(VarCurr)| -v2426(VarCurr).
% 94.37/93.74  0 [] -v2441(VarCurr)|v2443(VarCurr).
% 94.37/93.74  0 [] v2441(VarCurr)| -v2443(VarCurr).
% 94.37/93.74  0 [] -v2443(VarCurr)|v2445(VarCurr).
% 94.37/93.74  0 [] v2443(VarCurr)| -v2445(VarCurr).
% 94.37/93.74  0 [] -v2445(VarCurr)|v2722(VarCurr).
% 94.37/93.74  0 [] -v2445(VarCurr)|v2447(VarCurr,bitIndex8).
% 94.37/93.74  0 [] v2445(VarCurr)| -v2722(VarCurr)| -v2447(VarCurr,bitIndex8).
% 94.37/93.74  0 [] -v2722(VarCurr)|v2723(VarCurr).
% 94.37/93.74  0 [] -v2722(VarCurr)|v2447(VarCurr,bitIndex7).
% 94.37/93.74  0 [] v2722(VarCurr)| -v2723(VarCurr)| -v2447(VarCurr,bitIndex7).
% 94.37/93.74  0 [] -v2723(VarCurr)|v2724(VarCurr).
% 94.37/93.74  0 [] -v2723(VarCurr)|v2447(VarCurr,bitIndex6).
% 94.37/93.74  0 [] v2723(VarCurr)| -v2724(VarCurr)| -v2447(VarCurr,bitIndex6).
% 94.37/93.74  0 [] -v2724(VarCurr)|v2725(VarCurr).
% 94.37/93.74  0 [] -v2724(VarCurr)|v2447(VarCurr,bitIndex5).
% 94.37/93.74  0 [] v2724(VarCurr)| -v2725(VarCurr)| -v2447(VarCurr,bitIndex5).
% 94.37/93.74  0 [] -v2725(VarCurr)|v2726(VarCurr).
% 94.37/93.74  0 [] -v2725(VarCurr)|v2447(VarCurr,bitIndex4).
% 94.37/93.74  0 [] v2725(VarCurr)| -v2726(VarCurr)| -v2447(VarCurr,bitIndex4).
% 94.37/93.74  0 [] -v2726(VarCurr)|v2727(VarCurr).
% 94.37/93.74  0 [] -v2726(VarCurr)|v2447(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2726(VarCurr)| -v2727(VarCurr)| -v2447(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2727(VarCurr)|v2728(VarCurr).
% 94.37/93.74  0 [] -v2727(VarCurr)|v2447(VarCurr,bitIndex2).
% 94.37/93.74  0 [] v2727(VarCurr)| -v2728(VarCurr)| -v2447(VarCurr,bitIndex2).
% 94.37/93.74  0 [] -v2728(VarCurr)|v2447(VarCurr,bitIndex0).
% 94.37/93.74  0 [] -v2728(VarCurr)|v2447(VarCurr,bitIndex1).
% 94.37/93.74  0 [] v2728(VarCurr)| -v2447(VarCurr,bitIndex0)| -v2447(VarCurr,bitIndex1).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex8)|v2655(VarCurr,bitIndex16).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex8)| -v2655(VarCurr,bitIndex16).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex7)|v2655(VarCurr,bitIndex15).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex7)| -v2655(VarCurr,bitIndex15).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex6)|v2655(VarCurr,bitIndex14).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex6)| -v2655(VarCurr,bitIndex14).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex5)|v2655(VarCurr,bitIndex13).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex5)| -v2655(VarCurr,bitIndex13).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex4)|v2655(VarCurr,bitIndex12).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex4)| -v2655(VarCurr,bitIndex12).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex3)|v2655(VarCurr,bitIndex11).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex3)| -v2655(VarCurr,bitIndex11).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex2)|v2655(VarCurr,bitIndex10).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex2)| -v2655(VarCurr,bitIndex10).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex1)|v2655(VarCurr,bitIndex9).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex1)| -v2655(VarCurr,bitIndex9).
% 94.37/93.74  0 [] -v2447(VarCurr,bitIndex0)|v2655(VarCurr,bitIndex8).
% 94.37/93.74  0 [] v2447(VarCurr,bitIndex0)| -v2655(VarCurr,bitIndex8).
% 94.37/93.74  0 [] -range_16_0(B)| -v2655(VarCurr,B)|v2657(VarCurr,B)|v2717(VarCurr,B).
% 94.37/93.74  0 [] -range_16_0(B)|v2655(VarCurr,B)| -v2657(VarCurr,B).
% 94.37/93.74  0 [] -range_16_0(B)|v2655(VarCurr,B)| -v2717(VarCurr,B).
% 94.37/93.74  0 [] -range_16_0(B)| -v2717(VarCurr,B)|v2718(VarCurr,B).
% 94.37/93.74  0 [] -range_16_0(B)| -v2717(VarCurr,B)|v2719(VarCurr,B).
% 94.37/93.74  0 [] -range_16_0(B)|v2717(VarCurr,B)| -v2718(VarCurr,B)| -v2719(VarCurr,B).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex0)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex0)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex1)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex1)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex2)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex2)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex3)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex3)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex4)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex4)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex5)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex5)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex6)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex6)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex7)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex7)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex8)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex8)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex9)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex9)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex10)|v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] v2719(VarCurr,bitIndex10)| -v2667(VarCurr,bitIndex3).
% 94.37/93.74  0 [] -v2719(VarCurr,bitIndex11)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex11)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2719(VarCurr,bitIndex12)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex12)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2719(VarCurr,bitIndex13)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex13)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2719(VarCurr,bitIndex14)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex14)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2719(VarCurr,bitIndex15)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex15)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2719(VarCurr,bitIndex16)|v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2719(VarCurr,bitIndex16)| -v2667(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -range_7_0(B)| -v2718(VarCurr,B)|$F.
% 94.37/93.75  0 [] -range_7_0(B)|v2718(VarCurr,B)| -$F.
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex16)|v2658(VarCurr,bitIndex8).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex16)| -v2658(VarCurr,bitIndex8).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex15)|v2658(VarCurr,bitIndex7).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex15)| -v2658(VarCurr,bitIndex7).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex14)|v2658(VarCurr,bitIndex6).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex14)| -v2658(VarCurr,bitIndex6).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex13)|v2658(VarCurr,bitIndex5).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex13)| -v2658(VarCurr,bitIndex5).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex12)|v2658(VarCurr,bitIndex4).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex12)| -v2658(VarCurr,bitIndex4).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex11)|v2658(VarCurr,bitIndex3).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex11)| -v2658(VarCurr,bitIndex3).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex10)|v2658(VarCurr,bitIndex2).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex10)| -v2658(VarCurr,bitIndex2).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex9)|v2658(VarCurr,bitIndex1).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex9)| -v2658(VarCurr,bitIndex1).
% 94.37/93.75  0 [] -v2718(VarCurr,bitIndex8)|v2658(VarCurr,bitIndex0).
% 94.37/93.75  0 [] v2718(VarCurr,bitIndex8)| -v2658(VarCurr,bitIndex0).
% 94.37/93.75  0 [] -range_16_0(B)| -v2657(VarCurr,B)|v2658(VarCurr,B).
% 94.37/93.75  0 [] -range_16_0(B)| -v2657(VarCurr,B)|v2715(VarCurr,B).
% 94.37/93.75  0 [] -range_16_0(B)|v2657(VarCurr,B)| -v2658(VarCurr,B)| -v2715(VarCurr,B).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex0)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex0)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex1)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex1)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex2)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex2)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex3)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex3)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex4)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex4)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex5)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex5)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex6)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex6)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex7)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex7)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex8)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex8)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex9)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex9)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex10)|v2716(VarCurr).
% 94.37/93.75  0 [] v2715(VarCurr,bitIndex10)| -v2716(VarCurr).
% 94.37/93.75  0 [] -v2715(VarCurr,bitIndex11)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex11)| -v2716(VarCurr).
% 94.39/93.75  0 [] -v2715(VarCurr,bitIndex12)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex12)| -v2716(VarCurr).
% 94.39/93.75  0 [] -v2715(VarCurr,bitIndex13)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex13)| -v2716(VarCurr).
% 94.39/93.75  0 [] -v2715(VarCurr,bitIndex14)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex14)| -v2716(VarCurr).
% 94.39/93.75  0 [] -v2715(VarCurr,bitIndex15)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex15)| -v2716(VarCurr).
% 94.39/93.75  0 [] -v2715(VarCurr,bitIndex16)|v2716(VarCurr).
% 94.39/93.75  0 [] v2715(VarCurr,bitIndex16)| -v2716(VarCurr).
% 94.39/93.75  0 [] v2716(VarCurr)|v2667(VarCurr,bitIndex3).
% 94.39/93.75  0 [] -v2716(VarCurr)| -v2667(VarCurr,bitIndex3).
% 94.39/93.75  0 [] -range_16_0(B)| -v2658(VarCurr,B)|v2659(VarCurr,B)|v2712(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2658(VarCurr,B)| -v2659(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2658(VarCurr,B)| -v2712(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)| -v2712(VarCurr,B)|v2713(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)| -v2712(VarCurr,B)|v2714(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2712(VarCurr,B)| -v2713(VarCurr,B)| -v2714(VarCurr,B).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex0)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex0)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex1)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex1)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex2)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex2)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex3)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex3)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex4)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex4)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex5)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex5)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex6)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex6)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex7)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex7)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex8)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex8)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex9)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex9)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex10)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex10)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex11)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex11)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex12)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex12)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex13)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex13)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex14)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex14)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex15)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex15)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2714(VarCurr,bitIndex16)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2714(VarCurr,bitIndex16)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -range_3_0(B)| -v2713(VarCurr,B)|$F.
% 94.39/93.75  0 [] -range_3_0(B)|v2713(VarCurr,B)| -$F.
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex16)|v2660(VarCurr,bitIndex12).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex16)| -v2660(VarCurr,bitIndex12).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex15)|v2660(VarCurr,bitIndex11).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex15)| -v2660(VarCurr,bitIndex11).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex14)|v2660(VarCurr,bitIndex10).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex14)| -v2660(VarCurr,bitIndex10).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex13)|v2660(VarCurr,bitIndex9).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex13)| -v2660(VarCurr,bitIndex9).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex12)|v2660(VarCurr,bitIndex8).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex12)| -v2660(VarCurr,bitIndex8).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex11)|v2660(VarCurr,bitIndex7).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex11)| -v2660(VarCurr,bitIndex7).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex10)|v2660(VarCurr,bitIndex6).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex10)| -v2660(VarCurr,bitIndex6).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex9)|v2660(VarCurr,bitIndex5).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex9)| -v2660(VarCurr,bitIndex5).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex8)|v2660(VarCurr,bitIndex4).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex8)| -v2660(VarCurr,bitIndex4).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex7)|v2660(VarCurr,bitIndex3).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex7)| -v2660(VarCurr,bitIndex3).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex6)|v2660(VarCurr,bitIndex2).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex6)| -v2660(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex5)|v2660(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex5)| -v2660(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2713(VarCurr,bitIndex4)|v2660(VarCurr,bitIndex0).
% 94.39/93.75  0 [] v2713(VarCurr,bitIndex4)| -v2660(VarCurr,bitIndex0).
% 94.39/93.75  0 [] -range_16_0(B)| -v2659(VarCurr,B)|v2660(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)| -v2659(VarCurr,B)|v2710(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2659(VarCurr,B)| -v2660(VarCurr,B)| -v2710(VarCurr,B).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex0)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex0)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex1)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex1)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex2)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex2)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex3)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex3)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex4)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex4)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex5)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex5)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex6)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex6)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex7)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex7)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex8)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex8)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex9)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex9)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex10)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex10)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex11)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex11)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex12)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex12)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex13)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex13)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex14)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex14)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex15)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex15)| -v2711(VarCurr).
% 94.39/93.75  0 [] -v2710(VarCurr,bitIndex16)|v2711(VarCurr).
% 94.39/93.75  0 [] v2710(VarCurr,bitIndex16)| -v2711(VarCurr).
% 94.39/93.75  0 [] v2711(VarCurr)|v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -v2711(VarCurr)| -v2667(VarCurr,bitIndex2).
% 94.39/93.75  0 [] -range_16_0(B)| -v2660(VarCurr,B)|v2661(VarCurr,B)|v2707(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2660(VarCurr,B)| -v2661(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2660(VarCurr,B)| -v2707(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)| -v2707(VarCurr,B)|v2708(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)| -v2707(VarCurr,B)|v2709(VarCurr,B).
% 94.39/93.75  0 [] -range_16_0(B)|v2707(VarCurr,B)| -v2708(VarCurr,B)| -v2709(VarCurr,B).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex0)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex0)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex1)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex1)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex2)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex2)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex3)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex3)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex4)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex4)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex5)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex5)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex6)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex6)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex7)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex7)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex8)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex8)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex9)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex9)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex10)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex10)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex11)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex11)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex12)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex12)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex13)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex13)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex14)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex14)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex15)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex15)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -v2709(VarCurr,bitIndex16)|v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] v2709(VarCurr,bitIndex16)| -v2667(VarCurr,bitIndex1).
% 94.39/93.75  0 [] -range_1_0(B)| -v2708(VarCurr,B)|$F.
% 94.39/93.75  0 [] -range_1_0(B)|v2708(VarCurr,B)| -$F.
% 94.39/93.75  0 [] -v2708(VarCurr,bitIndex16)|v2662(VarCurr,bitIndex14).
% 94.39/93.75  0 [] v2708(VarCurr,bitIndex16)| -v2662(VarCurr,bitIndex14).
% 94.39/93.75  0 [] -v2708(VarCurr,bitIndex15)|v2662(VarCurr,bitIndex13).
% 94.39/93.75  0 [] v2708(VarCurr,bitIndex15)| -v2662(VarCurr,bitIndex13).
% 94.39/93.75  0 [] -v2708(VarCurr,bitIndex14)|v2662(VarCurr,bitIndex12).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex14)| -v2662(VarCurr,bitIndex12).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex13)|v2662(VarCurr,bitIndex11).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex13)| -v2662(VarCurr,bitIndex11).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex12)|v2662(VarCurr,bitIndex10).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex12)| -v2662(VarCurr,bitIndex10).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex11)|v2662(VarCurr,bitIndex9).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex11)| -v2662(VarCurr,bitIndex9).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex10)|v2662(VarCurr,bitIndex8).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex10)| -v2662(VarCurr,bitIndex8).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex9)|v2662(VarCurr,bitIndex7).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex9)| -v2662(VarCurr,bitIndex7).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex8)|v2662(VarCurr,bitIndex6).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex8)| -v2662(VarCurr,bitIndex6).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex7)|v2662(VarCurr,bitIndex5).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex7)| -v2662(VarCurr,bitIndex5).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex6)|v2662(VarCurr,bitIndex4).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex6)| -v2662(VarCurr,bitIndex4).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex5)|v2662(VarCurr,bitIndex3).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex5)| -v2662(VarCurr,bitIndex3).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex4)|v2662(VarCurr,bitIndex2).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex4)| -v2662(VarCurr,bitIndex2).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex3)|v2662(VarCurr,bitIndex1).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex3)| -v2662(VarCurr,bitIndex1).
% 94.39/93.76  0 [] -v2708(VarCurr,bitIndex2)|v2662(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2708(VarCurr,bitIndex2)| -v2662(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -range_16_0(B)| -v2661(VarCurr,B)|v2662(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)| -v2661(VarCurr,B)|v2705(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|v2661(VarCurr,B)| -v2662(VarCurr,B)| -v2705(VarCurr,B).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex0)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex0)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex1)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex1)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex2)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex2)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex3)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex3)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex4)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex4)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex5)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex5)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex6)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex6)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex7)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex7)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex8)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex8)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex9)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex9)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex10)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex10)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex11)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex11)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex12)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex12)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex13)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex13)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex14)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex14)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex15)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex15)| -v2706(VarCurr).
% 94.39/93.76  0 [] -v2705(VarCurr,bitIndex16)|v2706(VarCurr).
% 94.39/93.76  0 [] v2705(VarCurr,bitIndex16)| -v2706(VarCurr).
% 94.39/93.76  0 [] v2706(VarCurr)|v2667(VarCurr,bitIndex1).
% 94.39/93.76  0 [] -v2706(VarCurr)| -v2667(VarCurr,bitIndex1).
% 94.39/93.76  0 [] -range_16_0(B)| -v2662(VarCurr,B)|v2663(VarCurr,B)|v2702(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|v2662(VarCurr,B)| -v2663(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|v2662(VarCurr,B)| -v2702(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)| -v2702(VarCurr,B)|v2703(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)| -v2702(VarCurr,B)|v2704(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|v2702(VarCurr,B)| -v2703(VarCurr,B)| -v2704(VarCurr,B).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex0)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex0)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex1)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex1)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex2)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex2)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex3)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex3)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex4)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex4)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex5)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex5)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex6)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex6)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex7)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex7)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex8)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex8)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex9)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex9)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex10)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex10)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex11)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex11)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex12)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex12)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex13)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex13)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex14)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex14)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex15)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex15)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2704(VarCurr,bitIndex16)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2704(VarCurr,bitIndex16)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex0)|$F.
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex0)| -$F.
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex16)|v2664(VarCurr,bitIndex15).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex16)| -v2664(VarCurr,bitIndex15).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex15)|v2664(VarCurr,bitIndex14).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex15)| -v2664(VarCurr,bitIndex14).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex14)|v2664(VarCurr,bitIndex13).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex14)| -v2664(VarCurr,bitIndex13).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex13)|v2664(VarCurr,bitIndex12).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex13)| -v2664(VarCurr,bitIndex12).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex12)|v2664(VarCurr,bitIndex11).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex12)| -v2664(VarCurr,bitIndex11).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex11)|v2664(VarCurr,bitIndex10).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex11)| -v2664(VarCurr,bitIndex10).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex10)|v2664(VarCurr,bitIndex9).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex10)| -v2664(VarCurr,bitIndex9).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex9)|v2664(VarCurr,bitIndex8).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex9)| -v2664(VarCurr,bitIndex8).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex8)|v2664(VarCurr,bitIndex7).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex8)| -v2664(VarCurr,bitIndex7).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex7)|v2664(VarCurr,bitIndex6).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex7)| -v2664(VarCurr,bitIndex6).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex6)|v2664(VarCurr,bitIndex5).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex6)| -v2664(VarCurr,bitIndex5).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex5)|v2664(VarCurr,bitIndex4).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex5)| -v2664(VarCurr,bitIndex4).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex4)|v2664(VarCurr,bitIndex3).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex4)| -v2664(VarCurr,bitIndex3).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex3)|v2664(VarCurr,bitIndex2).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex3)| -v2664(VarCurr,bitIndex2).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex2)|v2664(VarCurr,bitIndex1).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex2)| -v2664(VarCurr,bitIndex1).
% 94.39/93.76  0 [] -v2703(VarCurr,bitIndex1)|v2664(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2703(VarCurr,bitIndex1)| -v2664(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -range_16_0(B)| -v2663(VarCurr,B)|v2664(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)| -v2663(VarCurr,B)|v2665(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|v2663(VarCurr,B)| -v2664(VarCurr,B)| -v2665(VarCurr,B).
% 94.39/93.76  0 [] -range_16_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex0!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex1!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex2!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex3!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex4!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex5!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex6!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex7!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex8!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex9!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex10!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex11!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex12!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex13!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex14!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex15!=B.
% 94.39/93.76  0 [] range_16_0(B)|bitIndex16!=B.
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex0)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex0)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex1)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex1)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex2)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex2)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex3)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex3)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex4)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex4)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex5)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex5)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex6)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex6)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex7)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex7)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex8)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex8)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex9)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex9)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex10)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex10)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex11)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex11)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex12)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex12)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex13)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex13)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex14)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex14)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex15)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex15)| -v2666(VarCurr).
% 94.39/93.76  0 [] -v2665(VarCurr,bitIndex16)|v2666(VarCurr).
% 94.39/93.76  0 [] v2665(VarCurr,bitIndex16)| -v2666(VarCurr).
% 94.39/93.76  0 [] v2666(VarCurr)|v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2666(VarCurr)| -v2667(VarCurr,bitIndex0).
% 94.39/93.76  0 [] -v2667(VarCurr,bitIndex0)|v2676(VarCurr).
% 94.39/93.76  0 [] v2667(VarCurr,bitIndex0)| -v2676(VarCurr).
% 94.39/93.76  0 [] -v2667(VarCurr,bitIndex1)|v2698(VarCurr).
% 94.39/93.76  0 [] v2667(VarCurr,bitIndex1)| -v2698(VarCurr).
% 94.39/93.76  0 [] -v2667(VarCurr,bitIndex2)|v2693(VarCurr).
% 94.39/93.76  0 [] v2667(VarCurr,bitIndex2)| -v2693(VarCurr).
% 94.39/93.76  0 [] -v2667(VarCurr,bitIndex3)|v2669(VarCurr).
% 94.39/93.76  0 [] v2667(VarCurr,bitIndex3)| -v2669(VarCurr).
% 94.39/93.76  0 [] -v2698(VarCurr)|v2699(VarCurr).
% 94.39/93.76  0 [] -v2698(VarCurr)|v2701(VarCurr).
% 94.39/93.76  0 [] v2698(VarCurr)| -v2699(VarCurr)| -v2701(VarCurr).
% 94.39/93.76  0 [] -v2701(VarCurr)|v2652(VarCurr,bitIndex0)|v2688(VarCurr).
% 94.39/93.76  0 [] v2701(VarCurr)| -v2652(VarCurr,bitIndex0).
% 94.39/93.76  0 [] v2701(VarCurr)| -v2688(VarCurr).
% 94.39/93.76  0 [] -v2699(VarCurr)|v2676(VarCurr)|v2700(VarCurr).
% 94.39/93.76  0 [] v2699(VarCurr)| -v2676(VarCurr).
% 94.39/93.76  0 [] v2699(VarCurr)| -v2700(VarCurr).
% 94.39/93.76  0 [] v2700(VarCurr)|v2688(VarCurr).
% 94.39/93.76  0 [] -v2700(VarCurr)| -v2688(VarCurr).
% 94.39/93.76  0 [] -v2693(VarCurr)|v2694(VarCurr).
% 94.39/93.76  0 [] -v2693(VarCurr)|v2697(VarCurr).
% 94.39/93.76  0 [] v2693(VarCurr)| -v2694(VarCurr)| -v2697(VarCurr).
% 94.39/93.76  0 [] -v2697(VarCurr)|v2683(VarCurr)|v2687(VarCurr).
% 94.39/93.76  0 [] v2697(VarCurr)| -v2683(VarCurr).
% 94.39/93.76  0 [] v2697(VarCurr)| -v2687(VarCurr).
% 94.39/93.76  0 [] -v2694(VarCurr)|v2695(VarCurr)|v2696(VarCurr).
% 94.39/93.76  0 [] v2694(VarCurr)| -v2695(VarCurr).
% 94.39/93.76  0 [] v2694(VarCurr)| -v2696(VarCurr).
% 94.39/93.76  0 [] v2696(VarCurr)|v2687(VarCurr).
% 94.39/93.76  0 [] -v2696(VarCurr)| -v2687(VarCurr).
% 94.39/93.76  0 [] v2695(VarCurr)|v2683(VarCurr).
% 94.39/93.76  0 [] -v2695(VarCurr)| -v2683(VarCurr).
% 94.39/93.76  0 [] -v2669(VarCurr)|v2670(VarCurr).
% 94.39/93.76  0 [] -v2669(VarCurr)|v2691(VarCurr).
% 94.39/93.76  0 [] v2669(VarCurr)| -v2670(VarCurr)| -v2691(VarCurr).
% 94.39/93.76  0 [] -v2691(VarCurr)|v2692(VarCurr)|v2682(VarCurr).
% 94.39/93.76  0 [] v2691(VarCurr)| -v2692(VarCurr).
% 94.39/93.76  0 [] v2691(VarCurr)| -v2682(VarCurr).
% 94.39/93.76  0 [] v2692(VarCurr)|v2671(VarCurr).
% 94.39/93.76  0 [] -v2692(VarCurr)| -v2671(VarCurr).
% 94.39/93.76  0 [] -v2670(VarCurr)|v2671(VarCurr)|v2681(VarCurr).
% 94.39/93.76  0 [] v2670(VarCurr)| -v2671(VarCurr).
% 94.39/93.76  0 [] v2670(VarCurr)| -v2681(VarCurr).
% 94.39/93.76  0 [] v2681(VarCurr)|v2682(VarCurr).
% 94.39/93.77  0 [] -v2681(VarCurr)| -v2682(VarCurr).
% 94.39/93.77  0 [] -v2682(VarCurr)|v2683(VarCurr).
% 94.39/93.77  0 [] -v2682(VarCurr)|v2687(VarCurr).
% 94.39/93.77  0 [] v2682(VarCurr)| -v2683(VarCurr)| -v2687(VarCurr).
% 94.39/93.77  0 [] -v2687(VarCurr)|v2652(VarCurr,bitIndex0).
% 94.39/93.77  0 [] -v2687(VarCurr)|v2688(VarCurr).
% 94.39/93.77  0 [] v2687(VarCurr)| -v2652(VarCurr,bitIndex0)| -v2688(VarCurr).
% 94.39/93.77  0 [] -v2688(VarCurr)|v2689(VarCurr).
% 94.39/93.77  0 [] -v2688(VarCurr)|v2690(VarCurr).
% 94.39/93.77  0 [] v2688(VarCurr)| -v2689(VarCurr)| -v2690(VarCurr).
% 94.39/93.77  0 [] -v2690(VarCurr)|v2676(VarCurr)|v2677(VarCurr).
% 94.39/93.77  0 [] v2690(VarCurr)| -v2676(VarCurr).
% 94.39/93.77  0 [] v2690(VarCurr)| -v2677(VarCurr).
% 94.39/93.77  0 [] -v2689(VarCurr)|v2652(VarCurr,bitIndex0)|v2652(VarCurr,bitIndex1).
% 94.39/93.77  0 [] v2689(VarCurr)| -v2652(VarCurr,bitIndex0).
% 94.39/93.77  0 [] v2689(VarCurr)| -v2652(VarCurr,bitIndex1).
% 94.39/93.77  0 [] -v2683(VarCurr)|v2684(VarCurr).
% 94.39/93.77  0 [] -v2683(VarCurr)|v2686(VarCurr).
% 94.39/93.77  0 [] v2683(VarCurr)| -v2684(VarCurr)| -v2686(VarCurr).
% 94.39/93.77  0 [] -v2686(VarCurr)|v2675(VarCurr)|v2678(VarCurr).
% 94.39/93.77  0 [] v2686(VarCurr)| -v2675(VarCurr).
% 94.39/93.77  0 [] v2686(VarCurr)| -v2678(VarCurr).
% 94.39/93.77  0 [] -v2684(VarCurr)|v2685(VarCurr)|v2652(VarCurr,bitIndex2).
% 94.39/93.77  0 [] v2684(VarCurr)| -v2685(VarCurr).
% 94.39/93.77  0 [] v2684(VarCurr)| -v2652(VarCurr,bitIndex2).
% 94.39/93.77  0 [] v2685(VarCurr)|v2675(VarCurr).
% 94.39/93.77  0 [] -v2685(VarCurr)| -v2675(VarCurr).
% 94.39/93.77  0 [] -v2671(VarCurr)|v2672(VarCurr).
% 94.39/93.77  0 [] -v2671(VarCurr)|v2679(VarCurr).
% 94.39/93.77  0 [] v2671(VarCurr)| -v2672(VarCurr)| -v2679(VarCurr).
% 94.39/93.77  0 [] -v2679(VarCurr)|v2674(VarCurr)|v2680(VarCurr).
% 94.39/93.77  0 [] v2679(VarCurr)| -v2674(VarCurr).
% 94.39/93.77  0 [] v2679(VarCurr)| -v2680(VarCurr).
% 94.39/93.77  0 [] v2680(VarCurr)|v2652(VarCurr,bitIndex3).
% 94.39/93.77  0 [] -v2680(VarCurr)| -v2652(VarCurr,bitIndex3).
% 94.39/93.77  0 [] -v2672(VarCurr)|v2673(VarCurr)|v2652(VarCurr,bitIndex3).
% 94.39/93.77  0 [] v2672(VarCurr)| -v2673(VarCurr).
% 94.39/93.77  0 [] v2672(VarCurr)| -v2652(VarCurr,bitIndex3).
% 94.39/93.77  0 [] v2673(VarCurr)|v2674(VarCurr).
% 94.39/93.77  0 [] -v2673(VarCurr)| -v2674(VarCurr).
% 94.39/93.77  0 [] -v2674(VarCurr)|v2675(VarCurr).
% 94.39/93.77  0 [] -v2674(VarCurr)|v2678(VarCurr).
% 94.39/93.77  0 [] v2674(VarCurr)| -v2675(VarCurr)| -v2678(VarCurr).
% 94.39/93.77  0 [] v2678(VarCurr)|v2652(VarCurr,bitIndex2).
% 94.39/93.77  0 [] -v2678(VarCurr)| -v2652(VarCurr,bitIndex2).
% 94.39/93.77  0 [] -v2675(VarCurr)|v2676(VarCurr).
% 94.39/93.77  0 [] -v2675(VarCurr)|v2677(VarCurr).
% 94.39/93.77  0 [] v2675(VarCurr)| -v2676(VarCurr)| -v2677(VarCurr).
% 94.39/93.77  0 [] v2677(VarCurr)|v2652(VarCurr,bitIndex1).
% 94.39/93.77  0 [] -v2677(VarCurr)| -v2652(VarCurr,bitIndex1).
% 94.39/93.77  0 [] v2676(VarCurr)|v2652(VarCurr,bitIndex0).
% 94.39/93.77  0 [] -v2676(VarCurr)| -v2652(VarCurr,bitIndex0).
% 94.39/93.77  0 [] -range_7_0(B)| -v2664(VarCurr,B)|$T.
% 94.39/93.77  0 [] -range_7_0(B)|v2664(VarCurr,B)| -$T.
% 94.39/93.77  0 [] -range_7_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex0!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex1!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex2!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex3!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex4!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex5!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex6!=B.
% 94.39/93.77  0 [] range_7_0(B)|bitIndex7!=B.
% 94.39/93.77  0 [] b11111111(bitIndex7).
% 94.39/93.77  0 [] b11111111(bitIndex6).
% 94.39/93.77  0 [] b11111111(bitIndex5).
% 94.39/93.77  0 [] b11111111(bitIndex4).
% 94.39/93.77  0 [] b11111111(bitIndex3).
% 94.39/93.77  0 [] b11111111(bitIndex2).
% 94.39/93.77  0 [] b11111111(bitIndex1).
% 94.39/93.77  0 [] b11111111(bitIndex0).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex16)|v2449(VarCurr,bitIndex8).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex16)| -v2449(VarCurr,bitIndex8).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex15)|v2449(VarCurr,bitIndex7).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex15)| -v2449(VarCurr,bitIndex7).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex14)|v2449(VarCurr,bitIndex6).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex14)| -v2449(VarCurr,bitIndex6).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex13)|v2449(VarCurr,bitIndex5).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex13)| -v2449(VarCurr,bitIndex5).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex12)|v2449(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex12)| -v2449(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex11)|v2449(VarCurr,bitIndex3).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex11)| -v2449(VarCurr,bitIndex3).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex10)|v2449(VarCurr,bitIndex2).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex10)| -v2449(VarCurr,bitIndex2).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex9)|v2449(VarCurr,bitIndex1).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex9)| -v2449(VarCurr,bitIndex1).
% 94.39/93.77  0 [] -v2664(VarCurr,bitIndex8)|v2449(VarCurr,bitIndex0).
% 94.39/93.77  0 [] v2664(VarCurr,bitIndex8)| -v2449(VarCurr,bitIndex0).
% 94.39/93.77  0 [] -range_3_0(B)| -v2652(VarCurr,B)|v2654(VarCurr,B).
% 94.39/93.77  0 [] -range_3_0(B)|v2652(VarCurr,B)| -v2654(VarCurr,B).
% 94.39/93.77  0 [] -v2654(VarCurr,bitIndex3)|v149(VarCurr,bitIndex8).
% 94.39/93.77  0 [] v2654(VarCurr,bitIndex3)| -v149(VarCurr,bitIndex8).
% 94.39/93.77  0 [] -v2654(VarCurr,bitIndex2)|v149(VarCurr,bitIndex7).
% 94.39/93.77  0 [] v2654(VarCurr,bitIndex2)| -v149(VarCurr,bitIndex7).
% 94.39/93.77  0 [] -v2654(VarCurr,bitIndex1)|v149(VarCurr,bitIndex6).
% 94.39/93.77  0 [] v2654(VarCurr,bitIndex1)| -v149(VarCurr,bitIndex6).
% 94.39/93.77  0 [] -v2654(VarCurr,bitIndex0)|v149(VarCurr,bitIndex5).
% 94.39/93.77  0 [] v2654(VarCurr,bitIndex0)| -v149(VarCurr,bitIndex5).
% 94.39/93.77  0 [] -range_8_5(B)| -v149(VarCurr,B)|v151(VarCurr,B).
% 94.39/93.77  0 [] -range_8_5(B)|v149(VarCurr,B)| -v151(VarCurr,B).
% 94.39/93.77  0 [] -range_8_5(B)| -v151(VarCurr,B)|v156(VarCurr,B).
% 94.39/93.77  0 [] -range_8_5(B)|v151(VarCurr,B)| -v156(VarCurr,B).
% 94.39/93.77  0 [] -range_8_5(B)|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B.
% 94.39/93.77  0 [] range_8_5(B)|bitIndex5!=B.
% 94.39/93.77  0 [] range_8_5(B)|bitIndex6!=B.
% 94.39/93.77  0 [] range_8_5(B)|bitIndex7!=B.
% 94.39/93.77  0 [] range_8_5(B)|bitIndex8!=B.
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex8)|v2454(VarCurr,bitIndex8).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex8)| -v2454(VarCurr,bitIndex8).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex7)|v2454(VarCurr,bitIndex7).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex7)| -v2454(VarCurr,bitIndex7).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex6)|v2454(VarCurr,bitIndex6).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex6)| -v2454(VarCurr,bitIndex6).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex5)|v2454(VarCurr,bitIndex5).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex5)| -v2454(VarCurr,bitIndex5).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex4)|v2454(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex4)| -v2454(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex3)|v2454(VarCurr,bitIndex3).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex3)| -v2454(VarCurr,bitIndex3).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex2)|v2454(VarCurr,bitIndex2).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex2)| -v2454(VarCurr,bitIndex2).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex1)|v2454(VarCurr,bitIndex1).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex1)| -v2454(VarCurr,bitIndex1).
% 94.39/93.77  0 [] -v2449(VarCurr,bitIndex0)|v2454(VarCurr,bitIndex0).
% 94.39/93.77  0 [] v2449(VarCurr,bitIndex0)| -v2454(VarCurr,bitIndex0).
% 94.39/93.77  0 [] -range_39_0(B)| -v2454(VarCurr,B)|v2456(VarCurr,B)|v2556(VarCurr,B).
% 94.39/93.77  0 [] -range_39_0(B)|v2454(VarCurr,B)| -v2456(VarCurr,B).
% 94.39/93.77  0 [] -range_39_0(B)|v2454(VarCurr,B)| -v2556(VarCurr,B).
% 94.39/93.77  0 [] -range_39_0(B)| -v2556(VarCurr,B)|v2557(VarCurr,B).
% 94.39/93.77  0 [] -range_39_0(B)| -v2556(VarCurr,B)|v2649(VarCurr,B).
% 94.39/93.77  0 [] -range_39_0(B)|v2556(VarCurr,B)| -v2557(VarCurr,B)| -v2649(VarCurr,B).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex38)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex38)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -v2649(VarCurr,bitIndex39)|v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] v2649(VarCurr,bitIndex39)| -v2453(VarCurr,bitIndex4).
% 94.39/93.77  0 [] -range_23_0(B)| -v2557(VarCurr,B)|v2559(VarCurr,B).
% 94.39/93.77  0 [] -range_23_0(B)|v2557(VarCurr,B)| -v2559(VarCurr,B).
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex39)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex39)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex38)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex38)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex37)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex37)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex36)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex36)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex35)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex35)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex34)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex34)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex33)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex33)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex32)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex32)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex31)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex31)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex30)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex30)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex29)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex29)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex28)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex28)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex27)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex27)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex26)|$F.
% 94.39/93.77  0 [] v2557(VarCurr,bitIndex26)| -$F.
% 94.39/93.77  0 [] -v2557(VarCurr,bitIndex25)|$F.
% 94.39/93.78  0 [] v2557(VarCurr,bitIndex25)| -$F.
% 94.39/93.78  0 [] -v2557(VarCurr,bitIndex24)|$F.
% 94.39/93.78  0 [] v2557(VarCurr,bitIndex24)| -$F.
% 94.39/93.78  0 [] -b0000000000000000(bitIndex15).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex14).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex13).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex12).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex11).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex10).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex9).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex8).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex7).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex6).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex5).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex4).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex3).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex2).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex1).
% 94.39/93.78  0 [] -b0000000000000000(bitIndex0).
% 94.39/93.78  0 [] -range_23_0(B)| -v2559(VarCurr,B)|v2560(VarCurr,B)|v2604(VarCurr,B).
% 94.39/93.78  0 [] -range_23_0(B)|v2559(VarCurr,B)| -v2560(VarCurr,B).
% 94.39/93.78  0 [] -range_23_0(B)|v2559(VarCurr,B)| -v2604(VarCurr,B).
% 94.39/93.78  0 [] -range_23_0(B)| -v2604(VarCurr,B)|v2605(VarCurr,B).
% 94.39/93.78  0 [] -range_23_0(B)| -v2604(VarCurr,B)|v2648(VarCurr,B).
% 94.39/93.78  0 [] -range_23_0(B)|v2604(VarCurr,B)| -v2605(VarCurr,B)| -v2648(VarCurr,B).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -v2648(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] v2648(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex3).
% 94.39/93.78  0 [] -range_15_0(B)| -v2605(VarCurr,B)|v2606(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)|v2605(VarCurr,B)| -v2606(VarCurr,B).
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex23)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex23)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex22)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex22)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex21)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex21)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex20)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex20)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex19)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex19)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex18)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex18)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex17)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex17)| -$F.
% 94.39/93.78  0 [] -v2605(VarCurr,bitIndex16)|$F.
% 94.39/93.78  0 [] v2605(VarCurr,bitIndex16)| -$F.
% 94.39/93.78  0 [] -range_15_0(B)| -v2606(VarCurr,B)|v2607(VarCurr,B)|v2627(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)|v2606(VarCurr,B)| -v2607(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)|v2606(VarCurr,B)| -v2627(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)| -v2627(VarCurr,B)|v2628(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)| -v2627(VarCurr,B)|v2647(VarCurr,B).
% 94.39/93.78  0 [] -range_15_0(B)|v2627(VarCurr,B)| -v2628(VarCurr,B)| -v2647(VarCurr,B).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -v2647(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] v2647(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex2).
% 94.39/93.78  0 [] -range_11_0(B)| -v2628(VarCurr,B)|v2629(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)|v2628(VarCurr,B)| -v2629(VarCurr,B).
% 94.39/93.78  0 [] -v2628(VarCurr,bitIndex15)|$F.
% 94.39/93.78  0 [] v2628(VarCurr,bitIndex15)| -$F.
% 94.39/93.78  0 [] -v2628(VarCurr,bitIndex14)|$F.
% 94.39/93.78  0 [] v2628(VarCurr,bitIndex14)| -$F.
% 94.39/93.78  0 [] -v2628(VarCurr,bitIndex13)|$F.
% 94.39/93.78  0 [] v2628(VarCurr,bitIndex13)| -$F.
% 94.39/93.78  0 [] -v2628(VarCurr,bitIndex12)|$F.
% 94.39/93.78  0 [] v2628(VarCurr,bitIndex12)| -$F.
% 94.39/93.78  0 [] -range_11_0(B)| -v2629(VarCurr,B)|v2630(VarCurr,B)|v2638(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)|v2629(VarCurr,B)| -v2630(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)|v2629(VarCurr,B)| -v2638(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)| -v2638(VarCurr,B)|v2639(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)| -v2638(VarCurr,B)|v2646(VarCurr,B).
% 94.39/93.78  0 [] -range_11_0(B)|v2638(VarCurr,B)| -v2639(VarCurr,B)| -v2646(VarCurr,B).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -v2646(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] v2646(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.39/93.78  0 [] -range_9_0(B)| -v2639(VarCurr,B)|v2640(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)|v2639(VarCurr,B)| -v2640(VarCurr,B).
% 94.39/93.78  0 [] -v2639(VarCurr,bitIndex11)|$F.
% 94.39/93.78  0 [] v2639(VarCurr,bitIndex11)| -$F.
% 94.39/93.78  0 [] -v2639(VarCurr,bitIndex10)|$F.
% 94.39/93.78  0 [] v2639(VarCurr,bitIndex10)| -$F.
% 94.39/93.78  0 [] -range_9_0(B)| -v2640(VarCurr,B)|v2641(VarCurr,B)|v2643(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)|v2640(VarCurr,B)| -v2641(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)|v2640(VarCurr,B)| -v2643(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)| -v2643(VarCurr,B)|v2644(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)| -v2643(VarCurr,B)|v2645(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)|v2643(VarCurr,B)| -v2644(VarCurr,B)| -v2645(VarCurr,B).
% 94.39/93.78  0 [] -range_9_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex0!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex1!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex2!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex3!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex4!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex5!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex6!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex7!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex8!=B.
% 94.39/93.78  0 [] range_9_0(B)|bitIndex9!=B.
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2645(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] v2645(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex39).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex39).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex38).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex38).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex37).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex37).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex36).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex36).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex35).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex35).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex34).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex34).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex33).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex33).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex32).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex32).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex31).
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex31).
% 94.39/93.78  0 [] -v2644(VarCurr,bitIndex9)|$F.
% 94.39/93.78  0 [] v2644(VarCurr,bitIndex9)| -$F.
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex30).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex0)|v2642(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex30)| -v2642(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex31).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex1)|v2642(VarCurr,bitIndex1).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex31)| -v2642(VarCurr,bitIndex1).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex32).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex2)|v2642(VarCurr,bitIndex2).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex32)| -v2642(VarCurr,bitIndex2).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex33).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex3)|v2642(VarCurr,bitIndex3).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex33)| -v2642(VarCurr,bitIndex3).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex34).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex4)|v2642(VarCurr,bitIndex4).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex34)| -v2642(VarCurr,bitIndex4).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex35).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex5)|v2642(VarCurr,bitIndex5).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex35)| -v2642(VarCurr,bitIndex5).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex36).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex6)|v2642(VarCurr,bitIndex6).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex36)| -v2642(VarCurr,bitIndex6).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex37).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex7)|v2642(VarCurr,bitIndex7).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex37)| -v2642(VarCurr,bitIndex7).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex38).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex8)|v2642(VarCurr,bitIndex8).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex38)| -v2642(VarCurr,bitIndex8).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex39).
% 94.39/93.79  0 [] -v2641(VarCurr,bitIndex9)|v2642(VarCurr,bitIndex9).
% 94.39/93.79  0 [] v2641(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex39)| -v2642(VarCurr,bitIndex9).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2642(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.79  0 [] v2642(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.79  0 [] -range_11_0(B)| -v2630(VarCurr,B)|v2631(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)| -v2630(VarCurr,B)|v2637(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)|v2630(VarCurr,B)| -v2631(VarCurr,B)| -v2637(VarCurr,B).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.39/93.79  0 [] -v2637(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.39/93.79  0 [] v2637(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.39/93.79  0 [] -range_11_0(B)| -v2631(VarCurr,B)|v2632(VarCurr,B)|v2634(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)|v2631(VarCurr,B)| -v2632(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)|v2631(VarCurr,B)| -v2634(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)| -v2634(VarCurr,B)|v2635(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)| -v2634(VarCurr,B)|v2636(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)|v2634(VarCurr,B)| -v2635(VarCurr,B)| -v2636(VarCurr,B).
% 94.39/93.79  0 [] -range_11_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex0!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex1!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex2!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex3!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex4!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex5!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex6!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex7!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex8!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex9!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex10!=B.
% 94.39/93.79  0 [] range_11_0(B)|bitIndex11!=B.
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2636(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2636(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex39).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex39).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex38).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex38).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex37).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex37).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex36).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex36).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex35).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex35).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex34).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex34).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex33).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex33).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex32).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex32).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex31).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex31).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex30).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex30).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex29).
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex29).
% 94.39/93.79  0 [] -v2635(VarCurr,bitIndex11)|$F.
% 94.39/93.79  0 [] v2635(VarCurr,bitIndex11)| -$F.
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex28).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex0)|v2633(VarCurr,bitIndex0).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex28)| -v2633(VarCurr,bitIndex0).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex29).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex1)|v2633(VarCurr,bitIndex1).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex29)| -v2633(VarCurr,bitIndex1).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex30).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex2)|v2633(VarCurr,bitIndex2).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex30)| -v2633(VarCurr,bitIndex2).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex31).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex3)|v2633(VarCurr,bitIndex3).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex31)| -v2633(VarCurr,bitIndex3).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex32).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex4)|v2633(VarCurr,bitIndex4).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex32)| -v2633(VarCurr,bitIndex4).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex33).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex5)|v2633(VarCurr,bitIndex5).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex33)| -v2633(VarCurr,bitIndex5).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex34).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex6)|v2633(VarCurr,bitIndex6).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex34)| -v2633(VarCurr,bitIndex6).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex35).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex7)|v2633(VarCurr,bitIndex7).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex35)| -v2633(VarCurr,bitIndex7).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex36).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex8)|v2633(VarCurr,bitIndex8).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex36)| -v2633(VarCurr,bitIndex8).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex37).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex9)|v2633(VarCurr,bitIndex9).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex37)| -v2633(VarCurr,bitIndex9).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex38).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex10)|v2633(VarCurr,bitIndex10).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex38)| -v2633(VarCurr,bitIndex10).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex39).
% 94.39/93.79  0 [] -v2632(VarCurr,bitIndex11)|v2633(VarCurr,bitIndex11).
% 94.39/93.79  0 [] v2632(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex39)| -v2633(VarCurr,bitIndex11).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.39/93.79  0 [] -v2633(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.39/93.79  0 [] v2633(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.39/93.79  0 [] -range_15_0(B)| -v2607(VarCurr,B)|v2608(VarCurr,B).
% 94.39/93.79  0 [] -range_15_0(B)| -v2607(VarCurr,B)|v2626(VarCurr,B).
% 94.39/93.79  0 [] -range_15_0(B)|v2607(VarCurr,B)| -v2608(VarCurr,B)| -v2626(VarCurr,B).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex0)|v2484(VarCurr).
% 94.39/93.79  0 [] v2626(VarCurr,bitIndex0)| -v2484(VarCurr).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex1)|v2484(VarCurr).
% 94.39/93.79  0 [] v2626(VarCurr,bitIndex1)| -v2484(VarCurr).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex2)|v2484(VarCurr).
% 94.39/93.79  0 [] v2626(VarCurr,bitIndex2)| -v2484(VarCurr).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex3)|v2484(VarCurr).
% 94.39/93.79  0 [] v2626(VarCurr,bitIndex3)| -v2484(VarCurr).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex4)|v2484(VarCurr).
% 94.39/93.79  0 [] v2626(VarCurr,bitIndex4)| -v2484(VarCurr).
% 94.39/93.79  0 [] -v2626(VarCurr,bitIndex5)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex5)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex6)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex6)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex7)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex7)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex8)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex8)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex9)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex9)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex10)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex10)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex11)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex11)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex12)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex12)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex13)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex13)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex14)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex14)| -v2484(VarCurr).
% 94.39/93.80  0 [] -v2626(VarCurr,bitIndex15)|v2484(VarCurr).
% 94.39/93.80  0 [] v2626(VarCurr,bitIndex15)| -v2484(VarCurr).
% 94.39/93.80  0 [] -range_15_0(B)| -v2608(VarCurr,B)|v2609(VarCurr,B)|v2617(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2608(VarCurr,B)| -v2609(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2608(VarCurr,B)| -v2617(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)| -v2617(VarCurr,B)|v2618(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)| -v2617(VarCurr,B)|v2625(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2617(VarCurr,B)| -v2618(VarCurr,B)| -v2625(VarCurr,B).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2625(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2625(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -range_13_0(B)| -v2618(VarCurr,B)|v2619(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)|v2618(VarCurr,B)| -v2619(VarCurr,B).
% 94.39/93.80  0 [] -v2618(VarCurr,bitIndex15)|$F.
% 94.39/93.80  0 [] v2618(VarCurr,bitIndex15)| -$F.
% 94.39/93.80  0 [] -v2618(VarCurr,bitIndex14)|$F.
% 94.39/93.80  0 [] v2618(VarCurr,bitIndex14)| -$F.
% 94.39/93.80  0 [] -range_13_0(B)| -v2619(VarCurr,B)|v2620(VarCurr,B)|v2622(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)|v2619(VarCurr,B)| -v2620(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)|v2619(VarCurr,B)| -v2622(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)| -v2622(VarCurr,B)|v2623(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)| -v2622(VarCurr,B)|v2624(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)|v2622(VarCurr,B)| -v2623(VarCurr,B)| -v2624(VarCurr,B).
% 94.39/93.80  0 [] -range_13_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex0!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex1!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex2!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex3!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex4!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex5!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex6!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex7!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex8!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex9!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex10!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex11!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex12!=B.
% 94.39/93.80  0 [] range_13_0(B)|bitIndex13!=B.
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2624(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2624(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex39).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex39).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex38).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex38).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex37).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex37).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex36).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex36).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex35).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex35).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex34).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex34).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex33).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex33).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex32).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex32).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex31).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex31).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex30).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex30).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex29).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex29).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex28).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex28).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex27).
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex27).
% 94.39/93.80  0 [] -v2623(VarCurr,bitIndex13)|$F.
% 94.39/93.80  0 [] v2623(VarCurr,bitIndex13)| -$F.
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex26).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex0)|v2621(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex26)| -v2621(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex27).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex1)|v2621(VarCurr,bitIndex1).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex27)| -v2621(VarCurr,bitIndex1).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex28).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex2)|v2621(VarCurr,bitIndex2).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex28)| -v2621(VarCurr,bitIndex2).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex29).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex3)|v2621(VarCurr,bitIndex3).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex29)| -v2621(VarCurr,bitIndex3).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex30).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex4)|v2621(VarCurr,bitIndex4).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex30)| -v2621(VarCurr,bitIndex4).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex31).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex5)|v2621(VarCurr,bitIndex5).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex31)| -v2621(VarCurr,bitIndex5).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex32).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex6)|v2621(VarCurr,bitIndex6).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex32)| -v2621(VarCurr,bitIndex6).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex33).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex7)|v2621(VarCurr,bitIndex7).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex33)| -v2621(VarCurr,bitIndex7).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex34).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex8)|v2621(VarCurr,bitIndex8).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex34)| -v2621(VarCurr,bitIndex8).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex35).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex9)|v2621(VarCurr,bitIndex9).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex35)| -v2621(VarCurr,bitIndex9).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex36).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex10)|v2621(VarCurr,bitIndex10).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex36)| -v2621(VarCurr,bitIndex10).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex37).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex11)|v2621(VarCurr,bitIndex11).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex37)| -v2621(VarCurr,bitIndex11).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex38).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex12)|v2621(VarCurr,bitIndex12).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex38)| -v2621(VarCurr,bitIndex12).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex39).
% 94.39/93.80  0 [] -v2620(VarCurr,bitIndex13)|v2621(VarCurr,bitIndex13).
% 94.39/93.80  0 [] v2620(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex39)| -v2621(VarCurr,bitIndex13).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.39/93.80  0 [] -v2621(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.39/93.80  0 [] v2621(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.39/93.80  0 [] -range_15_0(B)| -v2609(VarCurr,B)|v2610(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)| -v2609(VarCurr,B)|v2616(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2609(VarCurr,B)| -v2610(VarCurr,B)| -v2616(VarCurr,B).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.39/93.80  0 [] -v2616(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.39/93.80  0 [] v2616(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.39/93.80  0 [] -range_15_0(B)| -v2610(VarCurr,B)|v2611(VarCurr,B)|v2613(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2610(VarCurr,B)| -v2611(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2610(VarCurr,B)| -v2613(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)| -v2613(VarCurr,B)|v2614(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)| -v2613(VarCurr,B)|v2615(VarCurr,B).
% 94.39/93.80  0 [] -range_15_0(B)|v2613(VarCurr,B)| -v2614(VarCurr,B)| -v2615(VarCurr,B).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2615(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] v2615(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.39/93.80  0 [] -v2614(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex39).
% 94.39/93.80  0 [] v2614(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex39).
% 94.39/93.80  0 [] -v2614(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex38).
% 94.39/93.80  0 [] v2614(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex38).
% 94.39/93.80  0 [] -v2614(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex37).
% 94.39/93.80  0 [] v2614(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex37).
% 94.39/93.80  0 [] -v2614(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex36).
% 94.39/93.80  0 [] v2614(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex36).
% 94.39/93.80  0 [] -v2614(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex35).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex35).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex34).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex34).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex33).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex33).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex32).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex32).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex31).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex31).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex30).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex30).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex29).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex29).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex28).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex28).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex27).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex27).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex26).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex26).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex25).
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex25).
% 94.39/93.81  0 [] -v2614(VarCurr,bitIndex15)|$F.
% 94.39/93.81  0 [] v2614(VarCurr,bitIndex15)| -$F.
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex24).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex0)|v2612(VarCurr,bitIndex0).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex24)| -v2612(VarCurr,bitIndex0).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex25).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex1)|v2612(VarCurr,bitIndex1).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex25)| -v2612(VarCurr,bitIndex1).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex26).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex2)|v2612(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex26)| -v2612(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex27).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex3)|v2612(VarCurr,bitIndex3).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex27)| -v2612(VarCurr,bitIndex3).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex28).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex4)|v2612(VarCurr,bitIndex4).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex28)| -v2612(VarCurr,bitIndex4).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex29).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex5)|v2612(VarCurr,bitIndex5).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex29)| -v2612(VarCurr,bitIndex5).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex30).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex6)|v2612(VarCurr,bitIndex6).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex30)| -v2612(VarCurr,bitIndex6).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex31).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex7)|v2612(VarCurr,bitIndex7).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex31)| -v2612(VarCurr,bitIndex7).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex32).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex8)|v2612(VarCurr,bitIndex8).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex32)| -v2612(VarCurr,bitIndex8).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex33).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex9)|v2612(VarCurr,bitIndex9).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex33)| -v2612(VarCurr,bitIndex9).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex34).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex10)|v2612(VarCurr,bitIndex10).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex34)| -v2612(VarCurr,bitIndex10).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex35).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex11)|v2612(VarCurr,bitIndex11).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex35)| -v2612(VarCurr,bitIndex11).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex36).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex12)|v2612(VarCurr,bitIndex12).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex36)| -v2612(VarCurr,bitIndex12).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex37).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex13)|v2612(VarCurr,bitIndex13).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex37)| -v2612(VarCurr,bitIndex13).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex38).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex14)|v2612(VarCurr,bitIndex14).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex38)| -v2612(VarCurr,bitIndex14).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex39).
% 94.39/93.81  0 [] -v2611(VarCurr,bitIndex15)|v2612(VarCurr,bitIndex15).
% 94.39/93.81  0 [] v2611(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex39)| -v2612(VarCurr,bitIndex15).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.39/93.81  0 [] -v2612(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.39/93.81  0 [] v2612(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.39/93.81  0 [] -range_23_0(B)| -v2560(VarCurr,B)|v2561(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)| -v2560(VarCurr,B)|v2603(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)|v2560(VarCurr,B)| -v2561(VarCurr,B)| -v2603(VarCurr,B).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex0)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex0)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex1)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex1)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex2)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex2)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex3)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex3)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex4)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex4)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex5)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex5)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex6)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex6)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex7)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex7)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex8)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex8)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex9)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex9)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex10)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex10)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex11)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex11)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex12)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex12)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex13)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex13)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex14)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex14)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex15)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex15)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex16)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex16)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex17)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex17)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex18)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex18)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex19)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex19)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex20)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex20)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex21)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex21)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex22)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex22)| -v2507(VarCurr).
% 94.39/93.81  0 [] -v2603(VarCurr,bitIndex23)|v2507(VarCurr).
% 94.39/93.81  0 [] v2603(VarCurr,bitIndex23)| -v2507(VarCurr).
% 94.39/93.81  0 [] -range_23_0(B)| -v2561(VarCurr,B)|v2562(VarCurr,B)|v2582(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)|v2561(VarCurr,B)| -v2562(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)|v2561(VarCurr,B)| -v2582(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)| -v2582(VarCurr,B)|v2583(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)| -v2582(VarCurr,B)|v2602(VarCurr,B).
% 94.39/93.81  0 [] -range_23_0(B)|v2582(VarCurr,B)| -v2583(VarCurr,B)| -v2602(VarCurr,B).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -v2602(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] v2602(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex2).
% 94.39/93.81  0 [] -range_19_0(B)| -v2583(VarCurr,B)|v2584(VarCurr,B).
% 94.39/93.81  0 [] -range_19_0(B)|v2583(VarCurr,B)| -v2584(VarCurr,B).
% 94.39/93.81  0 [] -v2583(VarCurr,bitIndex23)|$F.
% 94.39/93.81  0 [] v2583(VarCurr,bitIndex23)| -$F.
% 94.39/93.81  0 [] -v2583(VarCurr,bitIndex22)|$F.
% 94.39/93.81  0 [] v2583(VarCurr,bitIndex22)| -$F.
% 94.39/93.81  0 [] -v2583(VarCurr,bitIndex21)|$F.
% 94.39/93.81  0 [] v2583(VarCurr,bitIndex21)| -$F.
% 94.39/93.81  0 [] -v2583(VarCurr,bitIndex20)|$F.
% 94.39/93.81  0 [] v2583(VarCurr,bitIndex20)| -$F.
% 94.39/93.81  0 [] -range_19_0(B)| -v2584(VarCurr,B)|v2585(VarCurr,B)|v2593(VarCurr,B).
% 94.39/93.81  0 [] -range_19_0(B)|v2584(VarCurr,B)| -v2585(VarCurr,B).
% 94.39/93.82  0 [] -range_19_0(B)|v2584(VarCurr,B)| -v2593(VarCurr,B).
% 94.39/93.82  0 [] -range_19_0(B)| -v2593(VarCurr,B)|v2594(VarCurr,B).
% 94.39/93.82  0 [] -range_19_0(B)| -v2593(VarCurr,B)|v2601(VarCurr,B).
% 94.39/93.82  0 [] -range_19_0(B)|v2593(VarCurr,B)| -v2594(VarCurr,B)| -v2601(VarCurr,B).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2601(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2601(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -range_17_0(B)| -v2594(VarCurr,B)|v2595(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)|v2594(VarCurr,B)| -v2595(VarCurr,B).
% 94.39/93.82  0 [] -v2594(VarCurr,bitIndex19)|$F.
% 94.39/93.82  0 [] v2594(VarCurr,bitIndex19)| -$F.
% 94.39/93.82  0 [] -v2594(VarCurr,bitIndex18)|$F.
% 94.39/93.82  0 [] v2594(VarCurr,bitIndex18)| -$F.
% 94.39/93.82  0 [] -range_17_0(B)| -v2595(VarCurr,B)|v2596(VarCurr,B)|v2598(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)|v2595(VarCurr,B)| -v2596(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)|v2595(VarCurr,B)| -v2598(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)| -v2598(VarCurr,B)|v2599(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)| -v2598(VarCurr,B)|v2600(VarCurr,B).
% 94.39/93.82  0 [] -range_17_0(B)|v2598(VarCurr,B)| -v2599(VarCurr,B)| -v2600(VarCurr,B).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2600(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2600(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex39).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex39).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex38).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex38).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex37).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex37).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex36).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex36).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex35).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex35).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex34).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex34).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex33).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex33).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex32).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex32).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex31).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex31).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex30).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex30).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex29).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex29).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex28).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex28).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex27).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex27).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex26).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex26).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex25).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex25).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex24).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex24).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex23).
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex23).
% 94.39/93.82  0 [] -v2599(VarCurr,bitIndex17)|$F.
% 94.39/93.82  0 [] v2599(VarCurr,bitIndex17)| -$F.
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex22).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex0)|v2597(VarCurr,bitIndex0).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex22)| -v2597(VarCurr,bitIndex0).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex23).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex1)|v2597(VarCurr,bitIndex1).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex23)| -v2597(VarCurr,bitIndex1).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex24).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex2)|v2597(VarCurr,bitIndex2).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex24)| -v2597(VarCurr,bitIndex2).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex25).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex3)|v2597(VarCurr,bitIndex3).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex25)| -v2597(VarCurr,bitIndex3).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex26).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex4)|v2597(VarCurr,bitIndex4).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex26)| -v2597(VarCurr,bitIndex4).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex27).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex5)|v2597(VarCurr,bitIndex5).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex27)| -v2597(VarCurr,bitIndex5).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex28).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex6)|v2597(VarCurr,bitIndex6).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex28)| -v2597(VarCurr,bitIndex6).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex29).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex7)|v2597(VarCurr,bitIndex7).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex29)| -v2597(VarCurr,bitIndex7).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex30).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex8)|v2597(VarCurr,bitIndex8).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex30)| -v2597(VarCurr,bitIndex8).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex31).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex9)|v2597(VarCurr,bitIndex9).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex31)| -v2597(VarCurr,bitIndex9).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex32).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex10)|v2597(VarCurr,bitIndex10).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex32)| -v2597(VarCurr,bitIndex10).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex33).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex11)|v2597(VarCurr,bitIndex11).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex33)| -v2597(VarCurr,bitIndex11).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex34).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex12)|v2597(VarCurr,bitIndex12).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex34)| -v2597(VarCurr,bitIndex12).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex35).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex13)|v2597(VarCurr,bitIndex13).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex35)| -v2597(VarCurr,bitIndex13).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex36).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex14)|v2597(VarCurr,bitIndex14).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex36)| -v2597(VarCurr,bitIndex14).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex37).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex15)|v2597(VarCurr,bitIndex15).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex37)| -v2597(VarCurr,bitIndex15).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex38).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex16)|v2597(VarCurr,bitIndex16).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex38)| -v2597(VarCurr,bitIndex16).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex39).
% 94.39/93.82  0 [] -v2596(VarCurr,bitIndex17)|v2597(VarCurr,bitIndex17).
% 94.39/93.82  0 [] v2596(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex39)| -v2597(VarCurr,bitIndex17).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.39/93.82  0 [] -v2597(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.39/93.82  0 [] v2597(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2597(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.39/93.83  0 [] v2597(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2597(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.39/93.83  0 [] v2597(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2597(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.39/93.83  0 [] v2597(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2597(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.39/93.83  0 [] v2597(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2597(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.39/93.83  0 [] v2597(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.39/93.83  0 [] -range_19_0(B)| -v2585(VarCurr,B)|v2586(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)| -v2585(VarCurr,B)|v2592(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)|v2585(VarCurr,B)| -v2586(VarCurr,B)| -v2592(VarCurr,B).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.39/93.83  0 [] -v2592(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.39/93.83  0 [] v2592(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.39/93.83  0 [] -range_19_0(B)| -v2586(VarCurr,B)|v2587(VarCurr,B)|v2589(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)|v2586(VarCurr,B)| -v2587(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)|v2586(VarCurr,B)| -v2589(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)| -v2589(VarCurr,B)|v2590(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)| -v2589(VarCurr,B)|v2591(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)|v2589(VarCurr,B)| -v2590(VarCurr,B)| -v2591(VarCurr,B).
% 94.39/93.83  0 [] -range_19_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex0!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex1!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex2!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex3!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex4!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex5!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex6!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex7!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex8!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex9!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex10!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex11!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex12!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex13!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex14!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex15!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex16!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex17!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex18!=B.
% 94.39/93.83  0 [] range_19_0(B)|bitIndex19!=B.
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2591(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2591(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex39).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex39).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex38).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex38).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex37).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex37).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex36).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex36).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex35).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex35).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex34).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex34).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex33).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex33).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex32).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex32).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex31).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex31).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex30).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex30).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex29).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex29).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex28).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex28).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex27).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex27).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex26).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex26).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex25).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex25).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex24).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex24).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex23).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex23).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex22).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex22).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex21).
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex21).
% 94.39/93.83  0 [] -v2590(VarCurr,bitIndex19)|$F.
% 94.39/93.83  0 [] v2590(VarCurr,bitIndex19)| -$F.
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex20).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex0)|v2588(VarCurr,bitIndex0).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex20)| -v2588(VarCurr,bitIndex0).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex21).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex1)|v2588(VarCurr,bitIndex1).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex21)| -v2588(VarCurr,bitIndex1).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex22).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex2)|v2588(VarCurr,bitIndex2).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex22)| -v2588(VarCurr,bitIndex2).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex23).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex3)|v2588(VarCurr,bitIndex3).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex23)| -v2588(VarCurr,bitIndex3).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex24).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex4)|v2588(VarCurr,bitIndex4).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex24)| -v2588(VarCurr,bitIndex4).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex25).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex5)|v2588(VarCurr,bitIndex5).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex25)| -v2588(VarCurr,bitIndex5).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex26).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex6)|v2588(VarCurr,bitIndex6).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex26)| -v2588(VarCurr,bitIndex6).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex27).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex7)|v2588(VarCurr,bitIndex7).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex27)| -v2588(VarCurr,bitIndex7).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex28).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex8)|v2588(VarCurr,bitIndex8).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex28)| -v2588(VarCurr,bitIndex8).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex29).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex9)|v2588(VarCurr,bitIndex9).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex29)| -v2588(VarCurr,bitIndex9).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex30).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex10)|v2588(VarCurr,bitIndex10).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex30)| -v2588(VarCurr,bitIndex10).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex31).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex11)|v2588(VarCurr,bitIndex11).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex31)| -v2588(VarCurr,bitIndex11).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex32).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex12)|v2588(VarCurr,bitIndex12).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex32)| -v2588(VarCurr,bitIndex12).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex33).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex13)|v2588(VarCurr,bitIndex13).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex33)| -v2588(VarCurr,bitIndex13).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex34).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex14)|v2588(VarCurr,bitIndex14).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex34)| -v2588(VarCurr,bitIndex14).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex35).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex15)|v2588(VarCurr,bitIndex15).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex35)| -v2588(VarCurr,bitIndex15).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex36).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex16)|v2588(VarCurr,bitIndex16).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex36)| -v2588(VarCurr,bitIndex16).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex37).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex17)|v2588(VarCurr,bitIndex17).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex37)| -v2588(VarCurr,bitIndex17).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex38).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex18)|v2588(VarCurr,bitIndex18).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex38)| -v2588(VarCurr,bitIndex18).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex39).
% 94.39/93.83  0 [] -v2587(VarCurr,bitIndex19)|v2588(VarCurr,bitIndex19).
% 94.39/93.83  0 [] v2587(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex39)| -v2588(VarCurr,bitIndex19).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.39/93.83  0 [] -v2588(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.39/93.83  0 [] v2588(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.39/93.83  0 [] -range_23_0(B)| -v2562(VarCurr,B)|v2563(VarCurr,B).
% 94.39/93.83  0 [] -range_23_0(B)| -v2562(VarCurr,B)|v2581(VarCurr,B).
% 94.39/93.83  0 [] -range_23_0(B)|v2562(VarCurr,B)| -v2563(VarCurr,B)| -v2581(VarCurr,B).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex0)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex0)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex1)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex1)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex2)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex2)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex3)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex3)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex4)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex4)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex5)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex5)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex6)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex6)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex7)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex7)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex8)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex8)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex9)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex9)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex10)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex10)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex11)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex11)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex12)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex12)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex13)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex13)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex14)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex14)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex15)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex15)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex16)|v2484(VarCurr).
% 94.39/93.83  0 [] v2581(VarCurr,bitIndex16)| -v2484(VarCurr).
% 94.39/93.83  0 [] -v2581(VarCurr,bitIndex17)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex17)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex18)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex18)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex19)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex19)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex20)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex20)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex21)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex21)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex22)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex22)| -v2484(VarCurr).
% 94.48/93.84  0 [] -v2581(VarCurr,bitIndex23)|v2484(VarCurr).
% 94.48/93.84  0 [] v2581(VarCurr,bitIndex23)| -v2484(VarCurr).
% 94.48/93.84  0 [] -range_23_0(B)| -v2563(VarCurr,B)|v2564(VarCurr,B)|v2572(VarCurr,B).
% 94.48/93.84  0 [] -range_23_0(B)|v2563(VarCurr,B)| -v2564(VarCurr,B).
% 94.48/93.84  0 [] -range_23_0(B)|v2563(VarCurr,B)| -v2572(VarCurr,B).
% 94.48/93.84  0 [] -range_23_0(B)| -v2572(VarCurr,B)|v2573(VarCurr,B).
% 94.48/93.84  0 [] -range_23_0(B)| -v2572(VarCurr,B)|v2580(VarCurr,B).
% 94.48/93.84  0 [] -range_23_0(B)|v2572(VarCurr,B)| -v2573(VarCurr,B)| -v2580(VarCurr,B).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2580(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2580(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -range_21_0(B)| -v2573(VarCurr,B)|v2574(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)|v2573(VarCurr,B)| -v2574(VarCurr,B).
% 94.48/93.84  0 [] -v2573(VarCurr,bitIndex23)|$F.
% 94.48/93.84  0 [] v2573(VarCurr,bitIndex23)| -$F.
% 94.48/93.84  0 [] -v2573(VarCurr,bitIndex22)|$F.
% 94.48/93.84  0 [] v2573(VarCurr,bitIndex22)| -$F.
% 94.48/93.84  0 [] -range_21_0(B)| -v2574(VarCurr,B)|v2575(VarCurr,B)|v2577(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)|v2574(VarCurr,B)| -v2575(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)|v2574(VarCurr,B)| -v2577(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)| -v2577(VarCurr,B)|v2578(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)| -v2577(VarCurr,B)|v2579(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)|v2577(VarCurr,B)| -v2578(VarCurr,B)| -v2579(VarCurr,B).
% 94.48/93.84  0 [] -range_21_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex0!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex1!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex2!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex3!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex4!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex5!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex6!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex7!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex8!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex9!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex10!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex11!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex12!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex13!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex14!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex15!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex16!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex17!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex18!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex19!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex20!=B.
% 94.48/93.84  0 [] range_21_0(B)|bitIndex21!=B.
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2579(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2579(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex39).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex39).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex38).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex38).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex37).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex37).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex36).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex36).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex35).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex35).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex34).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex34).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex33).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex33).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex32).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex32).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex31).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex31).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex30).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex30).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex29).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex29).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex28).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex28).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex27).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex27).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex26).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex26).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex25).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex25).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex24).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex24).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex23).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex23).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex22).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex22).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex21).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex21).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex20).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex20).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex19).
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex19).
% 94.48/93.84  0 [] -v2578(VarCurr,bitIndex21)|$F.
% 94.48/93.84  0 [] v2578(VarCurr,bitIndex21)| -$F.
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex18).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex0)|v2576(VarCurr,bitIndex0).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex18)| -v2576(VarCurr,bitIndex0).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex19).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex1)|v2576(VarCurr,bitIndex1).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex19)| -v2576(VarCurr,bitIndex1).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex20).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex2)|v2576(VarCurr,bitIndex2).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex20)| -v2576(VarCurr,bitIndex2).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex21).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex3)|v2576(VarCurr,bitIndex3).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex21)| -v2576(VarCurr,bitIndex3).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex22).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex4)|v2576(VarCurr,bitIndex4).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex22)| -v2576(VarCurr,bitIndex4).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex23).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex5)|v2576(VarCurr,bitIndex5).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex23)| -v2576(VarCurr,bitIndex5).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex24).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex6)|v2576(VarCurr,bitIndex6).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex24)| -v2576(VarCurr,bitIndex6).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex25).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex7)|v2576(VarCurr,bitIndex7).
% 94.48/93.84  0 [] v2575(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex25)| -v2576(VarCurr,bitIndex7).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex26).
% 94.48/93.84  0 [] -v2575(VarCurr,bitIndex8)|v2576(VarCurr,bitIndex8).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex26)| -v2576(VarCurr,bitIndex8).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex27).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex9)|v2576(VarCurr,bitIndex9).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex27)| -v2576(VarCurr,bitIndex9).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex28).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex10)|v2576(VarCurr,bitIndex10).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex28)| -v2576(VarCurr,bitIndex10).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex29).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex11)|v2576(VarCurr,bitIndex11).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex29)| -v2576(VarCurr,bitIndex11).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex30).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex12)|v2576(VarCurr,bitIndex12).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex30)| -v2576(VarCurr,bitIndex12).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex31).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex13)|v2576(VarCurr,bitIndex13).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex31)| -v2576(VarCurr,bitIndex13).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex32).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex14)|v2576(VarCurr,bitIndex14).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex32)| -v2576(VarCurr,bitIndex14).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex33).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex15)|v2576(VarCurr,bitIndex15).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex33)| -v2576(VarCurr,bitIndex15).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex34).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex16)|v2576(VarCurr,bitIndex16).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex34)| -v2576(VarCurr,bitIndex16).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex35).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex17)|v2576(VarCurr,bitIndex17).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex35)| -v2576(VarCurr,bitIndex17).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex36).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex18)|v2576(VarCurr,bitIndex18).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex36)| -v2576(VarCurr,bitIndex18).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex37).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex19)|v2576(VarCurr,bitIndex19).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex37)| -v2576(VarCurr,bitIndex19).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex38).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex20)|v2576(VarCurr,bitIndex20).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex38)| -v2576(VarCurr,bitIndex20).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex39).
% 94.48/93.85  0 [] -v2575(VarCurr,bitIndex21)|v2576(VarCurr,bitIndex21).
% 94.48/93.85  0 [] v2575(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex39)| -v2576(VarCurr,bitIndex21).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.48/93.85  0 [] -v2576(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.48/93.85  0 [] v2576(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.48/93.85  0 [] -range_23_0(B)| -v2564(VarCurr,B)|v2565(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)| -v2564(VarCurr,B)|v2571(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)|v2564(VarCurr,B)| -v2565(VarCurr,B)| -v2571(VarCurr,B).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex20)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex20)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex21)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex21)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex22)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex22)| -v2473(VarCurr).
% 94.48/93.85  0 [] -v2571(VarCurr,bitIndex23)|v2473(VarCurr).
% 94.48/93.85  0 [] v2571(VarCurr,bitIndex23)| -v2473(VarCurr).
% 94.48/93.85  0 [] -range_23_0(B)| -v2565(VarCurr,B)|v2566(VarCurr,B)|v2568(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)|v2565(VarCurr,B)| -v2566(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)|v2565(VarCurr,B)| -v2568(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)| -v2568(VarCurr,B)|v2569(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)| -v2568(VarCurr,B)|v2570(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)|v2568(VarCurr,B)| -v2569(VarCurr,B)| -v2570(VarCurr,B).
% 94.48/93.85  0 [] -range_23_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex0!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex1!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex2!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex3!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex4!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex5!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex6!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex7!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex8!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex9!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex10!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex11!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex12!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex13!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex14!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex15!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex16!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex17!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex18!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex19!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex20!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex21!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex22!=B.
% 94.48/93.85  0 [] range_23_0(B)|bitIndex23!=B.
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2570(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] v2570(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.48/93.85  0 [] -v2569(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex39).
% 94.48/93.85  0 [] v2569(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex39).
% 94.48/93.85  0 [] -v2569(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex38).
% 94.48/93.85  0 [] v2569(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex38).
% 94.48/93.85  0 [] -v2569(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex37).
% 94.48/93.85  0 [] v2569(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex37).
% 94.48/93.85  0 [] -v2569(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex36).
% 94.48/93.85  0 [] v2569(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex36).
% 94.48/93.85  0 [] -v2569(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex35).
% 94.48/93.85  0 [] v2569(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex35).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex34).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex34).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex33).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex33).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex32).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex32).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex31).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex31).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex30).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex30).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex29).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex29).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex28).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex28).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex27).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex27).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex26).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex26).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex25).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex25).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex24).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex24).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex23).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex23).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex22).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex22).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex21).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex21).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex20).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex20).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex19).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex19).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex18).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex18).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex17).
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex17).
% 94.48/93.86  0 [] -v2569(VarCurr,bitIndex23)|$F.
% 94.48/93.86  0 [] v2569(VarCurr,bitIndex23)| -$F.
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex16).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex0)|v2567(VarCurr,bitIndex0).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex16)| -v2567(VarCurr,bitIndex0).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex17).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex1)|v2567(VarCurr,bitIndex1).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex17)| -v2567(VarCurr,bitIndex1).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex18).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex2)|v2567(VarCurr,bitIndex2).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex18)| -v2567(VarCurr,bitIndex2).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex19).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex3)|v2567(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex19)| -v2567(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex20).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex4)|v2567(VarCurr,bitIndex4).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex20)| -v2567(VarCurr,bitIndex4).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex21).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex5)|v2567(VarCurr,bitIndex5).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex21)| -v2567(VarCurr,bitIndex5).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex22).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex6)|v2567(VarCurr,bitIndex6).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex22)| -v2567(VarCurr,bitIndex6).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex23).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex7)|v2567(VarCurr,bitIndex7).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex23)| -v2567(VarCurr,bitIndex7).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex24).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex8)|v2567(VarCurr,bitIndex8).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex24)| -v2567(VarCurr,bitIndex8).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex25).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex9)|v2567(VarCurr,bitIndex9).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex25)| -v2567(VarCurr,bitIndex9).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex26).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex10)|v2567(VarCurr,bitIndex10).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex26)| -v2567(VarCurr,bitIndex10).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex27).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex11)|v2567(VarCurr,bitIndex11).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex27)| -v2567(VarCurr,bitIndex11).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex28).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex12)|v2567(VarCurr,bitIndex12).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex28)| -v2567(VarCurr,bitIndex12).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex29).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex13)|v2567(VarCurr,bitIndex13).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex29)| -v2567(VarCurr,bitIndex13).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex30).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex14)|v2567(VarCurr,bitIndex14).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex30)| -v2567(VarCurr,bitIndex14).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex31).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex15)|v2567(VarCurr,bitIndex15).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex31)| -v2567(VarCurr,bitIndex15).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex32).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex16)|v2567(VarCurr,bitIndex16).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex32)| -v2567(VarCurr,bitIndex16).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex33).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex17)|v2567(VarCurr,bitIndex17).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex33)| -v2567(VarCurr,bitIndex17).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex34).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex18)|v2567(VarCurr,bitIndex18).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex34)| -v2567(VarCurr,bitIndex18).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex35).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex19)|v2567(VarCurr,bitIndex19).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex35)| -v2567(VarCurr,bitIndex19).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex36).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex20)|v2567(VarCurr,bitIndex20).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex36)| -v2567(VarCurr,bitIndex20).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex37).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex21)|v2567(VarCurr,bitIndex21).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex37)| -v2567(VarCurr,bitIndex21).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex38).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex22)|v2567(VarCurr,bitIndex22).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex38)| -v2567(VarCurr,bitIndex22).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex39).
% 94.48/93.86  0 [] -v2566(VarCurr,bitIndex23)|v2567(VarCurr,bitIndex23).
% 94.48/93.86  0 [] v2566(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex39)| -v2567(VarCurr,bitIndex23).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.48/93.86  0 [] -v2567(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.48/93.86  0 [] v2567(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.48/93.86  0 [] -range_39_0(B)| -v2456(VarCurr,B)|v2457(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)| -v2456(VarCurr,B)|v2554(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)|v2456(VarCurr,B)| -v2457(VarCurr,B)| -v2554(VarCurr,B).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex0)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex0)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex1)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex1)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex2)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex2)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex3)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex3)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex4)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex4)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex5)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex5)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex6)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex6)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex7)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex7)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex8)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex8)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex9)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex9)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex10)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex10)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex11)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex11)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex12)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex12)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex13)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex13)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex14)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex14)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex15)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex15)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex16)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex16)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex17)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex17)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex18)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex18)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex19)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex19)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex20)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex20)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex21)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex21)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex22)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex22)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex23)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex23)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex24)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex24)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex25)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex25)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex26)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex26)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex27)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex27)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex28)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex28)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex29)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex29)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex30)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex30)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex31)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex31)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex32)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex32)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex33)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex33)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex34)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex34)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex35)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex35)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex36)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex36)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex37)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex37)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex38)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex38)| -v2555(VarCurr).
% 94.48/93.86  0 [] -v2554(VarCurr,bitIndex39)|v2555(VarCurr).
% 94.48/93.86  0 [] v2554(VarCurr,bitIndex39)| -v2555(VarCurr).
% 94.48/93.86  0 [] v2555(VarCurr)|v2453(VarCurr,bitIndex4).
% 94.48/93.86  0 [] -v2555(VarCurr)| -v2453(VarCurr,bitIndex4).
% 94.48/93.86  0 [] -range_39_0(B)| -v2457(VarCurr,B)|v2458(VarCurr,B)|v2508(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)|v2457(VarCurr,B)| -v2458(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)|v2457(VarCurr,B)| -v2508(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)| -v2508(VarCurr,B)|v2509(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)| -v2508(VarCurr,B)|v2553(VarCurr,B).
% 94.48/93.86  0 [] -range_39_0(B)|v2508(VarCurr,B)| -v2509(VarCurr,B)| -v2553(VarCurr,B).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] v2553(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex3).
% 94.48/93.86  0 [] -v2553(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex38)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex38)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -v2553(VarCurr,bitIndex39)|v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] v2553(VarCurr,bitIndex39)| -v2453(VarCurr,bitIndex3).
% 94.48/93.87  0 [] -range_31_0(B)| -v2509(VarCurr,B)|v2511(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)|v2509(VarCurr,B)| -v2511(VarCurr,B).
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex39)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex39)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex38)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex38)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex37)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex37)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex36)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex36)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex35)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex35)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex34)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex34)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex33)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex33)| -$F.
% 94.48/93.87  0 [] -v2509(VarCurr,bitIndex32)|$F.
% 94.48/93.87  0 [] v2509(VarCurr,bitIndex32)| -$F.
% 94.48/93.87  0 [] -b00000000(bitIndex7).
% 94.48/93.87  0 [] -b00000000(bitIndex6).
% 94.48/93.87  0 [] -b00000000(bitIndex5).
% 94.48/93.87  0 [] -b00000000(bitIndex4).
% 94.48/93.87  0 [] -b00000000(bitIndex3).
% 94.48/93.87  0 [] -b00000000(bitIndex2).
% 94.48/93.87  0 [] -b00000000(bitIndex1).
% 94.48/93.87  0 [] -b00000000(bitIndex0).
% 94.48/93.87  0 [] -range_31_0(B)| -v2511(VarCurr,B)|v2512(VarCurr,B)|v2532(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)|v2511(VarCurr,B)| -v2512(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)|v2511(VarCurr,B)| -v2532(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)| -v2532(VarCurr,B)|v2533(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)| -v2532(VarCurr,B)|v2552(VarCurr,B).
% 94.48/93.87  0 [] -range_31_0(B)|v2532(VarCurr,B)| -v2533(VarCurr,B)| -v2552(VarCurr,B).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -v2552(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] v2552(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex2).
% 94.48/93.87  0 [] -range_27_0(B)| -v2533(VarCurr,B)|v2534(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)|v2533(VarCurr,B)| -v2534(VarCurr,B).
% 94.48/93.87  0 [] -v2533(VarCurr,bitIndex31)|$F.
% 94.48/93.87  0 [] v2533(VarCurr,bitIndex31)| -$F.
% 94.48/93.87  0 [] -v2533(VarCurr,bitIndex30)|$F.
% 94.48/93.87  0 [] v2533(VarCurr,bitIndex30)| -$F.
% 94.48/93.87  0 [] -v2533(VarCurr,bitIndex29)|$F.
% 94.48/93.87  0 [] v2533(VarCurr,bitIndex29)| -$F.
% 94.48/93.87  0 [] -v2533(VarCurr,bitIndex28)|$F.
% 94.48/93.87  0 [] v2533(VarCurr,bitIndex28)| -$F.
% 94.48/93.87  0 [] -range_27_0(B)| -v2534(VarCurr,B)|v2535(VarCurr,B)|v2543(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)|v2534(VarCurr,B)| -v2535(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)|v2534(VarCurr,B)| -v2543(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)| -v2543(VarCurr,B)|v2544(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)| -v2543(VarCurr,B)|v2551(VarCurr,B).
% 94.48/93.87  0 [] -range_27_0(B)|v2543(VarCurr,B)| -v2544(VarCurr,B)| -v2551(VarCurr,B).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -v2551(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] v2551(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex1).
% 94.48/93.87  0 [] -range_25_0(B)| -v2544(VarCurr,B)|v2545(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)|v2544(VarCurr,B)| -v2545(VarCurr,B).
% 94.48/93.87  0 [] -v2544(VarCurr,bitIndex27)|$F.
% 94.48/93.87  0 [] v2544(VarCurr,bitIndex27)| -$F.
% 94.48/93.87  0 [] -v2544(VarCurr,bitIndex26)|$F.
% 94.48/93.87  0 [] v2544(VarCurr,bitIndex26)| -$F.
% 94.48/93.87  0 [] -range_25_0(B)| -v2545(VarCurr,B)|v2546(VarCurr,B)|v2548(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)|v2545(VarCurr,B)| -v2546(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)|v2545(VarCurr,B)| -v2548(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)| -v2548(VarCurr,B)|v2549(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)| -v2548(VarCurr,B)|v2550(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)|v2548(VarCurr,B)| -v2549(VarCurr,B)| -v2550(VarCurr,B).
% 94.48/93.87  0 [] -range_25_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B.
% 94.48/93.87  0 [] range_25_0(B)|bitIndex0!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex1!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex2!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex3!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex4!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex5!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex6!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex7!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex8!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex9!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex10!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex11!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex12!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex13!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex14!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex15!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex16!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex17!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex18!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex19!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex20!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex21!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex22!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex23!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex24!=B.
% 94.48/93.88  0 [] range_25_0(B)|bitIndex25!=B.
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2550(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2550(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex39).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex39).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex38).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex38).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex37).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex37).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex36).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex36).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex35).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex35).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex34).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex34).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex33).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex33).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex32).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex32).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex31).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex31).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex30).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex30).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex29).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex29).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex28).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex28).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex27).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex27).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex26).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex26).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex25).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex25).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex24).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex24).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex23).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex23).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex22).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex22).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex21).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex21).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex20).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex20).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex19).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex19).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex18).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex18).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex17).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex17).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex16).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex16).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex15).
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex15).
% 94.48/93.88  0 [] -v2549(VarCurr,bitIndex25)|$F.
% 94.48/93.88  0 [] v2549(VarCurr,bitIndex25)| -$F.
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex14).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex0)|v2547(VarCurr,bitIndex0).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex14)| -v2547(VarCurr,bitIndex0).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex15).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex1)|v2547(VarCurr,bitIndex1).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex15)| -v2547(VarCurr,bitIndex1).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex16).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex2)|v2547(VarCurr,bitIndex2).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex16)| -v2547(VarCurr,bitIndex2).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex17).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex3)|v2547(VarCurr,bitIndex3).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex17)| -v2547(VarCurr,bitIndex3).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex18).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex4)|v2547(VarCurr,bitIndex4).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex18)| -v2547(VarCurr,bitIndex4).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex19).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex5)|v2547(VarCurr,bitIndex5).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex19)| -v2547(VarCurr,bitIndex5).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex20).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex6)|v2547(VarCurr,bitIndex6).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex20)| -v2547(VarCurr,bitIndex6).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex21).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex7)|v2547(VarCurr,bitIndex7).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex21)| -v2547(VarCurr,bitIndex7).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex22).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex8)|v2547(VarCurr,bitIndex8).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex22)| -v2547(VarCurr,bitIndex8).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex23).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex9)|v2547(VarCurr,bitIndex9).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex23)| -v2547(VarCurr,bitIndex9).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex24).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex10)|v2547(VarCurr,bitIndex10).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex24)| -v2547(VarCurr,bitIndex10).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex25).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex11)|v2547(VarCurr,bitIndex11).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex25)| -v2547(VarCurr,bitIndex11).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex26).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex12)|v2547(VarCurr,bitIndex12).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex26)| -v2547(VarCurr,bitIndex12).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex27).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex13)|v2547(VarCurr,bitIndex13).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex27)| -v2547(VarCurr,bitIndex13).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex28).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex14)|v2547(VarCurr,bitIndex14).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex28)| -v2547(VarCurr,bitIndex14).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex29).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex15)|v2547(VarCurr,bitIndex15).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex29)| -v2547(VarCurr,bitIndex15).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex30).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex16)|v2547(VarCurr,bitIndex16).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex30)| -v2547(VarCurr,bitIndex16).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex31).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex17)|v2547(VarCurr,bitIndex17).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex31)| -v2547(VarCurr,bitIndex17).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex32).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex18)|v2547(VarCurr,bitIndex18).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex32)| -v2547(VarCurr,bitIndex18).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex33).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex19)|v2547(VarCurr,bitIndex19).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex33)| -v2547(VarCurr,bitIndex19).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex34).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex20)|v2547(VarCurr,bitIndex20).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex34)| -v2547(VarCurr,bitIndex20).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex35).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex21)|v2547(VarCurr,bitIndex21).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex35)| -v2547(VarCurr,bitIndex21).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex36).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex22)|v2547(VarCurr,bitIndex22).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex36)| -v2547(VarCurr,bitIndex22).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex37).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex23)|v2547(VarCurr,bitIndex23).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex37)| -v2547(VarCurr,bitIndex23).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex38).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex24)|v2547(VarCurr,bitIndex24).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex38)| -v2547(VarCurr,bitIndex24).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex39).
% 94.48/93.88  0 [] -v2546(VarCurr,bitIndex25)|v2547(VarCurr,bitIndex25).
% 94.48/93.88  0 [] v2546(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex39)| -v2547(VarCurr,bitIndex25).
% 94.48/93.88  0 [] -v2547(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.48/93.88  0 [] v2547(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.53/93.89  0 [] -v2547(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.53/93.89  0 [] v2547(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.53/93.89  0 [] -range_27_0(B)| -v2535(VarCurr,B)|v2536(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)| -v2535(VarCurr,B)|v2542(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)|v2535(VarCurr,B)| -v2536(VarCurr,B)| -v2542(VarCurr,B).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex20)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex20)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex21)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex21)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex22)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex22)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex23)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex23)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex24)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex24)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex25)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex25)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex26)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex26)| -v2473(VarCurr).
% 94.53/93.89  0 [] -v2542(VarCurr,bitIndex27)|v2473(VarCurr).
% 94.53/93.89  0 [] v2542(VarCurr,bitIndex27)| -v2473(VarCurr).
% 94.53/93.89  0 [] -range_27_0(B)| -v2536(VarCurr,B)|v2537(VarCurr,B)|v2539(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)|v2536(VarCurr,B)| -v2537(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)|v2536(VarCurr,B)| -v2539(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)| -v2539(VarCurr,B)|v2540(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)| -v2539(VarCurr,B)|v2541(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)|v2539(VarCurr,B)| -v2540(VarCurr,B)| -v2541(VarCurr,B).
% 94.53/93.89  0 [] -range_27_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex0!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex1!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex2!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex3!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex4!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex5!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex6!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex7!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex8!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex9!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex10!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex11!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex12!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex13!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex14!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex15!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex16!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex17!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex18!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex19!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex20!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex21!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex22!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex23!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex24!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex25!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex26!=B.
% 94.53/93.89  0 [] range_27_0(B)|bitIndex27!=B.
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2541(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2541(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex39).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex39).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex38).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex38).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex37).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex37).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex36).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex36).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex35).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex35).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex34).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex34).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex33).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex33).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex32).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex32).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex31).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex31).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex30).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex30).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex29).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex29).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex28).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex28).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex27).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex27).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex26).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex26).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex25).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex25).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex24).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex24).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex23).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex23).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex22).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex22).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex21).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex21).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex20).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex20).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex19).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex19).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex18).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex18).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex17).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex17).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex16).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex16).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex15).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex15).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex14).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex14).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex13).
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex13).
% 94.53/93.89  0 [] -v2540(VarCurr,bitIndex27)|$F.
% 94.53/93.89  0 [] v2540(VarCurr,bitIndex27)| -$F.
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex12).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex0)|v2538(VarCurr,bitIndex0).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex12)| -v2538(VarCurr,bitIndex0).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex13).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex1)|v2538(VarCurr,bitIndex1).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex13)| -v2538(VarCurr,bitIndex1).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex14).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex2)|v2538(VarCurr,bitIndex2).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex14)| -v2538(VarCurr,bitIndex2).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex15).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex3)|v2538(VarCurr,bitIndex3).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex15)| -v2538(VarCurr,bitIndex3).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex16).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex4)|v2538(VarCurr,bitIndex4).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex16)| -v2538(VarCurr,bitIndex4).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex17).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex5)|v2538(VarCurr,bitIndex5).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex17)| -v2538(VarCurr,bitIndex5).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex18).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex6)|v2538(VarCurr,bitIndex6).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex18)| -v2538(VarCurr,bitIndex6).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex19).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex7)|v2538(VarCurr,bitIndex7).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex19)| -v2538(VarCurr,bitIndex7).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex20).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex8)|v2538(VarCurr,bitIndex8).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex20)| -v2538(VarCurr,bitIndex8).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex21).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex9)|v2538(VarCurr,bitIndex9).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex21)| -v2538(VarCurr,bitIndex9).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex22).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex10)|v2538(VarCurr,bitIndex10).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex22)| -v2538(VarCurr,bitIndex10).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex23).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex11)|v2538(VarCurr,bitIndex11).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex23)| -v2538(VarCurr,bitIndex11).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex24).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex12)|v2538(VarCurr,bitIndex12).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex24)| -v2538(VarCurr,bitIndex12).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex25).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex13)|v2538(VarCurr,bitIndex13).
% 94.53/93.89  0 [] v2537(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex25)| -v2538(VarCurr,bitIndex13).
% 94.53/93.89  0 [] -v2537(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex26).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex14)|v2538(VarCurr,bitIndex14).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex26)| -v2538(VarCurr,bitIndex14).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex27).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex15)|v2538(VarCurr,bitIndex15).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex27)| -v2538(VarCurr,bitIndex15).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex28).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex16)|v2538(VarCurr,bitIndex16).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex28)| -v2538(VarCurr,bitIndex16).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex29).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex17)|v2538(VarCurr,bitIndex17).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex29)| -v2538(VarCurr,bitIndex17).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex30).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex18)|v2538(VarCurr,bitIndex18).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex30)| -v2538(VarCurr,bitIndex18).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex31).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex19)|v2538(VarCurr,bitIndex19).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex31)| -v2538(VarCurr,bitIndex19).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex32).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex20)|v2538(VarCurr,bitIndex20).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex32)| -v2538(VarCurr,bitIndex20).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex33).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex21)|v2538(VarCurr,bitIndex21).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex33)| -v2538(VarCurr,bitIndex21).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex34).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex22)|v2538(VarCurr,bitIndex22).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex34)| -v2538(VarCurr,bitIndex22).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex35).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex23)|v2538(VarCurr,bitIndex23).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex35)| -v2538(VarCurr,bitIndex23).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex36).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex24)|v2538(VarCurr,bitIndex24).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex36)| -v2538(VarCurr,bitIndex24).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex37).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex25)|v2538(VarCurr,bitIndex25).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex37)| -v2538(VarCurr,bitIndex25).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex38).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex26)|v2538(VarCurr,bitIndex26).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex38)| -v2538(VarCurr,bitIndex26).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex39).
% 94.53/93.90  0 [] -v2537(VarCurr,bitIndex27)|v2538(VarCurr,bitIndex27).
% 94.53/93.90  0 [] v2537(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex39)| -v2538(VarCurr,bitIndex27).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.53/93.90  0 [] -v2538(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.53/93.90  0 [] v2538(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.53/93.90  0 [] -range_31_0(B)| -v2512(VarCurr,B)|v2513(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)| -v2512(VarCurr,B)|v2531(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)|v2512(VarCurr,B)| -v2513(VarCurr,B)| -v2531(VarCurr,B).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex0)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex0)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex1)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex1)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex2)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex2)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex3)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex3)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex4)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex4)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex5)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex5)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex6)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex6)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex7)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex7)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex8)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex8)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex9)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex9)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex10)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex10)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex11)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex11)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex12)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex12)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex13)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex13)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex14)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex14)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex15)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex15)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex16)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex16)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex17)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex17)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex18)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex18)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex19)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex19)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex20)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex20)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex21)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex21)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex22)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex22)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex23)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex23)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex24)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex24)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex25)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex25)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex26)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex26)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex27)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex27)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex28)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex28)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex29)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex29)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex30)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex30)| -v2484(VarCurr).
% 94.53/93.90  0 [] -v2531(VarCurr,bitIndex31)|v2484(VarCurr).
% 94.53/93.90  0 [] v2531(VarCurr,bitIndex31)| -v2484(VarCurr).
% 94.53/93.90  0 [] -range_31_0(B)| -v2513(VarCurr,B)|v2514(VarCurr,B)|v2522(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)|v2513(VarCurr,B)| -v2514(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)|v2513(VarCurr,B)| -v2522(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)| -v2522(VarCurr,B)|v2523(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)| -v2522(VarCurr,B)|v2530(VarCurr,B).
% 94.53/93.90  0 [] -range_31_0(B)|v2522(VarCurr,B)| -v2523(VarCurr,B)| -v2530(VarCurr,B).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] -v2530(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex1).
% 94.53/93.90  0 [] v2530(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2530(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2530(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2530(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2530(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2530(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2530(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2530(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2530(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2530(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2530(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -range_29_0(B)| -v2523(VarCurr,B)|v2524(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)|v2523(VarCurr,B)| -v2524(VarCurr,B).
% 94.53/93.91  0 [] -v2523(VarCurr,bitIndex31)|$F.
% 94.53/93.91  0 [] v2523(VarCurr,bitIndex31)| -$F.
% 94.53/93.91  0 [] -v2523(VarCurr,bitIndex30)|$F.
% 94.53/93.91  0 [] v2523(VarCurr,bitIndex30)| -$F.
% 94.53/93.91  0 [] -range_29_0(B)| -v2524(VarCurr,B)|v2525(VarCurr,B)|v2527(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)|v2524(VarCurr,B)| -v2525(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)|v2524(VarCurr,B)| -v2527(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)| -v2527(VarCurr,B)|v2528(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)| -v2527(VarCurr,B)|v2529(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)|v2527(VarCurr,B)| -v2528(VarCurr,B)| -v2529(VarCurr,B).
% 94.53/93.91  0 [] -range_29_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex0!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex1!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex2!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex3!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex4!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex5!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex6!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex7!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex8!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex9!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex10!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex11!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex12!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex13!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex14!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex15!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex16!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex17!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex18!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex19!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex20!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex21!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex22!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex23!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex24!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex25!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex26!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex27!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex28!=B.
% 94.53/93.91  0 [] range_29_0(B)|bitIndex29!=B.
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2529(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2529(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex39).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex39).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex38).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex38).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex37).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex37).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex36).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex36).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex35).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex35).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex34).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex34).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex33).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex33).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex32).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex32).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex31).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex31).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex30).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex30).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex29).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex29).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex28).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex28).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex27).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex27).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex26).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex26).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex25).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex25).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex24).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex24).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex23).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex23).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex22).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex22).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex21).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex21).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex20).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex20).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex19).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex19).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex18).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex18).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex17).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex17).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex16).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex16).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex15).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex15).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex14).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex14).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex13).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex13).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex12).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex12).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex11).
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex11).
% 94.53/93.91  0 [] -v2528(VarCurr,bitIndex29)|$F.
% 94.53/93.91  0 [] v2528(VarCurr,bitIndex29)| -$F.
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex10).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex0)|v2526(VarCurr,bitIndex0).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex10)| -v2526(VarCurr,bitIndex0).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex11).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex1)|v2526(VarCurr,bitIndex1).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex11)| -v2526(VarCurr,bitIndex1).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex12).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex2)|v2526(VarCurr,bitIndex2).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex12)| -v2526(VarCurr,bitIndex2).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex13).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex3)|v2526(VarCurr,bitIndex3).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex13)| -v2526(VarCurr,bitIndex3).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex14).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex4)|v2526(VarCurr,bitIndex4).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex14)| -v2526(VarCurr,bitIndex4).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex15).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex5)|v2526(VarCurr,bitIndex5).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex15)| -v2526(VarCurr,bitIndex5).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex16).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex6)|v2526(VarCurr,bitIndex6).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex16)| -v2526(VarCurr,bitIndex6).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex17).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex7)|v2526(VarCurr,bitIndex7).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex17)| -v2526(VarCurr,bitIndex7).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex18).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex8)|v2526(VarCurr,bitIndex8).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex18)| -v2526(VarCurr,bitIndex8).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex19).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex9)|v2526(VarCurr,bitIndex9).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex19)| -v2526(VarCurr,bitIndex9).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex20).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex10)|v2526(VarCurr,bitIndex10).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex20)| -v2526(VarCurr,bitIndex10).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex21).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex11)|v2526(VarCurr,bitIndex11).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex21)| -v2526(VarCurr,bitIndex11).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex22).
% 94.53/93.91  0 [] -v2525(VarCurr,bitIndex12)|v2526(VarCurr,bitIndex12).
% 94.53/93.91  0 [] v2525(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex22)| -v2526(VarCurr,bitIndex12).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex23).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex13)|v2526(VarCurr,bitIndex13).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex23)| -v2526(VarCurr,bitIndex13).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex24).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex14)|v2526(VarCurr,bitIndex14).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex24)| -v2526(VarCurr,bitIndex14).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex25).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex15)|v2526(VarCurr,bitIndex15).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex25)| -v2526(VarCurr,bitIndex15).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex26).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex16)|v2526(VarCurr,bitIndex16).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex26)| -v2526(VarCurr,bitIndex16).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex27).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex17)|v2526(VarCurr,bitIndex17).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex27)| -v2526(VarCurr,bitIndex17).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex28).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex18)|v2526(VarCurr,bitIndex18).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex28)| -v2526(VarCurr,bitIndex18).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex29).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex19)|v2526(VarCurr,bitIndex19).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex29)| -v2526(VarCurr,bitIndex19).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex30).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex20)|v2526(VarCurr,bitIndex20).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex30)| -v2526(VarCurr,bitIndex20).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex31).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex21)|v2526(VarCurr,bitIndex21).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex31)| -v2526(VarCurr,bitIndex21).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex32).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex22)|v2526(VarCurr,bitIndex22).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex32)| -v2526(VarCurr,bitIndex22).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex33).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex23)|v2526(VarCurr,bitIndex23).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex33)| -v2526(VarCurr,bitIndex23).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex34).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex24)|v2526(VarCurr,bitIndex24).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex34)| -v2526(VarCurr,bitIndex24).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex35).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex25)|v2526(VarCurr,bitIndex25).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex35)| -v2526(VarCurr,bitIndex25).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex36).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex26)|v2526(VarCurr,bitIndex26).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex36)| -v2526(VarCurr,bitIndex26).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex37).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex27)|v2526(VarCurr,bitIndex27).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex37)| -v2526(VarCurr,bitIndex27).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex38).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex28)|v2526(VarCurr,bitIndex28).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex38)| -v2526(VarCurr,bitIndex28).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex39).
% 94.53/93.92  0 [] -v2525(VarCurr,bitIndex29)|v2526(VarCurr,bitIndex29).
% 94.53/93.92  0 [] v2525(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex39)| -v2526(VarCurr,bitIndex29).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.53/93.92  0 [] -v2526(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.53/93.92  0 [] v2526(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.53/93.92  0 [] -range_31_0(B)| -v2514(VarCurr,B)|v2515(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)| -v2514(VarCurr,B)|v2521(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)|v2514(VarCurr,B)| -v2515(VarCurr,B)| -v2521(VarCurr,B).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex20)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex20)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex21)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex21)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex22)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex22)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex23)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex23)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex24)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex24)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex25)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex25)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex26)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex26)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex27)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex27)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex28)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex28)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex29)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex29)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex30)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex30)| -v2473(VarCurr).
% 94.53/93.92  0 [] -v2521(VarCurr,bitIndex31)|v2473(VarCurr).
% 94.53/93.92  0 [] v2521(VarCurr,bitIndex31)| -v2473(VarCurr).
% 94.53/93.92  0 [] -range_31_0(B)| -v2515(VarCurr,B)|v2516(VarCurr,B)|v2518(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)|v2515(VarCurr,B)| -v2516(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)|v2515(VarCurr,B)| -v2518(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)| -v2518(VarCurr,B)|v2519(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)| -v2518(VarCurr,B)|v2520(VarCurr,B).
% 94.53/93.92  0 [] -range_31_0(B)|v2518(VarCurr,B)| -v2519(VarCurr,B)| -v2520(VarCurr,B).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2520(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] v2520(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex0).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex39).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex39).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex38).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex38).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex37).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex37).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex36).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex36).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex35).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex35).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex34).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex34).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex33).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex33).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex32).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex32).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex31).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex31).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex30).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex30).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex29).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex29).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex28).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex28).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex27).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex27).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex26).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex26).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex25).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex25).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex24).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex24).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex23).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex23).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex22).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex22).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex21).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex21).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex20).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex20).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex19).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex19).
% 94.53/93.92  0 [] -v2519(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex18).
% 94.53/93.92  0 [] v2519(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex18).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex17).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex17).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex16).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex16).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex15).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex15).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex14).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex14).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex13).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex13).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex12).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex12).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex11).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex11).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex10).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex10).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex9).
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex9).
% 94.53/93.93  0 [] -v2519(VarCurr,bitIndex31)|$F.
% 94.53/93.93  0 [] v2519(VarCurr,bitIndex31)| -$F.
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex8).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex0)|v2517(VarCurr,bitIndex0).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex8)| -v2517(VarCurr,bitIndex0).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex9).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex1)|v2517(VarCurr,bitIndex1).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex9)| -v2517(VarCurr,bitIndex1).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex10).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex2)|v2517(VarCurr,bitIndex2).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex10)| -v2517(VarCurr,bitIndex2).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex11).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex3)|v2517(VarCurr,bitIndex3).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex11)| -v2517(VarCurr,bitIndex3).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex12).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex4)|v2517(VarCurr,bitIndex4).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex12)| -v2517(VarCurr,bitIndex4).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex13).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex5)|v2517(VarCurr,bitIndex5).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex13)| -v2517(VarCurr,bitIndex5).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex14).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex6)|v2517(VarCurr,bitIndex6).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex14)| -v2517(VarCurr,bitIndex6).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex15).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex7)|v2517(VarCurr,bitIndex7).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex15)| -v2517(VarCurr,bitIndex7).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex16).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex8)|v2517(VarCurr,bitIndex8).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex16)| -v2517(VarCurr,bitIndex8).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex17).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex9)|v2517(VarCurr,bitIndex9).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex17)| -v2517(VarCurr,bitIndex9).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex18).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex10)|v2517(VarCurr,bitIndex10).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex18)| -v2517(VarCurr,bitIndex10).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex19).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex11)|v2517(VarCurr,bitIndex11).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex19)| -v2517(VarCurr,bitIndex11).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex20).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex12)|v2517(VarCurr,bitIndex12).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex20)| -v2517(VarCurr,bitIndex12).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex21).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex13)|v2517(VarCurr,bitIndex13).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex21)| -v2517(VarCurr,bitIndex13).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex22).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex14)|v2517(VarCurr,bitIndex14).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex22)| -v2517(VarCurr,bitIndex14).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex23).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex15)|v2517(VarCurr,bitIndex15).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex23)| -v2517(VarCurr,bitIndex15).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex24).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex16)|v2517(VarCurr,bitIndex16).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex24)| -v2517(VarCurr,bitIndex16).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex25).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex17)|v2517(VarCurr,bitIndex17).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex25)| -v2517(VarCurr,bitIndex17).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex26).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex18)|v2517(VarCurr,bitIndex18).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex26)| -v2517(VarCurr,bitIndex18).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex27).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex19)|v2517(VarCurr,bitIndex19).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex27)| -v2517(VarCurr,bitIndex19).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex28).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex20)|v2517(VarCurr,bitIndex20).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex28)| -v2517(VarCurr,bitIndex20).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex29).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex21)|v2517(VarCurr,bitIndex21).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex29)| -v2517(VarCurr,bitIndex21).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex30).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex22)|v2517(VarCurr,bitIndex22).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex30)| -v2517(VarCurr,bitIndex22).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex31).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex23)|v2517(VarCurr,bitIndex23).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex31)| -v2517(VarCurr,bitIndex23).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex32).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex24)|v2517(VarCurr,bitIndex24).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex32)| -v2517(VarCurr,bitIndex24).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex33).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex25)|v2517(VarCurr,bitIndex25).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex33)| -v2517(VarCurr,bitIndex25).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex34).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex26)|v2517(VarCurr,bitIndex26).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex34)| -v2517(VarCurr,bitIndex26).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex35).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex27)|v2517(VarCurr,bitIndex27).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex35)| -v2517(VarCurr,bitIndex27).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex36).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex28)|v2517(VarCurr,bitIndex28).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex36)| -v2517(VarCurr,bitIndex28).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex37).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex29)|v2517(VarCurr,bitIndex29).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex37)| -v2517(VarCurr,bitIndex29).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex38).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex30)|v2517(VarCurr,bitIndex30).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex38)| -v2517(VarCurr,bitIndex30).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex39).
% 94.53/93.93  0 [] -v2516(VarCurr,bitIndex31)|v2517(VarCurr,bitIndex31).
% 94.53/93.93  0 [] v2516(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex39)| -v2517(VarCurr,bitIndex31).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex30)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex30)| -v2468(VarCurr).
% 94.53/93.93  0 [] -v2517(VarCurr,bitIndex31)|v2468(VarCurr).
% 94.53/93.93  0 [] v2517(VarCurr,bitIndex31)| -v2468(VarCurr).
% 94.53/93.93  0 [] -range_39_0(B)| -v2458(VarCurr,B)|v2459(VarCurr,B).
% 94.53/93.93  0 [] -range_39_0(B)| -v2458(VarCurr,B)|v2506(VarCurr,B).
% 94.53/93.93  0 [] -range_39_0(B)|v2458(VarCurr,B)| -v2459(VarCurr,B)| -v2506(VarCurr,B).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex0)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex0)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex1)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex1)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex2)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex2)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex3)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex3)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex4)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex4)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex5)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex5)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex6)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex6)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex7)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex7)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex8)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex8)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex9)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex9)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex10)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex10)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex11)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex11)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex12)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex12)| -v2507(VarCurr).
% 94.53/93.93  0 [] -v2506(VarCurr,bitIndex13)|v2507(VarCurr).
% 94.53/93.93  0 [] v2506(VarCurr,bitIndex13)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex14)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex14)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex15)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex15)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex16)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex16)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex17)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex17)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex18)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex18)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex19)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex19)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex20)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex20)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex21)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex21)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex22)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex22)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex23)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex23)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex24)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex24)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex25)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex25)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex26)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex26)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex27)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex27)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex28)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex28)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex29)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex29)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex30)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex30)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex31)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex31)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex32)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex32)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex33)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex33)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex34)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex34)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex35)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex35)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex36)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex36)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex37)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex37)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex38)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex38)| -v2507(VarCurr).
% 94.53/93.94  0 [] -v2506(VarCurr,bitIndex39)|v2507(VarCurr).
% 94.53/93.94  0 [] v2506(VarCurr,bitIndex39)| -v2507(VarCurr).
% 94.53/93.94  0 [] v2507(VarCurr)|v2453(VarCurr,bitIndex3).
% 94.53/93.94  0 [] -v2507(VarCurr)| -v2453(VarCurr,bitIndex3).
% 94.53/93.94  0 [] -range_39_0(B)| -v2459(VarCurr,B)|v2460(VarCurr,B)|v2485(VarCurr,B).
% 94.53/93.94  0 [] -range_39_0(B)|v2459(VarCurr,B)| -v2460(VarCurr,B).
% 94.53/93.94  0 [] -range_39_0(B)|v2459(VarCurr,B)| -v2485(VarCurr,B).
% 94.53/93.94  0 [] -range_39_0(B)| -v2485(VarCurr,B)|v2486(VarCurr,B).
% 94.53/93.94  0 [] -range_39_0(B)| -v2485(VarCurr,B)|v2505(VarCurr,B).
% 94.53/93.94  0 [] -range_39_0(B)|v2485(VarCurr,B)| -v2486(VarCurr,B)| -v2505(VarCurr,B).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex38)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex38)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -v2505(VarCurr,bitIndex39)|v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] v2505(VarCurr,bitIndex39)| -v2453(VarCurr,bitIndex2).
% 94.53/93.94  0 [] -range_35_0(B)| -v2486(VarCurr,B)|v2487(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)|v2486(VarCurr,B)| -v2487(VarCurr,B).
% 94.53/93.94  0 [] -v2486(VarCurr,bitIndex39)|$F.
% 94.53/93.94  0 [] v2486(VarCurr,bitIndex39)| -$F.
% 94.53/93.94  0 [] -v2486(VarCurr,bitIndex38)|$F.
% 94.53/93.94  0 [] v2486(VarCurr,bitIndex38)| -$F.
% 94.53/93.94  0 [] -v2486(VarCurr,bitIndex37)|$F.
% 94.53/93.94  0 [] v2486(VarCurr,bitIndex37)| -$F.
% 94.53/93.94  0 [] -v2486(VarCurr,bitIndex36)|$F.
% 94.53/93.94  0 [] v2486(VarCurr,bitIndex36)| -$F.
% 94.53/93.94  0 [] -range_35_0(B)| -v2487(VarCurr,B)|v2488(VarCurr,B)|v2496(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)|v2487(VarCurr,B)| -v2488(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)|v2487(VarCurr,B)| -v2496(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)| -v2496(VarCurr,B)|v2497(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)| -v2496(VarCurr,B)|v2504(VarCurr,B).
% 94.53/93.94  0 [] -range_35_0(B)|v2496(VarCurr,B)| -v2497(VarCurr,B)| -v2504(VarCurr,B).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] v2504(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex1).
% 94.53/93.94  0 [] -v2504(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] v2504(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] -v2504(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] v2504(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] -v2504(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] v2504(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex1).
% 94.60/93.95  0 [] -range_33_0(B)| -v2497(VarCurr,B)|v2498(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)|v2497(VarCurr,B)| -v2498(VarCurr,B).
% 94.60/93.95  0 [] -v2497(VarCurr,bitIndex35)|$F.
% 94.60/93.95  0 [] v2497(VarCurr,bitIndex35)| -$F.
% 94.60/93.95  0 [] -v2497(VarCurr,bitIndex34)|$F.
% 94.60/93.95  0 [] v2497(VarCurr,bitIndex34)| -$F.
% 94.60/93.95  0 [] -range_33_0(B)| -v2498(VarCurr,B)|v2499(VarCurr,B)|v2501(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)|v2498(VarCurr,B)| -v2499(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)|v2498(VarCurr,B)| -v2501(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)| -v2501(VarCurr,B)|v2502(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)| -v2501(VarCurr,B)|v2503(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)|v2501(VarCurr,B)| -v2502(VarCurr,B)| -v2503(VarCurr,B).
% 94.60/93.95  0 [] -range_33_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex0!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex1!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex2!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex3!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex4!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex5!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex6!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex7!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex8!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex9!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex10!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex11!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex12!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex13!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex14!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex15!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex16!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex17!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex18!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex19!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex20!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex21!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex22!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex23!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex24!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex25!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex26!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex27!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex28!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex29!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex30!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex31!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex32!=B.
% 94.60/93.95  0 [] range_33_0(B)|bitIndex33!=B.
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2503(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2503(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex39).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex39).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex38).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex38).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex37).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex37).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex36).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex36).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex35).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex35).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex34).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex34).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex33).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex33).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex32).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex32).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex31).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex31).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex30).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex30).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex29).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex29).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex28).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex28).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex27).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex27).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex26).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex26).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex25).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex25).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex24).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex24).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex23).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex23).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex22).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex22).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex21).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex21).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex20).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex20).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex19).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex19).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex18).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex18).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex17).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex17).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex16).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex16).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex15).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex15).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex14).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex14).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex13).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex13).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex12).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex12).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex11).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex11).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex10).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex10).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex9).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex9).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex8).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex8).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex7).
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex7).
% 94.60/93.95  0 [] -v2502(VarCurr,bitIndex33)|$F.
% 94.60/93.95  0 [] v2502(VarCurr,bitIndex33)| -$F.
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex6).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex0)|v2500(VarCurr,bitIndex0).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex6)| -v2500(VarCurr,bitIndex0).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex7).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex1)|v2500(VarCurr,bitIndex1).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex7)| -v2500(VarCurr,bitIndex1).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex8).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex2)|v2500(VarCurr,bitIndex2).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex8)| -v2500(VarCurr,bitIndex2).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex9).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex3)|v2500(VarCurr,bitIndex3).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex9)| -v2500(VarCurr,bitIndex3).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex10).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex4)|v2500(VarCurr,bitIndex4).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex10)| -v2500(VarCurr,bitIndex4).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex11).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex5)|v2500(VarCurr,bitIndex5).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex11)| -v2500(VarCurr,bitIndex5).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex12).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex6)|v2500(VarCurr,bitIndex6).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex12)| -v2500(VarCurr,bitIndex6).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex13).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex7)|v2500(VarCurr,bitIndex7).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex13)| -v2500(VarCurr,bitIndex7).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex14).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex8)|v2500(VarCurr,bitIndex8).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex14)| -v2500(VarCurr,bitIndex8).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex15).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex9)|v2500(VarCurr,bitIndex9).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex15)| -v2500(VarCurr,bitIndex9).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex16).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex10)|v2500(VarCurr,bitIndex10).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex16)| -v2500(VarCurr,bitIndex10).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex17).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex11)|v2500(VarCurr,bitIndex11).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex17)| -v2500(VarCurr,bitIndex11).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex18).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex12)|v2500(VarCurr,bitIndex12).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex18)| -v2500(VarCurr,bitIndex12).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex19).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex13)|v2500(VarCurr,bitIndex13).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex19)| -v2500(VarCurr,bitIndex13).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex20).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex14)|v2500(VarCurr,bitIndex14).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex20)| -v2500(VarCurr,bitIndex14).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex21).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex15)|v2500(VarCurr,bitIndex15).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex21)| -v2500(VarCurr,bitIndex15).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex22).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex16)|v2500(VarCurr,bitIndex16).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex22)| -v2500(VarCurr,bitIndex16).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex23).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex17)|v2500(VarCurr,bitIndex17).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex23)| -v2500(VarCurr,bitIndex17).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex24).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex18)|v2500(VarCurr,bitIndex18).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex24)| -v2500(VarCurr,bitIndex18).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex25).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex19)|v2500(VarCurr,bitIndex19).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex25)| -v2500(VarCurr,bitIndex19).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex26).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex20)|v2500(VarCurr,bitIndex20).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex26)| -v2500(VarCurr,bitIndex20).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex27).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex21)|v2500(VarCurr,bitIndex21).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex27)| -v2500(VarCurr,bitIndex21).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex28).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex22)|v2500(VarCurr,bitIndex22).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex28)| -v2500(VarCurr,bitIndex22).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex29).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex23)|v2500(VarCurr,bitIndex23).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex29)| -v2500(VarCurr,bitIndex23).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex30).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex24)|v2500(VarCurr,bitIndex24).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex30)| -v2500(VarCurr,bitIndex24).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex31).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex25)|v2500(VarCurr,bitIndex25).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex31)| -v2500(VarCurr,bitIndex25).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex32).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex26)|v2500(VarCurr,bitIndex26).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex32)| -v2500(VarCurr,bitIndex26).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex33).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex27)|v2500(VarCurr,bitIndex27).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex33)| -v2500(VarCurr,bitIndex27).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex34).
% 94.60/93.95  0 [] -v2499(VarCurr,bitIndex28)|v2500(VarCurr,bitIndex28).
% 94.60/93.95  0 [] v2499(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex34)| -v2500(VarCurr,bitIndex28).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex35).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex29)|v2500(VarCurr,bitIndex29).
% 94.61/93.96  0 [] v2499(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex35)| -v2500(VarCurr,bitIndex29).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex36).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex30)|v2500(VarCurr,bitIndex30).
% 94.61/93.96  0 [] v2499(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex36)| -v2500(VarCurr,bitIndex30).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex37).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex31)|v2500(VarCurr,bitIndex31).
% 94.61/93.96  0 [] v2499(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex37)| -v2500(VarCurr,bitIndex31).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex38).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex32)|v2500(VarCurr,bitIndex32).
% 94.61/93.96  0 [] v2499(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex38)| -v2500(VarCurr,bitIndex32).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex39).
% 94.61/93.96  0 [] -v2499(VarCurr,bitIndex33)|v2500(VarCurr,bitIndex33).
% 94.61/93.96  0 [] v2499(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex39)| -v2500(VarCurr,bitIndex33).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex30)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex30)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex31)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex31)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex32)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex32)| -v2468(VarCurr).
% 94.61/93.96  0 [] -v2500(VarCurr,bitIndex33)|v2468(VarCurr).
% 94.61/93.96  0 [] v2500(VarCurr,bitIndex33)| -v2468(VarCurr).
% 94.61/93.96  0 [] -range_35_0(B)| -v2488(VarCurr,B)|v2489(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)| -v2488(VarCurr,B)|v2495(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)|v2488(VarCurr,B)| -v2489(VarCurr,B)| -v2495(VarCurr,B).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex20)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex20)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex21)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex21)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex22)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex22)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex23)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex23)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex24)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex24)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex25)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex25)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex26)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex26)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex27)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex27)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex28)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex28)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex29)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex29)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex30)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex30)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex31)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex31)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex32)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex32)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex33)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex33)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex34)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex34)| -v2473(VarCurr).
% 94.61/93.96  0 [] -v2495(VarCurr,bitIndex35)|v2473(VarCurr).
% 94.61/93.96  0 [] v2495(VarCurr,bitIndex35)| -v2473(VarCurr).
% 94.61/93.96  0 [] -range_35_0(B)| -v2489(VarCurr,B)|v2490(VarCurr,B)|v2492(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)|v2489(VarCurr,B)| -v2490(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)|v2489(VarCurr,B)| -v2492(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)| -v2492(VarCurr,B)|v2493(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)| -v2492(VarCurr,B)|v2494(VarCurr,B).
% 94.61/93.96  0 [] -range_35_0(B)|v2492(VarCurr,B)| -v2493(VarCurr,B)| -v2494(VarCurr,B).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] v2494(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex0).
% 94.61/93.96  0 [] -v2494(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] v2494(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] -v2494(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] v2494(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] -v2494(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] v2494(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex0).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex34)|v2465(VarCurr,bitIndex39).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex34)| -v2465(VarCurr,bitIndex39).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex38).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex38).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex37).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex37).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex36).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex36).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex35).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex35).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex34).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex34).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex33).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex33).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex32).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex32).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex31).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex31).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex30).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex30).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex29).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex29).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex28).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex28).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex27).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex27).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex26).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex26).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex25).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex25).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex24).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex24).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex23).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex23).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex22).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex22).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex21).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex21).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex20).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex20).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex19).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex19).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex18).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex18).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex17).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex17).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex16).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex16).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex15).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex15).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex14).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex14).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex13).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex13).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex12).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex12).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex11).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex11).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex10).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex10).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex9).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex9).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex8).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex8).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex7).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex7).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex6).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex6).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex5).
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex5).
% 94.61/93.97  0 [] -v2493(VarCurr,bitIndex35)|$F.
% 94.61/93.97  0 [] v2493(VarCurr,bitIndex35)| -$F.
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex4).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex0)|v2491(VarCurr,bitIndex0).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex4)| -v2491(VarCurr,bitIndex0).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex5).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex1)|v2491(VarCurr,bitIndex1).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex5)| -v2491(VarCurr,bitIndex1).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex6).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex2)|v2491(VarCurr,bitIndex2).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex6)| -v2491(VarCurr,bitIndex2).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex7).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex3)|v2491(VarCurr,bitIndex3).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex7)| -v2491(VarCurr,bitIndex3).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex8).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex4)|v2491(VarCurr,bitIndex4).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex8)| -v2491(VarCurr,bitIndex4).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex9).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex5)|v2491(VarCurr,bitIndex5).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex9)| -v2491(VarCurr,bitIndex5).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex10).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex6)|v2491(VarCurr,bitIndex6).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex10)| -v2491(VarCurr,bitIndex6).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex11).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex7)|v2491(VarCurr,bitIndex7).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex11)| -v2491(VarCurr,bitIndex7).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex12).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex8)|v2491(VarCurr,bitIndex8).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex12)| -v2491(VarCurr,bitIndex8).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex13).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex9)|v2491(VarCurr,bitIndex9).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex13)| -v2491(VarCurr,bitIndex9).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex14).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex10)|v2491(VarCurr,bitIndex10).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex14)| -v2491(VarCurr,bitIndex10).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex15).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex11)|v2491(VarCurr,bitIndex11).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex15)| -v2491(VarCurr,bitIndex11).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex16).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex12)|v2491(VarCurr,bitIndex12).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex16)| -v2491(VarCurr,bitIndex12).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex17).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex13)|v2491(VarCurr,bitIndex13).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex17)| -v2491(VarCurr,bitIndex13).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex18).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex14)|v2491(VarCurr,bitIndex14).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex18)| -v2491(VarCurr,bitIndex14).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex19).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex15)|v2491(VarCurr,bitIndex15).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex19)| -v2491(VarCurr,bitIndex15).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex20).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex16)|v2491(VarCurr,bitIndex16).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex20)| -v2491(VarCurr,bitIndex16).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex21).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex17)|v2491(VarCurr,bitIndex17).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex21)| -v2491(VarCurr,bitIndex17).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex22).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex18)|v2491(VarCurr,bitIndex18).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex22)| -v2491(VarCurr,bitIndex18).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex23).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex19)|v2491(VarCurr,bitIndex19).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex23)| -v2491(VarCurr,bitIndex19).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex24).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex20)|v2491(VarCurr,bitIndex20).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex24)| -v2491(VarCurr,bitIndex20).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex25).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex21)|v2491(VarCurr,bitIndex21).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex25)| -v2491(VarCurr,bitIndex21).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex26).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex22)|v2491(VarCurr,bitIndex22).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex26)| -v2491(VarCurr,bitIndex22).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex27).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex23)|v2491(VarCurr,bitIndex23).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex27)| -v2491(VarCurr,bitIndex23).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex28).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex24)|v2491(VarCurr,bitIndex24).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex28)| -v2491(VarCurr,bitIndex24).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex29).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex25)|v2491(VarCurr,bitIndex25).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex29)| -v2491(VarCurr,bitIndex25).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex30).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex26)|v2491(VarCurr,bitIndex26).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex30)| -v2491(VarCurr,bitIndex26).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex31).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex27)|v2491(VarCurr,bitIndex27).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex31)| -v2491(VarCurr,bitIndex27).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex32).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex28)|v2491(VarCurr,bitIndex28).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex32)| -v2491(VarCurr,bitIndex28).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex33).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex29)|v2491(VarCurr,bitIndex29).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex33)| -v2491(VarCurr,bitIndex29).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex34).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex30)|v2491(VarCurr,bitIndex30).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex34)| -v2491(VarCurr,bitIndex30).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex35).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex31)|v2491(VarCurr,bitIndex31).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex35)| -v2491(VarCurr,bitIndex31).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex36).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex32)|v2491(VarCurr,bitIndex32).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex36)| -v2491(VarCurr,bitIndex32).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex37).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex33)|v2491(VarCurr,bitIndex33).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex37)| -v2491(VarCurr,bitIndex33).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex34)|v2465(VarCurr,bitIndex38).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex34)|v2491(VarCurr,bitIndex34).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex34)| -v2465(VarCurr,bitIndex38)| -v2491(VarCurr,bitIndex34).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex35)|v2465(VarCurr,bitIndex39).
% 94.61/93.97  0 [] -v2490(VarCurr,bitIndex35)|v2491(VarCurr,bitIndex35).
% 94.61/93.97  0 [] v2490(VarCurr,bitIndex35)| -v2465(VarCurr,bitIndex39)| -v2491(VarCurr,bitIndex35).
% 94.61/93.97  0 [] -v2491(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.61/93.97  0 [] v2491(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.61/93.97  0 [] -v2491(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.61/93.97  0 [] v2491(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.61/93.97  0 [] -v2491(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.61/93.97  0 [] v2491(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.61/93.97  0 [] -v2491(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.61/93.97  0 [] v2491(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.61/93.97  0 [] -v2491(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex30)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex30)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex31)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex31)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex32)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex32)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex33)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex33)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex34)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex34)| -v2468(VarCurr).
% 94.61/93.98  0 [] -v2491(VarCurr,bitIndex35)|v2468(VarCurr).
% 94.61/93.98  0 [] v2491(VarCurr,bitIndex35)| -v2468(VarCurr).
% 94.61/93.98  0 [] -range_39_0(B)| -v2460(VarCurr,B)|v2461(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)| -v2460(VarCurr,B)|v2483(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)|v2460(VarCurr,B)| -v2461(VarCurr,B)| -v2483(VarCurr,B).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex0)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex0)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex1)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex1)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex2)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex2)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex3)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex3)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex4)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex4)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex5)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex5)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex6)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex6)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex7)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex7)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex8)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex8)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex9)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex9)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex10)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex10)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex11)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex11)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex12)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex12)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex13)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex13)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex14)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex14)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex15)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex15)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex16)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex16)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex17)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex17)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex18)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex18)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex19)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex19)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex20)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex20)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex21)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex21)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex22)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex22)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex23)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex23)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex24)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex24)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex25)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex25)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex26)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex26)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex27)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex27)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex28)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex28)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex29)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex29)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex30)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex30)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex31)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex31)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex32)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex32)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex33)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex33)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex34)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex34)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex35)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex35)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex36)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex36)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex37)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex37)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex38)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex38)| -v2484(VarCurr).
% 94.61/93.98  0 [] -v2483(VarCurr,bitIndex39)|v2484(VarCurr).
% 94.61/93.98  0 [] v2483(VarCurr,bitIndex39)| -v2484(VarCurr).
% 94.61/93.98  0 [] v2484(VarCurr)|v2453(VarCurr,bitIndex2).
% 94.61/93.98  0 [] -v2484(VarCurr)| -v2453(VarCurr,bitIndex2).
% 94.61/93.98  0 [] -range_39_0(B)| -v2461(VarCurr,B)|v2462(VarCurr,B)|v2474(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)|v2461(VarCurr,B)| -v2462(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)|v2461(VarCurr,B)| -v2474(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)| -v2474(VarCurr,B)|v2475(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)| -v2474(VarCurr,B)|v2482(VarCurr,B).
% 94.61/93.98  0 [] -range_39_0(B)|v2474(VarCurr,B)| -v2475(VarCurr,B)| -v2482(VarCurr,B).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex38)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex38)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -v2482(VarCurr,bitIndex39)|v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] v2482(VarCurr,bitIndex39)| -v2453(VarCurr,bitIndex1).
% 94.61/93.98  0 [] -range_37_0(B)| -v2475(VarCurr,B)|v2476(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)|v2475(VarCurr,B)| -v2476(VarCurr,B).
% 94.61/93.98  0 [] -v2475(VarCurr,bitIndex39)|$F.
% 94.61/93.98  0 [] v2475(VarCurr,bitIndex39)| -$F.
% 94.61/93.98  0 [] -v2475(VarCurr,bitIndex38)|$F.
% 94.61/93.98  0 [] v2475(VarCurr,bitIndex38)| -$F.
% 94.61/93.98  0 [] -range_37_0(B)| -v2476(VarCurr,B)|v2477(VarCurr,B)|v2479(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)|v2476(VarCurr,B)| -v2477(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)|v2476(VarCurr,B)| -v2479(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)| -v2479(VarCurr,B)|v2480(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)| -v2479(VarCurr,B)|v2481(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)|v2479(VarCurr,B)| -v2480(VarCurr,B)| -v2481(VarCurr,B).
% 94.61/93.98  0 [] -range_37_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex0!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex1!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex2!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex3!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex4!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex5!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex6!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex7!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex8!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex9!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex10!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex11!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex12!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex13!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex14!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex15!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex16!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex17!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex18!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex19!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex20!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex21!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex22!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex23!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex24!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex25!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex26!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex27!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex28!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex29!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex30!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex31!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex32!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex33!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex34!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex35!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex36!=B.
% 94.61/93.98  0 [] range_37_0(B)|bitIndex37!=B.
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] -v2481(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.61/93.98  0 [] v2481(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2481(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2481(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex36)|v2465(VarCurr,bitIndex39).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex36)| -v2465(VarCurr,bitIndex39).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex35)|v2465(VarCurr,bitIndex38).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex35)| -v2465(VarCurr,bitIndex38).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex34)|v2465(VarCurr,bitIndex37).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex34)| -v2465(VarCurr,bitIndex37).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex36).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex36).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex35).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex35).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex34).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex34).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex33).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex33).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex32).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex32).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex31).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex31).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex30).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex30).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex29).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex29).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex28).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex28).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex27).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex27).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex26).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex26).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex25).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex25).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex24).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex24).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex23).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex23).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex22).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex22).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex21).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex21).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex20).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex20).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex19).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex19).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex18).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex18).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex17).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex17).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex16).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex16).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex15).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex15).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex14).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex14).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex13).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex13).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex12).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex12).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex11).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex11).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex10).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex10).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex9).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex9).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex8).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex8).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex7).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex7).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex6).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex6).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex5).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex5).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex4).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex4).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex3).
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex3).
% 94.61/93.99  0 [] -v2480(VarCurr,bitIndex37)|$F.
% 94.61/93.99  0 [] v2480(VarCurr,bitIndex37)| -$F.
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex2).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex0)|v2478(VarCurr,bitIndex0).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex2)| -v2478(VarCurr,bitIndex0).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex3).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex1)|v2478(VarCurr,bitIndex1).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex3)| -v2478(VarCurr,bitIndex1).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex4).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex2)|v2478(VarCurr,bitIndex2).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex4)| -v2478(VarCurr,bitIndex2).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex5).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex3)|v2478(VarCurr,bitIndex3).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex5)| -v2478(VarCurr,bitIndex3).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex6).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex4)|v2478(VarCurr,bitIndex4).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex6)| -v2478(VarCurr,bitIndex4).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex7).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex5)|v2478(VarCurr,bitIndex5).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex7)| -v2478(VarCurr,bitIndex5).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex8).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex6)|v2478(VarCurr,bitIndex6).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex8)| -v2478(VarCurr,bitIndex6).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex9).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex7)|v2478(VarCurr,bitIndex7).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex9)| -v2478(VarCurr,bitIndex7).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex10).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex8)|v2478(VarCurr,bitIndex8).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex10)| -v2478(VarCurr,bitIndex8).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex11).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex9)|v2478(VarCurr,bitIndex9).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex11)| -v2478(VarCurr,bitIndex9).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex12).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex10)|v2478(VarCurr,bitIndex10).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex12)| -v2478(VarCurr,bitIndex10).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex13).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex11)|v2478(VarCurr,bitIndex11).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex13)| -v2478(VarCurr,bitIndex11).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex14).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex12)|v2478(VarCurr,bitIndex12).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex14)| -v2478(VarCurr,bitIndex12).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex15).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex13)|v2478(VarCurr,bitIndex13).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex15)| -v2478(VarCurr,bitIndex13).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex16).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex14)|v2478(VarCurr,bitIndex14).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex16)| -v2478(VarCurr,bitIndex14).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex17).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex15)|v2478(VarCurr,bitIndex15).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex17)| -v2478(VarCurr,bitIndex15).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex18).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex16)|v2478(VarCurr,bitIndex16).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex18)| -v2478(VarCurr,bitIndex16).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex19).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex17)|v2478(VarCurr,bitIndex17).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex19)| -v2478(VarCurr,bitIndex17).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex20).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex18)|v2478(VarCurr,bitIndex18).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex20)| -v2478(VarCurr,bitIndex18).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex21).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex19)|v2478(VarCurr,bitIndex19).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex21)| -v2478(VarCurr,bitIndex19).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex22).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex20)|v2478(VarCurr,bitIndex20).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex22)| -v2478(VarCurr,bitIndex20).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex23).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex21)|v2478(VarCurr,bitIndex21).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex23)| -v2478(VarCurr,bitIndex21).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex24).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex22)|v2478(VarCurr,bitIndex22).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex24)| -v2478(VarCurr,bitIndex22).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex25).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex23)|v2478(VarCurr,bitIndex23).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex25)| -v2478(VarCurr,bitIndex23).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex26).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex24)|v2478(VarCurr,bitIndex24).
% 94.61/93.99  0 [] v2477(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex26)| -v2478(VarCurr,bitIndex24).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex27).
% 94.61/93.99  0 [] -v2477(VarCurr,bitIndex25)|v2478(VarCurr,bitIndex25).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex27)| -v2478(VarCurr,bitIndex25).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex28).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex26)|v2478(VarCurr,bitIndex26).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex28)| -v2478(VarCurr,bitIndex26).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex29).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex27)|v2478(VarCurr,bitIndex27).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex29)| -v2478(VarCurr,bitIndex27).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex30).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex28)|v2478(VarCurr,bitIndex28).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex30)| -v2478(VarCurr,bitIndex28).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex31).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex29)|v2478(VarCurr,bitIndex29).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex31)| -v2478(VarCurr,bitIndex29).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex32).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex30)|v2478(VarCurr,bitIndex30).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex32)| -v2478(VarCurr,bitIndex30).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex33).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex31)|v2478(VarCurr,bitIndex31).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex33)| -v2478(VarCurr,bitIndex31).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex34).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex32)|v2478(VarCurr,bitIndex32).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex34)| -v2478(VarCurr,bitIndex32).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex35).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex33)|v2478(VarCurr,bitIndex33).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex35)| -v2478(VarCurr,bitIndex33).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex34)|v2465(VarCurr,bitIndex36).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex34)|v2478(VarCurr,bitIndex34).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex34)| -v2465(VarCurr,bitIndex36)| -v2478(VarCurr,bitIndex34).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex35)|v2465(VarCurr,bitIndex37).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex35)|v2478(VarCurr,bitIndex35).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex35)| -v2465(VarCurr,bitIndex37)| -v2478(VarCurr,bitIndex35).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex36)|v2465(VarCurr,bitIndex38).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex36)|v2478(VarCurr,bitIndex36).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex36)| -v2465(VarCurr,bitIndex38)| -v2478(VarCurr,bitIndex36).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex37)|v2465(VarCurr,bitIndex39).
% 94.61/94.00  0 [] -v2477(VarCurr,bitIndex37)|v2478(VarCurr,bitIndex37).
% 94.61/94.00  0 [] v2477(VarCurr,bitIndex37)| -v2465(VarCurr,bitIndex39)| -v2478(VarCurr,bitIndex37).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex30)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex30)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex31)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex31)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex32)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex32)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex33)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex33)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex34)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex34)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex35)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex35)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex36)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex36)| -v2468(VarCurr).
% 94.61/94.00  0 [] -v2478(VarCurr,bitIndex37)|v2468(VarCurr).
% 94.61/94.00  0 [] v2478(VarCurr,bitIndex37)| -v2468(VarCurr).
% 94.61/94.00  0 [] -range_39_0(B)| -v2462(VarCurr,B)|v2463(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)| -v2462(VarCurr,B)|v2472(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)|v2462(VarCurr,B)| -v2463(VarCurr,B)| -v2472(VarCurr,B).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex0)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex0)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex1)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex1)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex2)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex2)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex3)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex3)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex4)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex4)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex5)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex5)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex6)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex6)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex7)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex7)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex8)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex8)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex9)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex9)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex10)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex10)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex11)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex11)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex12)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex12)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex13)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex13)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex14)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex14)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex15)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex15)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex16)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex16)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex17)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex17)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex18)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex18)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex19)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex19)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex20)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex20)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex21)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex21)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex22)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex22)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex23)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex23)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex24)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex24)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex25)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex25)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex26)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex26)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex27)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex27)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex28)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex28)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex29)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex29)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex30)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex30)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex31)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex31)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex32)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex32)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex33)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex33)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex34)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex34)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex35)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex35)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex36)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex36)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex37)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex37)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex38)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex38)| -v2473(VarCurr).
% 94.61/94.00  0 [] -v2472(VarCurr,bitIndex39)|v2473(VarCurr).
% 94.61/94.00  0 [] v2472(VarCurr,bitIndex39)| -v2473(VarCurr).
% 94.61/94.00  0 [] v2473(VarCurr)|v2453(VarCurr,bitIndex1).
% 94.61/94.00  0 [] -v2473(VarCurr)| -v2453(VarCurr,bitIndex1).
% 94.61/94.00  0 [] -range_39_0(B)| -v2463(VarCurr,B)|v2464(VarCurr,B)|v2469(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)|v2463(VarCurr,B)| -v2464(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)|v2463(VarCurr,B)| -v2469(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)| -v2469(VarCurr,B)|v2470(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)| -v2469(VarCurr,B)|v2471(VarCurr,B).
% 94.61/94.00  0 [] -range_39_0(B)|v2469(VarCurr,B)| -v2470(VarCurr,B)| -v2471(VarCurr,B).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex0)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex0)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex1)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex1)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex2)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex2)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex3)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex3)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex4)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex4)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex5)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex5)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex6)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex6)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex7)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex7)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex8)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex8)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex9)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex9)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex10)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex10)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex11)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex11)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex12)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex12)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex13)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex13)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex14)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex14)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex15)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex15)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex16)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex16)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex17)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex17)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex18)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex18)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex19)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex19)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex20)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex20)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex21)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex21)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex22)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex22)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex23)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex23)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex24)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex24)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex25)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex25)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex26)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex26)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex27)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex27)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex28)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex28)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex29)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex29)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex30)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex30)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex31)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex31)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex32)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex32)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex33)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex33)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex34)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex34)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex35)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex35)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex36)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex36)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex37)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex37)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex38)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex38)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2471(VarCurr,bitIndex39)|v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] v2471(VarCurr,bitIndex39)| -v2453(VarCurr,bitIndex0).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex38)|v2465(VarCurr,bitIndex39).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex38)| -v2465(VarCurr,bitIndex39).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex37)|v2465(VarCurr,bitIndex38).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex37)| -v2465(VarCurr,bitIndex38).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex36)|v2465(VarCurr,bitIndex37).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex36)| -v2465(VarCurr,bitIndex37).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex35)|v2465(VarCurr,bitIndex36).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex35)| -v2465(VarCurr,bitIndex36).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex34)|v2465(VarCurr,bitIndex35).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex34)| -v2465(VarCurr,bitIndex35).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex33)|v2465(VarCurr,bitIndex34).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex33)| -v2465(VarCurr,bitIndex34).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex32)|v2465(VarCurr,bitIndex33).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex32)| -v2465(VarCurr,bitIndex33).
% 94.61/94.00  0 [] -v2470(VarCurr,bitIndex31)|v2465(VarCurr,bitIndex32).
% 94.61/94.00  0 [] v2470(VarCurr,bitIndex31)| -v2465(VarCurr,bitIndex32).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex30)|v2465(VarCurr,bitIndex31).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex30)| -v2465(VarCurr,bitIndex31).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex29)|v2465(VarCurr,bitIndex30).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex29)| -v2465(VarCurr,bitIndex30).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex28)|v2465(VarCurr,bitIndex29).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex28)| -v2465(VarCurr,bitIndex29).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex27)|v2465(VarCurr,bitIndex28).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex27)| -v2465(VarCurr,bitIndex28).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex26)|v2465(VarCurr,bitIndex27).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex26)| -v2465(VarCurr,bitIndex27).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex25)|v2465(VarCurr,bitIndex26).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex25)| -v2465(VarCurr,bitIndex26).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex24)|v2465(VarCurr,bitIndex25).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex24)| -v2465(VarCurr,bitIndex25).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex23)|v2465(VarCurr,bitIndex24).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex23)| -v2465(VarCurr,bitIndex24).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex22)|v2465(VarCurr,bitIndex23).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex22)| -v2465(VarCurr,bitIndex23).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex21)|v2465(VarCurr,bitIndex22).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex21)| -v2465(VarCurr,bitIndex22).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex20)|v2465(VarCurr,bitIndex21).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex20)| -v2465(VarCurr,bitIndex21).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex19)|v2465(VarCurr,bitIndex20).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex19)| -v2465(VarCurr,bitIndex20).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex18)|v2465(VarCurr,bitIndex19).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex18)| -v2465(VarCurr,bitIndex19).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex17)|v2465(VarCurr,bitIndex18).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex17)| -v2465(VarCurr,bitIndex18).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex16)|v2465(VarCurr,bitIndex17).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex16)| -v2465(VarCurr,bitIndex17).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex15)|v2465(VarCurr,bitIndex16).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex15)| -v2465(VarCurr,bitIndex16).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex14)|v2465(VarCurr,bitIndex15).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex14)| -v2465(VarCurr,bitIndex15).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex13)|v2465(VarCurr,bitIndex14).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex13)| -v2465(VarCurr,bitIndex14).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex12)|v2465(VarCurr,bitIndex13).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex12)| -v2465(VarCurr,bitIndex13).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex11)|v2465(VarCurr,bitIndex12).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex11)| -v2465(VarCurr,bitIndex12).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex10)|v2465(VarCurr,bitIndex11).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex10)| -v2465(VarCurr,bitIndex11).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex9)|v2465(VarCurr,bitIndex10).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex9)| -v2465(VarCurr,bitIndex10).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex8)|v2465(VarCurr,bitIndex9).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex8)| -v2465(VarCurr,bitIndex9).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex7)|v2465(VarCurr,bitIndex8).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex7)| -v2465(VarCurr,bitIndex8).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex6)|v2465(VarCurr,bitIndex7).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex6)| -v2465(VarCurr,bitIndex7).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex5)|v2465(VarCurr,bitIndex6).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex5)| -v2465(VarCurr,bitIndex6).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex4)|v2465(VarCurr,bitIndex5).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex4)| -v2465(VarCurr,bitIndex5).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex3)|v2465(VarCurr,bitIndex4).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex3)| -v2465(VarCurr,bitIndex4).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex2)|v2465(VarCurr,bitIndex3).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex2)| -v2465(VarCurr,bitIndex3).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex1)|v2465(VarCurr,bitIndex2).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex1)| -v2465(VarCurr,bitIndex2).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex0)|v2465(VarCurr,bitIndex1).
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex0)| -v2465(VarCurr,bitIndex1).
% 94.66/94.01  0 [] -v2470(VarCurr,bitIndex39)|$F.
% 94.66/94.01  0 [] v2470(VarCurr,bitIndex39)| -$F.
% 94.66/94.01  0 [] -range_39_0(B)| -v2464(VarCurr,B)|v2465(VarCurr,B).
% 94.66/94.01  0 [] -range_39_0(B)| -v2464(VarCurr,B)|v2467(VarCurr,B).
% 94.66/94.01  0 [] -range_39_0(B)|v2464(VarCurr,B)| -v2465(VarCurr,B)| -v2467(VarCurr,B).
% 94.66/94.01  0 [] -range_39_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex0!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex1!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex2!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex3!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex4!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex5!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex6!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex7!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex8!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex9!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex10!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex11!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex12!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex13!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex14!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex15!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex16!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex17!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex18!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex19!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex20!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex21!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex22!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex23!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex24!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex25!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex26!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex27!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex28!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex29!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex30!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex31!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex32!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex33!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex34!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex35!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex36!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex37!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex38!=B.
% 94.66/94.01  0 [] range_39_0(B)|bitIndex39!=B.
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex0)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex0)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex1)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex1)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex2)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex2)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex3)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex3)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex4)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex4)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex5)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex5)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex6)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex6)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex7)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex7)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex8)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex8)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex9)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex9)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex10)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex10)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex11)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex11)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex12)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex12)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex13)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex13)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex14)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex14)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex15)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex15)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex16)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex16)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex17)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex17)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex18)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex18)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex19)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex19)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex20)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex20)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex21)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex21)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex22)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex22)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex23)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex23)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex24)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex24)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex25)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex25)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex26)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex26)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex27)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex27)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex28)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex28)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex29)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex29)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex30)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex30)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex31)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex31)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex32)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex32)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex33)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex33)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex34)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex34)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex35)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex35)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex36)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex36)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex37)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex37)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex38)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex38)| -v2468(VarCurr).
% 94.66/94.01  0 [] -v2467(VarCurr,bitIndex39)|v2468(VarCurr).
% 94.66/94.01  0 [] v2467(VarCurr,bitIndex39)| -v2468(VarCurr).
% 94.66/94.01  0 [] v2468(VarCurr)|v2453(VarCurr,bitIndex0).
% 94.66/94.01  0 [] -v2468(VarCurr)| -v2453(VarCurr,bitIndex0).
% 94.66/94.01  0 [] -range_4_0(B)| -v2453(constB0,B)|$F.
% 94.66/94.01  0 [] -range_4_0(B)|v2453(constB0,B)| -$F.
% 94.66/94.01  0 [] -range_31_0(B)| -v2465(VarCurr,B)|v2451(VarCurr,B).
% 94.66/94.01  0 [] -range_31_0(B)|v2465(VarCurr,B)| -v2451(VarCurr,B).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex39)|v2451(VarCurr,bitIndex7).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex39)| -v2451(VarCurr,bitIndex7).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex38)|v2451(VarCurr,bitIndex6).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex38)| -v2451(VarCurr,bitIndex6).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex37)|v2451(VarCurr,bitIndex5).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex37)| -v2451(VarCurr,bitIndex5).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex36)|v2451(VarCurr,bitIndex4).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex36)| -v2451(VarCurr,bitIndex4).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex35)|v2451(VarCurr,bitIndex3).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex35)| -v2451(VarCurr,bitIndex3).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex34)|v2451(VarCurr,bitIndex2).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex34)| -v2451(VarCurr,bitIndex2).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex33)|v2451(VarCurr,bitIndex1).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex33)| -v2451(VarCurr,bitIndex1).
% 94.66/94.01  0 [] -v2465(VarCurr,bitIndex32)|v2451(VarCurr,bitIndex0).
% 94.66/94.01  0 [] v2465(VarCurr,bitIndex32)| -v2451(VarCurr,bitIndex0).
% 94.66/94.01  0 [] -range_31_0(B)| -v2451(constB0,B)|$T.
% 94.66/94.01  0 [] -range_31_0(B)|v2451(constB0,B)| -$T.
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex31).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex30).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex29).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex28).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex27).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex26).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex25).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex24).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex23).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex22).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex21).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex20).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex19).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex18).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex17).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex16).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex15).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex14).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex13).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex12).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex11).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex10).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex9).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex8).
% 94.66/94.01  0 [] b11111111111111111111111111111111(bitIndex7).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex6).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex5).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex4).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex3).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex2).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex1).
% 94.66/94.02  0 [] b11111111111111111111111111111111(bitIndex0).
% 94.66/94.02  0 [] -v2426(VarCurr)|v2437(VarCurr).
% 94.66/94.02  0 [] -v2426(VarCurr)|v2439(VarCurr).
% 94.66/94.02  0 [] v2426(VarCurr)| -v2437(VarCurr)| -v2439(VarCurr).
% 94.66/94.02  0 [] v2439(VarCurr)|v2322(VarCurr).
% 94.66/94.02  0 [] -v2439(VarCurr)| -v2322(VarCurr).
% 94.66/94.02  0 [] -v2437(VarCurr)|v2438(VarCurr).
% 94.66/94.02  0 [] -v2437(VarCurr)|v2365(VarCurr).
% 94.66/94.02  0 [] v2437(VarCurr)| -v2438(VarCurr)| -v2365(VarCurr).
% 94.66/94.02  0 [] -v2438(VarCurr)|v2304(VarCurr).
% 94.66/94.02  0 [] -v2438(VarCurr)|v2428(VarCurr).
% 94.66/94.02  0 [] v2438(VarCurr)| -v2304(VarCurr)| -v2428(VarCurr).
% 94.66/94.02  0 [] -v2428(VarCurr)|v2430(VarCurr).
% 94.66/94.02  0 [] v2428(VarCurr)| -v2430(VarCurr).
% 94.66/94.02  0 [] -v2430(VarCurr)|v2432(VarCurr).
% 94.66/94.02  0 [] v2430(VarCurr)| -v2432(VarCurr).
% 94.66/94.02  0 [] v2434(VarCurr)| -v2432(VarCurr)|$F.
% 94.66/94.02  0 [] v2434(VarCurr)|v2432(VarCurr)| -$F.
% 94.66/94.02  0 [] -v2434(VarCurr)| -v2432(VarCurr)|$T.
% 94.66/94.02  0 [] -v2434(VarCurr)|v2432(VarCurr)| -$T.
% 94.66/94.02  0 [] -v2434(VarCurr)|v2435(VarCurr).
% 94.66/94.02  0 [] -v2434(VarCurr)|v170(VarCurr).
% 94.66/94.02  0 [] v2434(VarCurr)| -v2435(VarCurr)| -v170(VarCurr).
% 94.66/94.02  0 [] v2435(VarCurr)|v145(VarCurr,bitIndex0).
% 94.66/94.02  0 [] -v2435(VarCurr)| -v145(VarCurr,bitIndex0).
% 94.66/94.02  0 [] -v2422(VarCurr)|v2424(VarCurr).
% 94.66/94.02  0 [] v2422(VarCurr)| -v2424(VarCurr).
% 94.66/94.02  0 [] -v2424(VarCurr)|v149(VarCurr,bitIndex53).
% 94.66/94.02  0 [] v2424(VarCurr)| -v149(VarCurr,bitIndex53).
% 94.66/94.02  0 [] -v149(VarCurr,bitIndex53)|v151(VarCurr,bitIndex53).
% 94.66/94.02  0 [] v149(VarCurr,bitIndex53)| -v151(VarCurr,bitIndex53).
% 94.66/94.02  0 [] -v151(VarCurr,bitIndex53)|v156(VarCurr,bitIndex53).
% 94.66/94.02  0 [] v151(VarCurr,bitIndex53)| -v156(VarCurr,bitIndex53).
% 94.66/94.02  0 [] -v2412(VarCurr)|v2414(VarCurr).
% 94.66/94.02  0 [] -v2412(VarCurr)|v2416(VarCurr).
% 94.66/94.02  0 [] v2412(VarCurr)| -v2414(VarCurr)| -v2416(VarCurr).
% 94.66/94.02  0 [] v2416(VarCurr)|v2322(VarCurr).
% 94.66/94.02  0 [] -v2416(VarCurr)| -v2322(VarCurr).
% 94.66/94.02  0 [] -v2414(VarCurr)|v2306(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2414(VarCurr)| -v2306(VarCurr,bitIndex0).
% 94.66/94.02  0 [] -v2306(VarCurr,bitIndex0)|v2394(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2306(VarCurr,bitIndex0)| -v2394(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2409(VarCurr)| -v2302(VarCurr,bitIndex9)|$F.
% 94.66/94.02  0 [] v2409(VarCurr)|v2302(VarCurr,bitIndex9)| -$F.
% 94.66/94.02  0 [] -v2409(VarCurr)| -v2302(VarCurr,bitIndex9)|$T.
% 94.66/94.02  0 [] -v2409(VarCurr)|v2302(VarCurr,bitIndex9)| -$T.
% 94.66/94.02  0 [] -v2409(VarCurr)|v2304(VarCurr).
% 94.66/94.02  0 [] -v2409(VarCurr)|v2410(VarCurr).
% 94.66/94.02  0 [] v2409(VarCurr)| -v2304(VarCurr)| -v2410(VarCurr).
% 94.66/94.02  0 [] -v2410(VarCurr)| -$T|v2397(VarCurr,bitIndex8).
% 94.66/94.02  0 [] -v2410(VarCurr)|$T| -v2397(VarCurr,bitIndex8).
% 94.66/94.02  0 [] v2410(VarCurr)|$T|v2397(VarCurr,bitIndex8).
% 94.66/94.02  0 [] v2410(VarCurr)| -$T| -v2397(VarCurr,bitIndex8).
% 94.66/94.02  0 [] v2406(VarCurr)| -v2302(VarCurr,bitIndex6)|$F.
% 94.66/94.02  0 [] v2406(VarCurr)|v2302(VarCurr,bitIndex6)| -$F.
% 94.66/94.02  0 [] -v2406(VarCurr)| -v2302(VarCurr,bitIndex6)|$T.
% 94.66/94.02  0 [] -v2406(VarCurr)|v2302(VarCurr,bitIndex6)| -$T.
% 94.66/94.02  0 [] -v2406(VarCurr)|v2304(VarCurr).
% 94.66/94.02  0 [] -v2406(VarCurr)|v2407(VarCurr).
% 94.66/94.02  0 [] v2406(VarCurr)| -v2304(VarCurr)| -v2407(VarCurr).
% 94.66/94.02  0 [] -v2407(VarCurr)| -$T|v2397(VarCurr,bitIndex5).
% 94.66/94.02  0 [] -v2407(VarCurr)|$T| -v2397(VarCurr,bitIndex5).
% 94.66/94.02  0 [] v2407(VarCurr)|$T|v2397(VarCurr,bitIndex5).
% 94.66/94.02  0 [] v2407(VarCurr)| -$T| -v2397(VarCurr,bitIndex5).
% 94.66/94.02  0 [] v2399(VarCurr)| -v2302(VarCurr,bitIndex3)|$F.
% 94.66/94.02  0 [] v2399(VarCurr)|v2302(VarCurr,bitIndex3)| -$F.
% 94.66/94.02  0 [] -v2399(VarCurr)| -v2302(VarCurr,bitIndex3)|$T.
% 94.66/94.02  0 [] -v2399(VarCurr)|v2302(VarCurr,bitIndex3)| -$T.
% 94.66/94.02  0 [] -v2399(VarCurr)|v2400(VarCurr).
% 94.66/94.02  0 [] -v2399(VarCurr)|v2402(VarCurr).
% 94.66/94.02  0 [] v2399(VarCurr)| -v2400(VarCurr)| -v2402(VarCurr).
% 94.66/94.02  0 [] -v2402(VarCurr)| -$T|v2397(VarCurr,bitIndex2).
% 94.66/94.02  0 [] -v2402(VarCurr)|$T| -v2397(VarCurr,bitIndex2).
% 94.66/94.02  0 [] v2402(VarCurr)|$T|v2397(VarCurr,bitIndex2).
% 94.66/94.02  0 [] v2402(VarCurr)| -$T| -v2397(VarCurr,bitIndex2).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex11).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex10).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex8).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex7).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex5).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex4).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex2).
% 94.66/94.02  0 [] -v2397(constB0,bitIndex1).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex10).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex9).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex7).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex6).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex4).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex3).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex1).
% 94.66/94.02  0 [] -bx00x00x00x00(bitIndex0).
% 94.66/94.02  0 [] -v2400(VarCurr)|v2365(VarCurr).
% 94.66/94.02  0 [] -v2400(VarCurr)|v2304(VarCurr).
% 94.66/94.02  0 [] v2400(VarCurr)| -v2365(VarCurr)| -v2304(VarCurr).
% 94.66/94.02  0 [] -v2304(VarCurr)|v2306(VarCurr,bitIndex1).
% 94.66/94.02  0 [] v2304(VarCurr)| -v2306(VarCurr,bitIndex1).
% 94.66/94.02  0 [] -v2306(VarCurr,bitIndex1)|v2394(VarCurr,bitIndex1).
% 94.66/94.02  0 [] v2306(VarCurr,bitIndex1)| -v2394(VarCurr,bitIndex1).
% 94.66/94.02  0 [] -range_1_0(B)| -v2394(VarCurr,B)|v2308(VarCurr,B).
% 94.66/94.02  0 [] -range_1_0(B)| -v2394(VarCurr,B)|v2395(VarCurr,B).
% 94.66/94.02  0 [] -range_1_0(B)|v2394(VarCurr,B)| -v2308(VarCurr,B)| -v2395(VarCurr,B).
% 94.66/94.02  0 [] -range_1_0(B)| -v2395(VarCurr,B)| -v2338(VarCurr,B).
% 94.66/94.02  0 [] -range_1_0(B)|v2395(VarCurr,B)|v2338(VarCurr,B).
% 94.66/94.02  0 [] -v2338(VarCurr,bitIndex1)|v2338(VarCurr,bitIndex0)|v2308(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2338(VarCurr,bitIndex1)| -v2338(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2338(VarCurr,bitIndex1)| -v2308(VarCurr,bitIndex0).
% 94.66/94.02  0 [] -v2308(VarCurr,bitIndex0)|v2335(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2308(VarCurr,bitIndex0)| -v2335(VarCurr,bitIndex0).
% 94.66/94.02  0 [] -v2336(VarCurr)|v2390(VarCurr).
% 94.66/94.02  0 [] -v2336(VarCurr)|v2392(VarCurr).
% 94.66/94.02  0 [] v2336(VarCurr)| -v2390(VarCurr)| -v2392(VarCurr).
% 94.66/94.02  0 [] v2392(VarCurr)|v2312(VarCurr).
% 94.66/94.02  0 [] -v2392(VarCurr)| -v2312(VarCurr).
% 94.66/94.02  0 [] -v2390(VarCurr)|v2391(VarCurr).
% 94.66/94.02  0 [] -v2390(VarCurr)|v2365(VarCurr).
% 94.66/94.02  0 [] v2390(VarCurr)| -v2391(VarCurr)| -v2365(VarCurr).
% 94.66/94.02  0 [] -v2391(VarCurr)|v2341(VarCurr)|v2363(VarCurr).
% 94.66/94.02  0 [] -v2391(VarCurr)| -v2341(VarCurr)| -v2363(VarCurr).
% 94.66/94.02  0 [] v2391(VarCurr)| -v2341(VarCurr)|v2363(VarCurr).
% 94.66/94.02  0 [] v2391(VarCurr)|v2341(VarCurr)| -v2363(VarCurr).
% 94.66/94.02  0 [] -v2365(VarCurr)|v2367(VarCurr).
% 94.66/94.02  0 [] v2365(VarCurr)| -v2367(VarCurr).
% 94.66/94.02  0 [] -v2367(VarCurr)|v2369(VarCurr).
% 94.66/94.02  0 [] v2367(VarCurr)| -v2369(VarCurr).
% 94.66/94.02  0 [] -v2369(VarCurr)|v2374(VarCurr)|v2371(VarCurr,bitIndex15).
% 94.66/94.02  0 [] v2369(VarCurr)| -v2374(VarCurr).
% 94.66/94.02  0 [] v2369(VarCurr)| -v2371(VarCurr,bitIndex15).
% 94.66/94.02  0 [] -v2374(VarCurr)|v2375(VarCurr)|v2371(VarCurr,bitIndex14).
% 94.66/94.02  0 [] v2374(VarCurr)| -v2375(VarCurr).
% 94.66/94.02  0 [] v2374(VarCurr)| -v2371(VarCurr,bitIndex14).
% 94.66/94.02  0 [] -v2375(VarCurr)|v2376(VarCurr)|v2371(VarCurr,bitIndex13).
% 94.66/94.02  0 [] v2375(VarCurr)| -v2376(VarCurr).
% 94.66/94.02  0 [] v2375(VarCurr)| -v2371(VarCurr,bitIndex13).
% 94.66/94.02  0 [] -v2376(VarCurr)|v2377(VarCurr)|v2371(VarCurr,bitIndex12).
% 94.66/94.02  0 [] v2376(VarCurr)| -v2377(VarCurr).
% 94.66/94.02  0 [] v2376(VarCurr)| -v2371(VarCurr,bitIndex12).
% 94.66/94.02  0 [] -v2377(VarCurr)|v2378(VarCurr)|v2371(VarCurr,bitIndex11).
% 94.66/94.02  0 [] v2377(VarCurr)| -v2378(VarCurr).
% 94.66/94.02  0 [] v2377(VarCurr)| -v2371(VarCurr,bitIndex11).
% 94.66/94.02  0 [] -v2378(VarCurr)|v2379(VarCurr)|v2371(VarCurr,bitIndex10).
% 94.66/94.02  0 [] v2378(VarCurr)| -v2379(VarCurr).
% 94.66/94.02  0 [] v2378(VarCurr)| -v2371(VarCurr,bitIndex10).
% 94.66/94.02  0 [] -v2379(VarCurr)|v2380(VarCurr)|v2371(VarCurr,bitIndex9).
% 94.66/94.02  0 [] v2379(VarCurr)| -v2380(VarCurr).
% 94.66/94.02  0 [] v2379(VarCurr)| -v2371(VarCurr,bitIndex9).
% 94.66/94.02  0 [] -v2380(VarCurr)|v2381(VarCurr)|v2371(VarCurr,bitIndex8).
% 94.66/94.02  0 [] v2380(VarCurr)| -v2381(VarCurr).
% 94.66/94.02  0 [] v2380(VarCurr)| -v2371(VarCurr,bitIndex8).
% 94.66/94.02  0 [] -v2381(VarCurr)|v2382(VarCurr)|v2371(VarCurr,bitIndex7).
% 94.66/94.02  0 [] v2381(VarCurr)| -v2382(VarCurr).
% 94.66/94.02  0 [] v2381(VarCurr)| -v2371(VarCurr,bitIndex7).
% 94.66/94.02  0 [] -v2382(VarCurr)|v2383(VarCurr)|v2371(VarCurr,bitIndex6).
% 94.66/94.02  0 [] v2382(VarCurr)| -v2383(VarCurr).
% 94.66/94.02  0 [] v2382(VarCurr)| -v2371(VarCurr,bitIndex6).
% 94.66/94.02  0 [] -v2383(VarCurr)|v2384(VarCurr)|v2371(VarCurr,bitIndex5).
% 94.66/94.02  0 [] v2383(VarCurr)| -v2384(VarCurr).
% 94.66/94.02  0 [] v2383(VarCurr)| -v2371(VarCurr,bitIndex5).
% 94.66/94.02  0 [] -v2384(VarCurr)|v2385(VarCurr)|v2371(VarCurr,bitIndex4).
% 94.66/94.02  0 [] v2384(VarCurr)| -v2385(VarCurr).
% 94.66/94.02  0 [] v2384(VarCurr)| -v2371(VarCurr,bitIndex4).
% 94.66/94.02  0 [] -v2385(VarCurr)|v2386(VarCurr)|v2371(VarCurr,bitIndex3).
% 94.66/94.02  0 [] v2385(VarCurr)| -v2386(VarCurr).
% 94.66/94.02  0 [] v2385(VarCurr)| -v2371(VarCurr,bitIndex3).
% 94.66/94.02  0 [] -v2386(VarCurr)|v2387(VarCurr)|v2371(VarCurr,bitIndex2).
% 94.66/94.02  0 [] v2386(VarCurr)| -v2387(VarCurr).
% 94.66/94.02  0 [] v2386(VarCurr)| -v2371(VarCurr,bitIndex2).
% 94.66/94.02  0 [] -v2387(VarCurr)|v2371(VarCurr,bitIndex0)|v2371(VarCurr,bitIndex1).
% 94.66/94.02  0 [] v2387(VarCurr)| -v2371(VarCurr,bitIndex0).
% 94.66/94.02  0 [] v2387(VarCurr)| -v2371(VarCurr,bitIndex1).
% 94.66/94.03  0 [] -range_15_0(B)| -v2371(constB0,B)|$T.
% 94.66/94.03  0 [] -range_15_0(B)|v2371(constB0,B)| -$T.
% 94.66/94.03  0 [] b1111111111111111(bitIndex15).
% 94.66/94.03  0 [] b1111111111111111(bitIndex14).
% 94.66/94.03  0 [] b1111111111111111(bitIndex13).
% 94.66/94.03  0 [] b1111111111111111(bitIndex12).
% 94.66/94.03  0 [] b1111111111111111(bitIndex11).
% 94.66/94.03  0 [] b1111111111111111(bitIndex10).
% 94.66/94.03  0 [] b1111111111111111(bitIndex9).
% 94.66/94.03  0 [] b1111111111111111(bitIndex8).
% 94.66/94.03  0 [] b1111111111111111(bitIndex7).
% 94.66/94.03  0 [] b1111111111111111(bitIndex6).
% 94.66/94.03  0 [] b1111111111111111(bitIndex5).
% 94.66/94.03  0 [] b1111111111111111(bitIndex4).
% 94.66/94.03  0 [] b1111111111111111(bitIndex3).
% 94.66/94.03  0 [] b1111111111111111(bitIndex2).
% 94.66/94.03  0 [] b1111111111111111(bitIndex1).
% 94.66/94.03  0 [] b1111111111111111(bitIndex0).
% 94.66/94.03  0 [] -v2341(VarCurr)|v2343(VarCurr).
% 94.66/94.03  0 [] v2341(VarCurr)| -v2343(VarCurr).
% 94.66/94.03  0 [] -v2343(VarCurr)|v2345(VarCurr).
% 94.66/94.03  0 [] v2343(VarCurr)| -v2345(VarCurr).
% 94.66/94.03  0 [] -v2345(VarCurr)|v2347(VarCurr).
% 94.66/94.03  0 [] v2345(VarCurr)| -v2347(VarCurr).
% 94.66/94.03  0 [] -v2347(VarCurr)|v2349(VarCurr).
% 94.66/94.03  0 [] v2347(VarCurr)| -v2349(VarCurr).
% 94.66/94.03  0 [] -v2349(VarCurr)|v2351(VarCurr).
% 94.66/94.03  0 [] v2349(VarCurr)| -v2351(VarCurr).
% 94.66/94.03  0 [] -v2351(VarCurr)|v2353(VarCurr).
% 94.66/94.03  0 [] v2351(VarCurr)| -v2353(VarCurr).
% 94.66/94.03  0 [] -v2353(VarCurr)|v2355(VarCurr).
% 94.66/94.03  0 [] v2353(VarCurr)| -v2355(VarCurr).
% 94.66/94.03  0 [] -v2355(VarCurr)|v2357(VarCurr).
% 94.66/94.03  0 [] v2355(VarCurr)| -v2357(VarCurr).
% 94.66/94.03  0 [] -v2357(VarCurr)|v2359(VarCurr).
% 94.66/94.03  0 [] v2357(VarCurr)| -v2359(VarCurr).
% 94.66/94.03  0 [] -v2359(VarCurr)|v2361(VarCurr).
% 94.66/94.03  0 [] v2359(VarCurr)| -v2361(VarCurr).
% 94.66/94.03  0 [] -v2361(constB0)|$F.
% 94.66/94.03  0 [] v2361(constB0)| -$F.
% 94.66/94.03  0 [] -v2338(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] v2338(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] -v2308(VarCurr,bitIndex1)|v2335(VarCurr,bitIndex1).
% 94.66/94.03  0 [] v2308(VarCurr,bitIndex1)| -v2335(VarCurr,bitIndex1).
% 94.66/94.03  0 [] -v2335(VarCurr,bitIndex0)|v2336(VarCurr).
% 94.66/94.03  0 [] v2335(VarCurr,bitIndex0)| -v2336(VarCurr).
% 94.66/94.03  0 [] -v2335(VarCurr,bitIndex1)|v2310(VarCurr).
% 94.66/94.03  0 [] v2335(VarCurr,bitIndex1)| -v2310(VarCurr).
% 94.66/94.03  0 [] -v2310(VarCurr)|v2331(VarCurr).
% 94.66/94.03  0 [] -v2310(VarCurr)|v2334(VarCurr).
% 94.66/94.03  0 [] v2310(VarCurr)| -v2331(VarCurr)| -v2334(VarCurr).
% 94.66/94.03  0 [] v2334(VarCurr)|v2320(VarCurr).
% 94.66/94.03  0 [] -v2334(VarCurr)| -v2320(VarCurr).
% 94.66/94.03  0 [] -v2331(VarCurr)|v2332(VarCurr).
% 94.66/94.03  0 [] -v2331(VarCurr)|v2333(VarCurr).
% 94.66/94.03  0 [] v2331(VarCurr)| -v2332(VarCurr)| -v2333(VarCurr).
% 94.66/94.03  0 [] v2333(VarCurr)|v2312(VarCurr).
% 94.66/94.03  0 [] -v2333(VarCurr)| -v2312(VarCurr).
% 94.66/94.03  0 [] v2332(VarCurr)|v129(VarCurr).
% 94.66/94.03  0 [] -v2332(VarCurr)| -v129(VarCurr).
% 94.66/94.03  0 [] -v2320(VarCurr)|v2328(VarCurr)|v2326(VarCurr).
% 94.66/94.03  0 [] v2320(VarCurr)| -v2328(VarCurr).
% 94.66/94.03  0 [] v2320(VarCurr)| -v2326(VarCurr).
% 94.66/94.03  0 [] -v2326(constB0)|$F.
% 94.66/94.03  0 [] v2326(constB0)| -$F.
% 94.66/94.03  0 [] -v2328(VarCurr)|v2322(VarCurr).
% 94.66/94.03  0 [] -v2328(VarCurr)|v2329(VarCurr).
% 94.66/94.03  0 [] v2328(VarCurr)| -v2322(VarCurr)| -v2329(VarCurr).
% 94.66/94.03  0 [] v2329(VarCurr)|v2324(VarCurr).
% 94.66/94.03  0 [] -v2329(VarCurr)| -v2324(VarCurr).
% 94.66/94.03  0 [] -v2324(constB0)|$F.
% 94.66/94.03  0 [] v2324(constB0)| -$F.
% 94.66/94.03  0 [] -v2322(constB0)|$F.
% 94.66/94.03  0 [] v2322(constB0)| -$F.
% 94.66/94.03  0 [] -v2312(VarCurr)|v2314(VarCurr).
% 94.66/94.03  0 [] v2312(VarCurr)| -v2314(VarCurr).
% 94.66/94.03  0 [] -v2314(VarCurr)|v2316(VarCurr).
% 94.66/94.03  0 [] v2314(VarCurr)| -v2316(VarCurr).
% 94.66/94.03  0 [] -v2316(VarCurr)|v2318(VarCurr).
% 94.66/94.03  0 [] v2316(VarCurr)| -v2318(VarCurr).
% 94.66/94.03  0 [] -v2275(VarCurr)|v2278(VarCurr).
% 94.66/94.03  0 [] -v2275(VarCurr)|v875(VarCurr).
% 94.66/94.03  0 [] v2275(VarCurr)| -v2278(VarCurr)| -v875(VarCurr).
% 94.66/94.03  0 [] -v2278(VarCurr)|v2279(VarCurr)|v2288(VarCurr).
% 94.66/94.03  0 [] v2278(VarCurr)| -v2279(VarCurr).
% 94.66/94.03  0 [] v2278(VarCurr)| -v2288(VarCurr).
% 94.66/94.03  0 [] -v2288(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2288(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2288(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.03  0 [] -v2288(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.03  0 [] -v2288(VarCurr)| -v743(VarCurr,bitIndex1)|$T.
% 94.66/94.03  0 [] -v2288(VarCurr)|v743(VarCurr,bitIndex1)| -$T.
% 94.66/94.03  0 [] -v2288(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.66/94.03  0 [] -v2288(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.03  0 [] v2288(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2288(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] -v2279(VarCurr)|v2280(VarCurr)|v2287(VarCurr).
% 94.66/94.03  0 [] v2279(VarCurr)| -v2280(VarCurr).
% 94.66/94.03  0 [] v2279(VarCurr)| -v2287(VarCurr).
% 94.66/94.03  0 [] -v2287(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2287(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2287(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.03  0 [] -v2287(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.03  0 [] -v2287(VarCurr)| -v743(VarCurr,bitIndex1)|$T.
% 94.66/94.03  0 [] -v2287(VarCurr)|v743(VarCurr,bitIndex1)| -$T.
% 94.66/94.03  0 [] -v2287(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] -v2287(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] v2287(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] v2287(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] v2287(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] v2287(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] b1110(bitIndex3).
% 94.66/94.03  0 [] b1110(bitIndex2).
% 94.66/94.03  0 [] b1110(bitIndex1).
% 94.66/94.03  0 [] -b1110(bitIndex0).
% 94.66/94.03  0 [] -v2280(VarCurr)|v2281(VarCurr)|v2286(VarCurr).
% 94.66/94.03  0 [] v2280(VarCurr)| -v2281(VarCurr).
% 94.66/94.03  0 [] v2280(VarCurr)| -v2286(VarCurr).
% 94.66/94.03  0 [] -v2286(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2286(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2286(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.03  0 [] -v2286(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.03  0 [] -v2286(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.03  0 [] -v2286(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.03  0 [] -v2286(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.66/94.03  0 [] -v2286(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.03  0 [] v2286(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|$F|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2286(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -$F|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2286(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|$F| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2286(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -$F| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] -v2281(VarCurr)|v2282(VarCurr)|v2285(VarCurr).
% 94.66/94.03  0 [] v2281(VarCurr)| -v2282(VarCurr).
% 94.66/94.03  0 [] v2281(VarCurr)| -v2285(VarCurr).
% 94.66/94.03  0 [] -v2285(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2285(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2285(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.03  0 [] -v2285(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.03  0 [] -v2285(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.03  0 [] -v2285(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.03  0 [] -v2285(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] -v2285(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] v2285(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|$F|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2285(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -$F| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2285(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|$F|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2285(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -$F| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] -v2282(VarCurr)|v2283(VarCurr)|v2284(VarCurr).
% 94.66/94.03  0 [] v2282(VarCurr)| -v2283(VarCurr).
% 94.66/94.03  0 [] v2282(VarCurr)| -v2284(VarCurr).
% 94.66/94.03  0 [] -v2284(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2284(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2284(VarCurr)| -v743(VarCurr,bitIndex2)|$F.
% 94.66/94.03  0 [] -v2284(VarCurr)|v743(VarCurr,bitIndex2)| -$F.
% 94.66/94.03  0 [] -v2284(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.03  0 [] -v2284(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.03  0 [] -v2284(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.66/94.03  0 [] -v2284(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.03  0 [] v2284(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|$F|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2284(VarCurr)|v743(VarCurr,bitIndex3)|$T| -v743(VarCurr,bitIndex2)| -$F| -v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2284(VarCurr)| -v743(VarCurr,bitIndex3)| -$T|v743(VarCurr,bitIndex2)|$F|v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2284(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -$F| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] -v2283(VarCurr)| -v743(VarCurr,bitIndex3)|$T.
% 94.66/94.03  0 [] -v2283(VarCurr)|v743(VarCurr,bitIndex3)| -$T.
% 94.66/94.03  0 [] -v2283(VarCurr)| -v743(VarCurr,bitIndex2)|$F.
% 94.66/94.03  0 [] -v2283(VarCurr)|v743(VarCurr,bitIndex2)| -$F.
% 94.66/94.03  0 [] -v2283(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.03  0 [] -v2283(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.03  0 [] -v2283(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.66/94.03  0 [] -v2283(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.03  0 [] v2283(VarCurr)|v743(VarCurr,bitIndex3)|$T|v743(VarCurr,bitIndex2)|$F|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2283(VarCurr)|v743(VarCurr,bitIndex3)|$T| -v743(VarCurr,bitIndex2)| -$F| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2283(VarCurr)| -v743(VarCurr,bitIndex3)| -$T|v743(VarCurr,bitIndex2)|$F|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] v2283(VarCurr)| -v743(VarCurr,bitIndex3)| -$T| -v743(VarCurr,bitIndex2)| -$F| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.03  0 [] b1000(bitIndex3).
% 94.66/94.03  0 [] -b1000(bitIndex2).
% 94.66/94.04  0 [] -b1000(bitIndex1).
% 94.66/94.04  0 [] -b1000(bitIndex0).
% 94.66/94.04  0 [] -v2265(VarCurr)|v2267(VarCurr).
% 94.66/94.04  0 [] -v2265(VarCurr)|v875(VarCurr).
% 94.66/94.04  0 [] v2265(VarCurr)| -v2267(VarCurr)| -v875(VarCurr).
% 94.66/94.04  0 [] -v2267(VarCurr)|v2268(VarCurr)|v2273(VarCurr).
% 94.66/94.04  0 [] v2267(VarCurr)| -v2268(VarCurr).
% 94.66/94.04  0 [] v2267(VarCurr)| -v2273(VarCurr).
% 94.66/94.04  0 [] -v2273(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.66/94.04  0 [] -v2273(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.66/94.04  0 [] -v2273(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.04  0 [] -v2273(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.04  0 [] -v2273(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.04  0 [] -v2273(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.04  0 [] -v2273(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] -v2273(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] v2273(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2273(VarCurr)|v743(VarCurr,bitIndex3)|$F| -v743(VarCurr,bitIndex2)| -$T|v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2273(VarCurr)| -v743(VarCurr,bitIndex3)| -$F|v743(VarCurr,bitIndex2)|$T| -v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2273(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v2268(VarCurr)|v2269(VarCurr)|v2272(VarCurr).
% 94.66/94.04  0 [] v2268(VarCurr)| -v2269(VarCurr).
% 94.66/94.04  0 [] v2268(VarCurr)| -v2272(VarCurr).
% 94.66/94.04  0 [] -v2272(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.66/94.04  0 [] -v2272(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.66/94.04  0 [] -v2272(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.66/94.04  0 [] -v2272(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.66/94.04  0 [] -v2272(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.04  0 [] -v2272(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.04  0 [] -v2272(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.66/94.04  0 [] -v2272(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.04  0 [] v2272(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2272(VarCurr)|v743(VarCurr,bitIndex3)|$F| -v743(VarCurr,bitIndex2)| -$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2272(VarCurr)| -v743(VarCurr,bitIndex3)| -$F|v743(VarCurr,bitIndex2)|$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2272(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v2269(VarCurr)|v2270(VarCurr)|v2271(VarCurr).
% 94.66/94.04  0 [] v2269(VarCurr)| -v2270(VarCurr).
% 94.66/94.04  0 [] v2269(VarCurr)| -v2271(VarCurr).
% 94.66/94.04  0 [] -v2271(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.66/94.04  0 [] -v2271(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.66/94.04  0 [] -v2271(VarCurr)| -v743(VarCurr,bitIndex2)|$F.
% 94.66/94.04  0 [] -v2271(VarCurr)|v743(VarCurr,bitIndex2)| -$F.
% 94.66/94.04  0 [] -v2271(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.04  0 [] -v2271(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.04  0 [] -v2271(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] -v2271(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] v2271(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] v2271(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] v2271(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] v2271(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] -v2270(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.66/94.04  0 [] -v2270(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.66/94.04  0 [] -v2270(VarCurr)| -v743(VarCurr,bitIndex2)|$F.
% 94.66/94.04  0 [] -v2270(VarCurr)|v743(VarCurr,bitIndex2)| -$F.
% 94.66/94.04  0 [] -v2270(VarCurr)| -v743(VarCurr,bitIndex1)|$F.
% 94.66/94.04  0 [] -v2270(VarCurr)|v743(VarCurr,bitIndex1)| -$F.
% 94.66/94.04  0 [] -v2270(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.66/94.04  0 [] -v2270(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.66/94.04  0 [] v2270(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2270(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1908(VarCurr)|v1910(VarCurr).
% 94.66/94.04  0 [] v1908(VarCurr)| -v1910(VarCurr).
% 94.66/94.04  0 [] -v1910(VarCurr)|v1912(VarCurr).
% 94.66/94.04  0 [] v1910(VarCurr)| -v1912(VarCurr).
% 94.66/94.04  0 [] -v1912(VarCurr)|v1914(VarCurr).
% 94.66/94.04  0 [] v1912(VarCurr)| -v1914(VarCurr).
% 94.66/94.04  0 [] -v1914(VarCurr)|v1916(VarCurr).
% 94.66/94.04  0 [] v1914(VarCurr)| -v1916(VarCurr).
% 94.66/94.04  0 [] -v1916(VarCurr)|v1918(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1916(VarCurr)| -v1918(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1918(VarCurr,bitIndex0)|v1920(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1918(VarCurr,bitIndex0)| -v1920(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1920(VarCurr,bitIndex0)|v1922(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1920(VarCurr,bitIndex0)| -v1922(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1922(VarCurr,bitIndex0)|v1924(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1922(VarCurr,bitIndex0)| -v1924(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1924(VarCurr,bitIndex0)|v1926(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1924(VarCurr,bitIndex0)| -v1926(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1926(VarCurr,bitIndex0)|v1928(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v1926(VarCurr,bitIndex0)| -v1928(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -v1928(VarCurr,bitIndex0)|v1930(VarCurr).
% 94.66/94.04  0 [] v1928(VarCurr,bitIndex0)| -v1930(VarCurr).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)|v2214(VarNext)| -v1930(VarNext)|v1930(VarCurr).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)|v2214(VarNext)|v1930(VarNext)| -v1930(VarCurr).
% 94.66/94.04  0 [] -v2214(VarNext)| -v1930(VarNext)|v2249(VarNext).
% 94.66/94.04  0 [] -v2214(VarNext)|v1930(VarNext)| -v2249(VarNext).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)| -v2249(VarNext)|v2247(VarCurr).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)|v2249(VarNext)| -v2247(VarCurr).
% 94.66/94.04  0 [] v1932(VarCurr)| -v2247(VarCurr)|v2250(VarCurr).
% 94.66/94.04  0 [] v1932(VarCurr)|v2247(VarCurr)| -v2250(VarCurr).
% 94.66/94.04  0 [] -v1932(VarCurr)| -v2247(VarCurr)|v1955(VarCurr).
% 94.66/94.04  0 [] -v1932(VarCurr)|v2247(VarCurr)| -v1955(VarCurr).
% 94.66/94.04  0 [] v2227(VarCurr)| -v2250(VarCurr)|v2203(VarCurr).
% 94.66/94.04  0 [] v2227(VarCurr)|v2250(VarCurr)| -v2203(VarCurr).
% 94.66/94.04  0 [] -v2227(VarCurr)| -v2250(VarCurr)|v2251(VarCurr).
% 94.66/94.04  0 [] -v2227(VarCurr)|v2250(VarCurr)| -v2251(VarCurr).
% 94.66/94.04  0 [] v2230(VarCurr)|v2232(VarCurr)| -v2251(VarCurr)|v2255(VarCurr).
% 94.66/94.04  0 [] v2230(VarCurr)|v2232(VarCurr)|v2251(VarCurr)| -v2255(VarCurr).
% 94.66/94.04  0 [] -v2232(VarCurr)| -v2251(VarCurr)|v2254(VarCurr).
% 94.66/94.04  0 [] -v2232(VarCurr)|v2251(VarCurr)| -v2254(VarCurr).
% 94.66/94.04  0 [] -v2230(VarCurr)| -v2251(VarCurr)|v2252(VarCurr).
% 94.66/94.04  0 [] -v2230(VarCurr)|v2251(VarCurr)| -v2252(VarCurr).
% 94.66/94.04  0 [] v2240(VarCurr)| -v2255(VarCurr)|v2203(VarCurr).
% 94.66/94.04  0 [] v2240(VarCurr)|v2255(VarCurr)| -v2203(VarCurr).
% 94.66/94.04  0 [] -v2240(VarCurr)| -v2255(VarCurr)|$T.
% 94.66/94.04  0 [] -v2240(VarCurr)|v2255(VarCurr)| -$T.
% 94.66/94.04  0 [] v2234(VarCurr)| -v2254(VarCurr)|v2203(VarCurr).
% 94.66/94.04  0 [] v2234(VarCurr)|v2254(VarCurr)| -v2203(VarCurr).
% 94.66/94.04  0 [] -v2234(VarCurr)| -v2254(VarCurr)|$F.
% 94.66/94.04  0 [] -v2234(VarCurr)|v2254(VarCurr)| -$F.
% 94.66/94.04  0 [] v2253(VarCurr)| -v2252(VarCurr)|$F.
% 94.66/94.04  0 [] v2253(VarCurr)|v2252(VarCurr)| -$F.
% 94.66/94.04  0 [] -v2253(VarCurr)| -v2252(VarCurr)|$T.
% 94.66/94.04  0 [] -v2253(VarCurr)|v2252(VarCurr)| -$T.
% 94.66/94.04  0 [] -v2253(VarCurr)| -v1964(VarCurr)|$T.
% 94.66/94.04  0 [] -v2253(VarCurr)|v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] v2253(VarCurr)|v1964(VarCurr)|$T.
% 94.66/94.04  0 [] v2253(VarCurr)| -v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)| -v2214(VarNext)|v2215(VarNext).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)| -v2214(VarNext)|v2224(VarNext).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)|v2214(VarNext)| -v2215(VarNext)| -v2224(VarNext).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)| -v2224(VarNext)|v2222(VarCurr).
% 94.66/94.04  0 [] -nextState(VarCurr,VarNext)|v2224(VarNext)| -v2222(VarCurr).
% 94.66/94.04  0 [] -v2222(VarCurr)|v1932(VarCurr)|v2225(VarCurr).
% 94.66/94.04  0 [] v2222(VarCurr)| -v1932(VarCurr).
% 94.66/94.04  0 [] v2222(VarCurr)| -v2225(VarCurr).
% 94.66/94.04  0 [] -v2225(VarCurr)|v2226(VarCurr).
% 94.66/94.04  0 [] -v2225(VarCurr)|v2246(VarCurr).
% 94.66/94.04  0 [] v2225(VarCurr)| -v2226(VarCurr)| -v2246(VarCurr).
% 94.66/94.04  0 [] v2246(VarCurr)|v1932(VarCurr).
% 94.66/94.04  0 [] -v2246(VarCurr)| -v1932(VarCurr).
% 94.66/94.04  0 [] -v2226(VarCurr)|v2227(VarCurr)|v2244(VarCurr).
% 94.66/94.04  0 [] v2226(VarCurr)| -v2227(VarCurr).
% 94.66/94.04  0 [] v2226(VarCurr)| -v2244(VarCurr).
% 94.66/94.04  0 [] -v2244(VarCurr)|v2039(VarCurr).
% 94.66/94.04  0 [] -v2244(VarCurr)|v2245(VarCurr).
% 94.66/94.04  0 [] v2244(VarCurr)| -v2039(VarCurr)| -v2245(VarCurr).
% 94.66/94.04  0 [] v2245(VarCurr)|v2043(VarCurr).
% 94.66/94.04  0 [] -v2245(VarCurr)| -v2043(VarCurr).
% 94.66/94.04  0 [] -v2227(VarCurr)|v2228(VarCurr).
% 94.66/94.04  0 [] -v2227(VarCurr)|v2043(VarCurr).
% 94.66/94.04  0 [] v2227(VarCurr)| -v2228(VarCurr)| -v2043(VarCurr).
% 94.66/94.04  0 [] -v2228(VarCurr)|v2229(VarCurr)|v2238(VarCurr).
% 94.66/94.04  0 [] v2228(VarCurr)| -v2229(VarCurr).
% 94.66/94.04  0 [] v2228(VarCurr)| -v2238(VarCurr).
% 94.66/94.04  0 [] -v2238(VarCurr)|v2239(VarCurr).
% 94.66/94.04  0 [] -v2238(VarCurr)|v2243(VarCurr).
% 94.66/94.04  0 [] v2238(VarCurr)| -v2239(VarCurr)| -v2243(VarCurr).
% 94.66/94.04  0 [] -v2243(VarCurr)| -v2231(VarCurr,bitIndex2)|$F.
% 94.66/94.04  0 [] -v2243(VarCurr)|v2231(VarCurr,bitIndex2)| -$F.
% 94.66/94.04  0 [] -v2243(VarCurr)| -v2231(VarCurr,bitIndex1)|$F.
% 94.66/94.04  0 [] -v2243(VarCurr)|v2231(VarCurr,bitIndex1)| -$F.
% 94.66/94.04  0 [] -v2243(VarCurr)| -v2231(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] -v2243(VarCurr)|v2231(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] v2243(VarCurr)|v2231(VarCurr,bitIndex2)|$F|v2231(VarCurr,bitIndex1)|v2231(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] v2243(VarCurr)|v2231(VarCurr,bitIndex2)|$F|v2231(VarCurr,bitIndex1)| -v2231(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] v2243(VarCurr)| -v2231(VarCurr,bitIndex2)| -$F| -v2231(VarCurr,bitIndex1)|v2231(VarCurr,bitIndex0)|$T.
% 94.66/94.04  0 [] v2243(VarCurr)| -v2231(VarCurr,bitIndex2)| -$F| -v2231(VarCurr,bitIndex1)| -v2231(VarCurr,bitIndex0)| -$T.
% 94.66/94.04  0 [] -b001(bitIndex2).
% 94.66/94.04  0 [] -b001(bitIndex1).
% 94.66/94.04  0 [] b001(bitIndex0).
% 94.66/94.04  0 [] -v2239(VarCurr)|v2240(VarCurr)|v2241(VarCurr).
% 94.66/94.04  0 [] v2239(VarCurr)| -v2240(VarCurr).
% 94.66/94.04  0 [] v2239(VarCurr)| -v2241(VarCurr).
% 94.66/94.04  0 [] -v2241(VarCurr)|v2039(VarCurr).
% 94.66/94.04  0 [] -v2241(VarCurr)|v2242(VarCurr).
% 94.66/94.04  0 [] v2241(VarCurr)| -v2039(VarCurr)| -v2242(VarCurr).
% 94.66/94.04  0 [] v2242(VarCurr)|v2240(VarCurr).
% 94.66/94.04  0 [] -v2242(VarCurr)| -v2240(VarCurr).
% 94.66/94.04  0 [] -v2240(VarCurr)| -v1964(VarCurr)|$T.
% 94.66/94.04  0 [] -v2240(VarCurr)|v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] v2240(VarCurr)|v1964(VarCurr)|$T.
% 94.66/94.04  0 [] v2240(VarCurr)| -v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] -v2229(VarCurr)|v2230(VarCurr)|v2232(VarCurr).
% 94.66/94.04  0 [] v2229(VarCurr)| -v2230(VarCurr).
% 94.66/94.04  0 [] v2229(VarCurr)| -v2232(VarCurr).
% 94.66/94.04  0 [] -v2232(VarCurr)|v2233(VarCurr).
% 94.66/94.04  0 [] -v2232(VarCurr)|v2237(VarCurr).
% 94.66/94.04  0 [] v2232(VarCurr)| -v2233(VarCurr)| -v2237(VarCurr).
% 94.66/94.04  0 [] -v2237(VarCurr)| -v2231(VarCurr,bitIndex2)|$F.
% 94.66/94.04  0 [] -v2237(VarCurr)|v2231(VarCurr,bitIndex2)| -$F.
% 94.66/94.04  0 [] -v2237(VarCurr)| -v2231(VarCurr,bitIndex1)|$T.
% 94.66/94.04  0 [] -v2237(VarCurr)|v2231(VarCurr,bitIndex1)| -$T.
% 94.66/94.04  0 [] -v2237(VarCurr)| -v2231(VarCurr,bitIndex0)|$F.
% 94.66/94.04  0 [] -v2237(VarCurr)|v2231(VarCurr,bitIndex0)| -$F.
% 94.66/94.04  0 [] v2237(VarCurr)|v2231(VarCurr,bitIndex2)|$F|v2231(VarCurr,bitIndex1)|$T|v2231(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2237(VarCurr)|v2231(VarCurr,bitIndex2)|$F| -v2231(VarCurr,bitIndex1)| -$T|v2231(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2237(VarCurr)| -v2231(VarCurr,bitIndex2)| -$F|v2231(VarCurr,bitIndex1)|$T| -v2231(VarCurr,bitIndex0).
% 94.66/94.04  0 [] v2237(VarCurr)| -v2231(VarCurr,bitIndex2)| -$F| -v2231(VarCurr,bitIndex1)| -$T| -v2231(VarCurr,bitIndex0).
% 94.66/94.04  0 [] -b010(bitIndex2).
% 94.66/94.04  0 [] b010(bitIndex1).
% 94.66/94.04  0 [] -b010(bitIndex0).
% 94.66/94.04  0 [] -v2233(VarCurr)|v2234(VarCurr)|v2235(VarCurr).
% 94.66/94.04  0 [] v2233(VarCurr)| -v2234(VarCurr).
% 94.66/94.04  0 [] v2233(VarCurr)| -v2235(VarCurr).
% 94.66/94.04  0 [] -v2235(VarCurr)|v2039(VarCurr).
% 94.66/94.04  0 [] -v2235(VarCurr)|v2236(VarCurr).
% 94.66/94.04  0 [] v2235(VarCurr)| -v2039(VarCurr)| -v2236(VarCurr).
% 94.66/94.04  0 [] v2236(VarCurr)|v2234(VarCurr).
% 94.66/94.04  0 [] -v2236(VarCurr)| -v2234(VarCurr).
% 94.66/94.04  0 [] -v2234(VarCurr)| -v1964(VarCurr)|$T.
% 94.66/94.04  0 [] -v2234(VarCurr)|v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] v2234(VarCurr)|v1964(VarCurr)|$T.
% 94.66/94.04  0 [] v2234(VarCurr)| -v1964(VarCurr)| -$T.
% 94.66/94.04  0 [] -v2230(VarCurr)| -v2231(VarCurr,bitIndex2)|$T.
% 94.70/94.05  0 [] -v2230(VarCurr)|v2231(VarCurr,bitIndex2)| -$T.
% 94.70/94.05  0 [] -v2230(VarCurr)| -v2231(VarCurr,bitIndex1)|$F.
% 94.70/94.05  0 [] -v2230(VarCurr)|v2231(VarCurr,bitIndex1)| -$F.
% 94.70/94.05  0 [] -v2230(VarCurr)| -v2231(VarCurr,bitIndex0)|$F.
% 94.70/94.05  0 [] -v2230(VarCurr)|v2231(VarCurr,bitIndex0)| -$F.
% 94.70/94.05  0 [] v2230(VarCurr)|v2231(VarCurr,bitIndex2)|$T|v2231(VarCurr,bitIndex1)|$F|v2231(VarCurr,bitIndex0).
% 94.70/94.05  0 [] v2230(VarCurr)|v2231(VarCurr,bitIndex2)|$T| -v2231(VarCurr,bitIndex1)| -$F| -v2231(VarCurr,bitIndex0).
% 94.70/94.05  0 [] v2230(VarCurr)| -v2231(VarCurr,bitIndex2)| -$T|v2231(VarCurr,bitIndex1)|$F|v2231(VarCurr,bitIndex0).
% 94.70/94.05  0 [] v2230(VarCurr)| -v2231(VarCurr,bitIndex2)| -$T| -v2231(VarCurr,bitIndex1)| -$F| -v2231(VarCurr,bitIndex0).
% 94.70/94.05  0 [] b100(bitIndex2).
% 94.70/94.05  0 [] -b100(bitIndex1).
% 94.70/94.05  0 [] -b100(bitIndex0).
% 94.70/94.05  0 [] -v2231(VarCurr,bitIndex0)|v1961(VarCurr).
% 94.70/94.05  0 [] v2231(VarCurr,bitIndex0)| -v1961(VarCurr).
% 94.70/94.05  0 [] -v2231(VarCurr,bitIndex1)|v1959(VarCurr).
% 94.70/94.05  0 [] v2231(VarCurr,bitIndex1)| -v1959(VarCurr).
% 94.70/94.05  0 [] -v2231(VarCurr,bitIndex2)|v1957(VarCurr).
% 94.70/94.05  0 [] v2231(VarCurr,bitIndex2)| -v1957(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2215(VarNext)|v2216(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2215(VarNext)|v2205(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2215(VarNext)| -v2216(VarNext)| -v2205(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2216(VarNext)|v2218(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2216(VarNext)| -v2218(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2218(VarNext)|v2205(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2218(VarNext)| -v2205(VarCurr).
% 94.70/94.05  0 [] -v2205(VarCurr)|v2207(VarCurr).
% 94.70/94.05  0 [] v2205(VarCurr)| -v2207(VarCurr).
% 94.70/94.05  0 [] -v2207(VarCurr)|v2209(VarCurr).
% 94.70/94.05  0 [] v2207(VarCurr)| -v2209(VarCurr).
% 94.70/94.05  0 [] -v2209(VarCurr)|v2211(VarCurr).
% 94.70/94.05  0 [] v2209(VarCurr)| -v2211(VarCurr).
% 94.70/94.05  0 [] -v2211(VarCurr)|v2015(VarCurr).
% 94.70/94.05  0 [] v2211(VarCurr)| -v2015(VarCurr).
% 94.70/94.05  0 [] -v2203(VarCurr)|$F.
% 94.70/94.05  0 [] v2203(VarCurr)| -$F.
% 94.70/94.05  0 [] -v2043(VarCurr)|v2045(VarCurr).
% 94.70/94.05  0 [] v2043(VarCurr)| -v2045(VarCurr).
% 94.70/94.05  0 [] -v2045(VarCurr)|v2047(VarCurr).
% 94.70/94.05  0 [] v2045(VarCurr)| -v2047(VarCurr).
% 94.70/94.05  0 [] -v2047(VarCurr)|v2049(VarCurr).
% 94.70/94.05  0 [] v2047(VarCurr)| -v2049(VarCurr).
% 94.70/94.05  0 [] -v2049(VarCurr)|v2051(VarCurr).
% 94.70/94.05  0 [] -v2049(VarCurr)|v2151(VarCurr).
% 94.70/94.05  0 [] v2049(VarCurr)| -v2051(VarCurr)| -v2151(VarCurr).
% 94.70/94.05  0 [] -v2151(VarCurr)|v2153(VarCurr).
% 94.70/94.05  0 [] v2151(VarCurr)| -v2153(VarCurr).
% 94.70/94.05  0 [] -v2153(VarCurr)|v2155(VarCurr).
% 94.70/94.05  0 [] v2153(VarCurr)| -v2155(VarCurr).
% 94.70/94.05  0 [] -v2155(VarCurr)|v2157(VarCurr).
% 94.70/94.05  0 [] v2155(VarCurr)| -v2157(VarCurr).
% 94.70/94.05  0 [] -v2157(VarCurr)|v2159(VarCurr).
% 94.70/94.05  0 [] v2157(VarCurr)| -v2159(VarCurr).
% 94.70/94.05  0 [] -v2159(VarCurr)|v2161(VarCurr).
% 94.70/94.05  0 [] v2159(VarCurr)| -v2161(VarCurr).
% 94.70/94.05  0 [] -v2161(VarCurr)|v2163(VarCurr).
% 94.70/94.05  0 [] v2161(VarCurr)| -v2163(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2190(VarNext)| -v2163(VarNext)|v2163(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2190(VarNext)|v2163(VarNext)| -v2163(VarCurr).
% 94.70/94.05  0 [] -v2190(VarNext)| -v2163(VarNext)|v2198(VarNext).
% 94.70/94.05  0 [] -v2190(VarNext)|v2163(VarNext)| -v2198(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2198(VarNext)|v2196(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2198(VarNext)| -v2196(VarCurr).
% 94.70/94.05  0 [] v2035(VarCurr)| -v2196(VarCurr)|v2165(VarCurr).
% 94.70/94.05  0 [] v2035(VarCurr)|v2196(VarCurr)| -v2165(VarCurr).
% 94.70/94.05  0 [] -v2035(VarCurr)| -v2196(VarCurr)|$F.
% 94.70/94.05  0 [] -v2035(VarCurr)|v2196(VarCurr)| -$F.
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2190(VarNext)|v2191(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2190(VarNext)| -v2191(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2191(VarNext)|v2193(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2191(VarNext)|v2013(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2191(VarNext)| -v2193(VarNext)| -v2013(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2193(VarNext)|v2028(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2193(VarNext)| -v2028(VarNext).
% 94.70/94.05  0 [] -v2163(constB0)|$F.
% 94.70/94.05  0 [] v2163(constB0)| -$F.
% 94.70/94.05  0 [] -v2165(VarCurr)|v2167(VarCurr).
% 94.70/94.05  0 [] v2165(VarCurr)| -v2167(VarCurr).
% 94.70/94.05  0 [] -v2167(VarCurr)|v2169(VarCurr).
% 94.70/94.05  0 [] v2167(VarCurr)| -v2169(VarCurr).
% 94.70/94.05  0 [] -v2169(VarCurr)|v2171(VarCurr).
% 94.70/94.05  0 [] v2169(VarCurr)| -v2171(VarCurr).
% 94.70/94.05  0 [] -v2171(VarCurr)|v2173(VarCurr).
% 94.70/94.05  0 [] v2171(VarCurr)| -v2173(VarCurr).
% 94.70/94.05  0 [] -v2173(VarCurr)|v2175(VarCurr).
% 94.70/94.05  0 [] v2173(VarCurr)| -v2175(VarCurr).
% 94.70/94.05  0 [] -v2175(VarCurr)|v2177(VarCurr).
% 94.70/94.05  0 [] v2175(VarCurr)| -v2177(VarCurr).
% 94.70/94.05  0 [] -v2177(VarCurr)|v2179(VarCurr).
% 94.70/94.05  0 [] v2177(VarCurr)| -v2179(VarCurr).
% 94.70/94.05  0 [] -v2179(VarCurr)|v2181(VarCurr).
% 94.70/94.05  0 [] v2179(VarCurr)| -v2181(VarCurr).
% 94.70/94.05  0 [] -v2181(VarCurr)|v2183(VarCurr).
% 94.70/94.05  0 [] v2181(VarCurr)| -v2183(VarCurr).
% 94.70/94.05  0 [] -v2183(VarCurr)|v2185(VarCurr).
% 94.70/94.05  0 [] v2183(VarCurr)| -v2185(VarCurr).
% 94.70/94.05  0 [] -v2185(VarCurr)|v2187(VarCurr).
% 94.70/94.05  0 [] v2185(VarCurr)| -v2187(VarCurr).
% 94.70/94.05  0 [] -v2187(constB0)|$F.
% 94.70/94.05  0 [] v2187(constB0)| -$F.
% 94.70/94.05  0 [] -v2051(VarCurr)|v2053(VarCurr).
% 94.70/94.05  0 [] v2051(VarCurr)| -v2053(VarCurr).
% 94.70/94.05  0 [] -v2053(VarCurr)|v2055(VarCurr).
% 94.70/94.05  0 [] v2053(VarCurr)| -v2055(VarCurr).
% 94.70/94.05  0 [] -v2055(VarCurr)|v2057(VarCurr).
% 94.70/94.05  0 [] v2055(VarCurr)| -v2057(VarCurr).
% 94.70/94.05  0 [] -v2057(VarCurr)|v2059(VarCurr).
% 94.70/94.05  0 [] v2057(VarCurr)| -v2059(VarCurr).
% 94.70/94.05  0 [] -v2059(VarCurr)|v2061(VarCurr).
% 94.70/94.05  0 [] v2059(VarCurr)| -v2061(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2136(VarNext)| -v2061(VarNext)|v2061(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2136(VarNext)|v2061(VarNext)| -v2061(VarCurr).
% 94.70/94.05  0 [] -v2136(VarNext)| -v2061(VarNext)|v2144(VarNext).
% 94.70/94.05  0 [] -v2136(VarNext)|v2061(VarNext)| -v2144(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2144(VarNext)|v2142(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2144(VarNext)| -v2142(VarCurr).
% 94.70/94.05  0 [] v2145(VarCurr)| -v2142(VarCurr)|v2146(VarCurr).
% 94.70/94.05  0 [] v2145(VarCurr)|v2142(VarCurr)| -v2146(VarCurr).
% 94.70/94.05  0 [] -v2145(VarCurr)| -v2142(VarCurr)|$F.
% 94.70/94.05  0 [] -v2145(VarCurr)|v2142(VarCurr)| -$F.
% 94.70/94.05  0 [] -v2146(VarCurr)|v2147(VarCurr).
% 94.70/94.05  0 [] -v2146(VarCurr)|v2065(VarCurr).
% 94.70/94.05  0 [] v2146(VarCurr)| -v2147(VarCurr)| -v2065(VarCurr).
% 94.70/94.05  0 [] -v2147(VarCurr)|v2063(VarCurr).
% 94.70/94.05  0 [] v2147(VarCurr)| -v2063(VarCurr).
% 94.70/94.05  0 [] -v2063(constB0)|$F.
% 94.70/94.05  0 [] v2063(constB0)| -$F.
% 94.70/94.05  0 [] v2145(VarCurr)|v1984(VarCurr).
% 94.70/94.05  0 [] -v2145(VarCurr)| -v1984(VarCurr).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2136(VarNext)|v2137(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2136(VarNext)| -v2137(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2137(VarNext)|v2138(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2137(VarNext)|v2013(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2137(VarNext)| -v2138(VarNext)| -v2013(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)|v2138(VarNext)|v2028(VarNext).
% 94.70/94.05  0 [] -nextState(VarCurr,VarNext)| -v2138(VarNext)| -v2028(VarNext).
% 94.70/94.05  0 [] -v2061(constB0)|$F.
% 94.70/94.05  0 [] v2061(constB0)| -$F.
% 94.70/94.05  0 [] v2128(VarCurr)| -v2065(VarCurr)|v2129(VarCurr).
% 94.70/94.05  0 [] v2128(VarCurr)|v2065(VarCurr)| -v2129(VarCurr).
% 94.70/94.05  0 [] -v2128(VarCurr)| -v2065(VarCurr)|$F.
% 94.70/94.05  0 [] -v2128(VarCurr)|v2065(VarCurr)| -$F.
% 94.70/94.05  0 [] v2130(VarCurr)|v2132(VarCurr)| -v2129(VarCurr)|$F.
% 94.70/94.05  0 [] v2130(VarCurr)|v2132(VarCurr)|v2129(VarCurr)| -$F.
% 94.70/94.05  0 [] -v2132(VarCurr)| -v2129(VarCurr)|v2133(VarCurr).
% 94.70/94.05  0 [] -v2132(VarCurr)|v2129(VarCurr)| -v2133(VarCurr).
% 94.70/94.05  0 [] -v2130(VarCurr)| -v2129(VarCurr)|v2131(VarCurr).
% 94.70/94.05  0 [] -v2130(VarCurr)|v2129(VarCurr)| -v2131(VarCurr).
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex26)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex26)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex25)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex25)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex24)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex24)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex23)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex23)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex22)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex22)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex21)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex21)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex20)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex20)| -$F.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex19)|$T.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex19)| -$T.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex18)|$T.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex18)| -$T.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex17)|$T.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex17)| -$T.
% 94.70/94.05  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex16)|$F.
% 94.70/94.05  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex16)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex15)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex15)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex14)|$T.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex14)| -$T.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex13)|$T.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex13)| -$T.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex12)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex12)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex11)|$T.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex11)| -$T.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex10)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex10)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex9)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex9)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex8)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex8)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex7)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex7)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex6)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex6)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex5)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex5)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex4)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex4)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex3)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex3)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex2)|$F.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex2)| -$F.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex1)|$T.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex1)| -$T.
% 94.70/94.06  0 [] -v2133(VarCurr)| -v2101(VarCurr,bitIndex0)|$T.
% 94.70/94.06  0 [] -v2133(VarCurr)|v2101(VarCurr,bitIndex0)| -$T.
% 94.70/94.06  0 [] v2133(VarCurr)|v2101(VarCurr,bitIndex26)|$F|v2101(VarCurr,bitIndex25)|v2101(VarCurr,bitIndex24)|v2101(VarCurr,bitIndex23)|v2101(VarCurr,bitIndex22)|v2101(VarCurr,bitIndex21)|v2101(VarCurr,bitIndex20)|v2101(VarCurr,bitIndex19)|$T|v2101(VarCurr,bitIndex18)|v2101(VarCurr,bitIndex17)|v2101(VarCurr,bitIndex16)|v2101(VarCurr,bitIndex15)|v2101(VarCurr,bitIndex14)|v2101(VarCurr,bitIndex13)|v2101(VarCurr,bitIndex12)|v2101(VarCurr,bitIndex11)|v2101(VarCurr,bitIndex10)|v2101(VarCurr,bitIndex9)|v2101(VarCurr,bitIndex8)|v2101(VarCurr,bitIndex7)|v2101(VarCurr,bitIndex6)|v2101(VarCurr,bitIndex5)|v2101(VarCurr,bitIndex4)|v2101(VarCurr,bitIndex3)|v2101(VarCurr,bitIndex2)|v2101(VarCurr,bitIndex1)|v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2133(VarCurr)|v2101(VarCurr,bitIndex26)|$F|v2101(VarCurr,bitIndex25)|v2101(VarCurr,bitIndex24)|v2101(VarCurr,bitIndex23)|v2101(VarCurr,bitIndex22)|v2101(VarCurr,bitIndex21)|v2101(VarCurr,bitIndex20)| -v2101(VarCurr,bitIndex19)| -$T| -v2101(VarCurr,bitIndex18)| -v2101(VarCurr,bitIndex17)|v2101(VarCurr,bitIndex16)|v2101(VarCurr,bitIndex15)| -v2101(VarCurr,bitIndex14)| -v2101(VarCurr,bitIndex13)|v2101(VarCurr,bitIndex12)| -v2101(VarCurr,bitIndex11)|v2101(VarCurr,bitIndex10)|v2101(VarCurr,bitIndex9)|v2101(VarCurr,bitIndex8)|v2101(VarCurr,bitIndex7)|v2101(VarCurr,bitIndex6)|v2101(VarCurr,bitIndex5)|v2101(VarCurr,bitIndex4)|v2101(VarCurr,bitIndex3)|v2101(VarCurr,bitIndex2)| -v2101(VarCurr,bitIndex1)| -v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2133(VarCurr)| -v2101(VarCurr,bitIndex26)| -$F| -v2101(VarCurr,bitIndex25)| -v2101(VarCurr,bitIndex24)| -v2101(VarCurr,bitIndex23)| -v2101(VarCurr,bitIndex22)| -v2101(VarCurr,bitIndex21)| -v2101(VarCurr,bitIndex20)|v2101(VarCurr,bitIndex19)|$T|v2101(VarCurr,bitIndex18)|v2101(VarCurr,bitIndex17)| -v2101(VarCurr,bitIndex16)| -v2101(VarCurr,bitIndex15)|v2101(VarCurr,bitIndex14)|v2101(VarCurr,bitIndex13)| -v2101(VarCurr,bitIndex12)|v2101(VarCurr,bitIndex11)| -v2101(VarCurr,bitIndex10)| -v2101(VarCurr,bitIndex9)| -v2101(VarCurr,bitIndex8)| -v2101(VarCurr,bitIndex7)| -v2101(VarCurr,bitIndex6)| -v2101(VarCurr,bitIndex5)| -v2101(VarCurr,bitIndex4)| -v2101(VarCurr,bitIndex3)| -v2101(VarCurr,bitIndex2)|v2101(VarCurr,bitIndex1)|v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2133(VarCurr)| -v2101(VarCurr,bitIndex26)| -$F| -v2101(VarCurr,bitIndex25)| -v2101(VarCurr,bitIndex24)| -v2101(VarCurr,bitIndex23)| -v2101(VarCurr,bitIndex22)| -v2101(VarCurr,bitIndex21)| -v2101(VarCurr,bitIndex20)| -v2101(VarCurr,bitIndex19)| -$T| -v2101(VarCurr,bitIndex18)| -v2101(VarCurr,bitIndex17)| -v2101(VarCurr,bitIndex16)| -v2101(VarCurr,bitIndex15)| -v2101(VarCurr,bitIndex14)| -v2101(VarCurr,bitIndex13)| -v2101(VarCurr,bitIndex12)| -v2101(VarCurr,bitIndex11)| -v2101(VarCurr,bitIndex10)| -v2101(VarCurr,bitIndex9)| -v2101(VarCurr,bitIndex8)| -v2101(VarCurr,bitIndex7)| -v2101(VarCurr,bitIndex6)| -v2101(VarCurr,bitIndex5)| -v2101(VarCurr,bitIndex4)| -v2101(VarCurr,bitIndex3)| -v2101(VarCurr,bitIndex2)| -v2101(VarCurr,bitIndex1)| -v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex26).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex25).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex24).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex23).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex22).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex21).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex20).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex19).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex18).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex17).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex16).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex15).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex14).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex13).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex12).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex11).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex10).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex9).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex8).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex7).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex6).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex5).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex4).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex3).
% 94.70/94.06  0 [] -b000000011100110100000000011(bitIndex2).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex1).
% 94.70/94.06  0 [] b000000011100110100000000011(bitIndex0).
% 94.70/94.06  0 [] -v2132(VarCurr)| -v2091(VarCurr)|$T.
% 94.70/94.06  0 [] -v2132(VarCurr)|v2091(VarCurr)| -$T.
% 94.70/94.06  0 [] v2132(VarCurr)|v2091(VarCurr)|$T.
% 94.70/94.06  0 [] v2132(VarCurr)| -v2091(VarCurr)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex26)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex26)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex25)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex25)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex24)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex24)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex23)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex23)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex22)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex22)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex21)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex21)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex20)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex20)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex19)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex19)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex18)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex18)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex17)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex17)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex16)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex16)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex15)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex15)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex14)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex14)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex13)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex13)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex12)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex12)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex11)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex11)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex10)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex10)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex9)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex9)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex8)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex8)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex7)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex7)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex6)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex6)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex5)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex5)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex4)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex4)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex3)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex3)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex2)|$F.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex2)| -$F.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex1)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex1)| -$T.
% 94.70/94.06  0 [] -v2131(VarCurr)| -v2101(VarCurr,bitIndex0)|$T.
% 94.70/94.06  0 [] -v2131(VarCurr)|v2101(VarCurr,bitIndex0)| -$T.
% 94.70/94.06  0 [] v2131(VarCurr)|v2101(VarCurr,bitIndex26)|$F|v2101(VarCurr,bitIndex25)|v2101(VarCurr,bitIndex24)|v2101(VarCurr,bitIndex23)|v2101(VarCurr,bitIndex22)|v2101(VarCurr,bitIndex21)|v2101(VarCurr,bitIndex20)|v2101(VarCurr,bitIndex19)|$T|v2101(VarCurr,bitIndex18)|v2101(VarCurr,bitIndex17)|v2101(VarCurr,bitIndex16)|v2101(VarCurr,bitIndex15)|v2101(VarCurr,bitIndex14)|v2101(VarCurr,bitIndex13)|v2101(VarCurr,bitIndex12)|v2101(VarCurr,bitIndex11)|v2101(VarCurr,bitIndex10)|v2101(VarCurr,bitIndex9)|v2101(VarCurr,bitIndex8)|v2101(VarCurr,bitIndex7)|v2101(VarCurr,bitIndex6)|v2101(VarCurr,bitIndex5)|v2101(VarCurr,bitIndex4)|v2101(VarCurr,bitIndex3)|v2101(VarCurr,bitIndex2)|v2101(VarCurr,bitIndex1)|v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2131(VarCurr)|v2101(VarCurr,bitIndex26)|$F|v2101(VarCurr,bitIndex25)|v2101(VarCurr,bitIndex24)|v2101(VarCurr,bitIndex23)|v2101(VarCurr,bitIndex22)|v2101(VarCurr,bitIndex21)|v2101(VarCurr,bitIndex20)| -v2101(VarCurr,bitIndex19)| -$T| -v2101(VarCurr,bitIndex18)|v2101(VarCurr,bitIndex17)|v2101(VarCurr,bitIndex16)|v2101(VarCurr,bitIndex15)| -v2101(VarCurr,bitIndex14)| -v2101(VarCurr,bitIndex13)|v2101(VarCurr,bitIndex12)| -v2101(VarCurr,bitIndex11)|v2101(VarCurr,bitIndex10)|v2101(VarCurr,bitIndex9)|v2101(VarCurr,bitIndex8)|v2101(VarCurr,bitIndex7)|v2101(VarCurr,bitIndex6)|v2101(VarCurr,bitIndex5)|v2101(VarCurr,bitIndex4)|v2101(VarCurr,bitIndex3)|v2101(VarCurr,bitIndex2)| -v2101(VarCurr,bitIndex1)| -v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2131(VarCurr)| -v2101(VarCurr,bitIndex26)| -$F| -v2101(VarCurr,bitIndex25)| -v2101(VarCurr,bitIndex24)| -v2101(VarCurr,bitIndex23)| -v2101(VarCurr,bitIndex22)| -v2101(VarCurr,bitIndex21)| -v2101(VarCurr,bitIndex20)|v2101(VarCurr,bitIndex19)|$T|v2101(VarCurr,bitIndex18)| -v2101(VarCurr,bitIndex17)| -v2101(VarCurr,bitIndex16)| -v2101(VarCurr,bitIndex15)|v2101(VarCurr,bitIndex14)|v2101(VarCurr,bitIndex13)| -v2101(VarCurr,bitIndex12)|v2101(VarCurr,bitIndex11)| -v2101(VarCurr,bitIndex10)| -v2101(VarCurr,bitIndex9)| -v2101(VarCurr,bitIndex8)| -v2101(VarCurr,bitIndex7)| -v2101(VarCurr,bitIndex6)| -v2101(VarCurr,bitIndex5)| -v2101(VarCurr,bitIndex4)| -v2101(VarCurr,bitIndex3)| -v2101(VarCurr,bitIndex2)|v2101(VarCurr,bitIndex1)|v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] v2131(VarCurr)| -v2101(VarCurr,bitIndex26)| -$F| -v2101(VarCurr,bitIndex25)| -v2101(VarCurr,bitIndex24)| -v2101(VarCurr,bitIndex23)| -v2101(VarCurr,bitIndex22)| -v2101(VarCurr,bitIndex21)| -v2101(VarCurr,bitIndex20)| -v2101(VarCurr,bitIndex19)| -$T| -v2101(VarCurr,bitIndex18)| -v2101(VarCurr,bitIndex17)| -v2101(VarCurr,bitIndex16)| -v2101(VarCurr,bitIndex15)| -v2101(VarCurr,bitIndex14)| -v2101(VarCurr,bitIndex13)| -v2101(VarCurr,bitIndex12)| -v2101(VarCurr,bitIndex11)| -v2101(VarCurr,bitIndex10)| -v2101(VarCurr,bitIndex9)| -v2101(VarCurr,bitIndex8)| -v2101(VarCurr,bitIndex7)| -v2101(VarCurr,bitIndex6)| -v2101(VarCurr,bitIndex5)| -v2101(VarCurr,bitIndex4)| -v2101(VarCurr,bitIndex3)| -v2101(VarCurr,bitIndex2)| -v2101(VarCurr,bitIndex1)| -v2101(VarCurr,bitIndex0).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex26).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex25).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex24).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex23).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex22).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex21).
% 94.70/94.06  0 [] -b000000011000110100000000011(bitIndex20).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex19).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex18).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex17).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex16).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex15).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex14).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex13).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex12).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex11).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex10).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex9).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex8).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex7).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex6).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex5).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex4).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex3).
% 94.70/94.07  0 [] -b000000011000110100000000011(bitIndex2).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex1).
% 94.70/94.07  0 [] b000000011000110100000000011(bitIndex0).
% 94.70/94.07  0 [] -v2130(VarCurr)| -v2091(VarCurr)|$F.
% 94.70/94.07  0 [] -v2130(VarCurr)|v2091(VarCurr)| -$F.
% 94.70/94.07  0 [] v2130(VarCurr)|v2091(VarCurr)|$F.
% 94.70/94.07  0 [] v2130(VarCurr)| -v2091(VarCurr)| -$F.
% 94.70/94.07  0 [] v2128(VarCurr)|v2067(VarCurr).
% 94.70/94.07  0 [] -v2128(VarCurr)| -v2067(VarCurr).
% 94.70/94.07  0 [] -range_26_0(B)| -v2101(VarCurr,B)|v2103(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2101(VarCurr,B)| -v2103(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2103(VarCurr,B)|v2105(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2103(VarCurr,B)| -v2105(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2105(VarCurr,B)|v2107(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2105(VarCurr,B)| -v2107(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2107(VarCurr,B)|v2109(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2107(VarCurr,B)| -v2109(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2109(VarCurr,B)|v2111(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2109(VarCurr,B)| -v2111(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2111(VarCurr,B)|v2113(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2111(VarCurr,B)| -v2113(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2113(VarCurr,B)|v2115(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2113(VarCurr,B)| -v2115(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2115(VarCurr,B)|v2117(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2115(VarCurr,B)| -v2117(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2117(VarCurr,B)|v2119(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2117(VarCurr,B)| -v2119(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2119(VarCurr,B)|v2121(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2119(VarCurr,B)| -v2121(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2121(VarCurr,B)|v2123(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)|v2121(VarCurr,B)| -v2123(VarCurr,B).
% 94.70/94.07  0 [] -range_26_0(B)| -v2123(constB0,B)|$F.
% 94.70/94.07  0 [] -range_26_0(B)|v2123(constB0,B)| -$F.
% 94.70/94.07  0 [] -range_26_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex0!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex1!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex2!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex3!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex4!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex5!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex6!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex7!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex8!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex9!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex10!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex11!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex12!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex13!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex14!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex15!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex16!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex17!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex18!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex19!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex20!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex21!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex22!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex23!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex24!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex25!=B.
% 94.70/94.07  0 [] range_26_0(B)|bitIndex26!=B.
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex26).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex25).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex24).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex23).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex22).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex21).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex20).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex19).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex18).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex17).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex16).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex15).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex14).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex13).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex12).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex11).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex10).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex9).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex8).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex7).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex6).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex5).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex4).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex3).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex2).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex1).
% 94.70/94.07  0 [] -b000000000000000000000000000(bitIndex0).
% 94.70/94.07  0 [] -v2091(VarCurr)|v2093(VarCurr).
% 94.70/94.07  0 [] v2091(VarCurr)| -v2093(VarCurr).
% 94.70/94.07  0 [] -v2093(VarCurr)|v2095(VarCurr).
% 94.70/94.07  0 [] v2093(VarCurr)| -v2095(VarCurr).
% 94.70/94.07  0 [] -v2095(VarCurr)|v2097(VarCurr).
% 94.70/94.07  0 [] v2095(VarCurr)| -v2097(VarCurr).
% 94.70/94.07  0 [] -v2097(VarCurr)|v2099(VarCurr).
% 94.70/94.07  0 [] v2097(VarCurr)| -v2099(VarCurr).
% 94.70/94.07  0 [] -v2067(VarCurr)|v2069(VarCurr).
% 94.70/94.07  0 [] v2067(VarCurr)| -v2069(VarCurr).
% 94.70/94.07  0 [] -v2069(VarCurr)|v2071(VarCurr).
% 94.70/94.07  0 [] v2069(VarCurr)| -v2071(VarCurr).
% 94.70/94.07  0 [] -v2071(VarCurr)|v2073(VarCurr).
% 94.70/94.07  0 [] v2071(VarCurr)| -v2073(VarCurr).
% 94.70/94.07  0 [] -v2073(VarCurr)|v2075(VarCurr).
% 94.70/94.07  0 [] v2073(VarCurr)| -v2075(VarCurr).
% 94.70/94.07  0 [] -v2075(VarCurr)|v2077(VarCurr).
% 94.70/94.07  0 [] v2075(VarCurr)| -v2077(VarCurr).
% 94.70/94.07  0 [] -v2077(VarCurr)|v2079(VarCurr).
% 94.70/94.07  0 [] v2077(VarCurr)| -v2079(VarCurr).
% 94.70/94.07  0 [] -v2079(VarCurr)|v2081(VarCurr).
% 94.70/94.07  0 [] v2079(VarCurr)| -v2081(VarCurr).
% 94.70/94.07  0 [] -v2081(VarCurr)|v2083(VarCurr).
% 94.70/94.07  0 [] v2081(VarCurr)| -v2083(VarCurr).
% 94.70/94.07  0 [] -v2083(VarCurr)|v2085(VarCurr).
% 94.70/94.07  0 [] v2083(VarCurr)| -v2085(VarCurr).
% 94.70/94.07  0 [] -v2085(VarCurr)|v2087(VarCurr).
% 94.70/94.07  0 [] v2085(VarCurr)| -v2087(VarCurr).
% 94.70/94.07  0 [] -v2087(VarCurr)|v2089(VarCurr).
% 94.70/94.07  0 [] v2087(VarCurr)| -v2089(VarCurr).
% 94.70/94.07  0 [] -v2089(constB0)|$T.
% 94.70/94.07  0 [] v2089(constB0)| -$T.
% 94.70/94.07  0 [] -v2039(VarCurr)|$F.
% 94.70/94.07  0 [] v2039(VarCurr)| -$F.
% 94.70/94.07  0 [] -v1964(VarCurr)|v1966(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1964(VarCurr)| -v1966(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1966(VarCurr,bitIndex0)|v1968(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1966(VarCurr,bitIndex0)| -v1968(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1968(VarCurr,bitIndex0)|v1970(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1968(VarCurr,bitIndex0)| -v1970(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1970(VarCurr,bitIndex0)|v1972(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1970(VarCurr,bitIndex0)| -v1972(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1972(VarCurr,bitIndex0)|v1974(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1972(VarCurr,bitIndex0)| -v1974(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1974(VarCurr,bitIndex0)|v1976(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1974(VarCurr,bitIndex0)| -v1976(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1976(VarCurr,bitIndex0)|v1978(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1976(VarCurr,bitIndex0)| -v1978(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1978(VarCurr,bitIndex0)|v1980(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1978(VarCurr,bitIndex0)| -v1980(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1980(VarCurr,bitIndex0)|v1982(VarCurr,bitIndex0).
% 94.70/94.07  0 [] v1980(VarCurr,bitIndex0)| -v1982(VarCurr,bitIndex0).
% 94.70/94.07  0 [] -v1982(VarNext,bitIndex0)|v2023(VarNext,bitIndex0).
% 94.70/94.07  0 [] v1982(VarNext,bitIndex0)| -v2023(VarNext,bitIndex0).
% 94.70/94.07  0 [] -nextState(VarCurr,VarNext)|v2024(VarNext)| -range_63_0(B)| -v2023(VarNext,B)|v1982(VarCurr,B).
% 94.70/94.07  0 [] -nextState(VarCurr,VarNext)|v2024(VarNext)| -range_63_0(B)|v2023(VarNext,B)| -v1982(VarCurr,B).
% 94.70/94.07  0 [] -v2024(VarNext)| -range_63_0(B)| -v2023(VarNext,B)|v2034(VarNext,B).
% 94.70/94.07  0 [] -v2024(VarNext)| -range_63_0(B)|v2023(VarNext,B)| -v2034(VarNext,B).
% 94.70/94.07  0 [] -nextState(VarCurr,VarNext)| -range_63_0(B)| -v2034(VarNext,B)|v2032(VarCurr,B).
% 94.70/94.07  0 [] -nextState(VarCurr,VarNext)| -range_63_0(B)|v2034(VarNext,B)| -v2032(VarCurr,B).
% 94.70/94.07  0 [] v2035(VarCurr)| -range_63_0(B)| -v2032(VarCurr,B)|v1987(VarCurr,B).
% 94.70/94.07  0 [] v2035(VarCurr)| -range_63_0(B)|v2032(VarCurr,B)| -v1987(VarCurr,B).
% 94.70/94.07  0 [] -v2035(VarCurr)| -range_63_0(B)| -v2032(VarCurr,B)|$F.
% 94.70/94.07  0 [] -v2035(VarCurr)| -range_63_0(B)|v2032(VarCurr,B)| -$F.
% 94.70/94.07  0 [] -range_63_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex0!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex1!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex2!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex3!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex4!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex5!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex6!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex7!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex8!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex9!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex10!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex11!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex12!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex13!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex14!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex15!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex16!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex17!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex18!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex19!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex20!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex21!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex22!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex23!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex24!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex25!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex26!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex27!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex28!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex29!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex30!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex31!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex32!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex33!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex34!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex35!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex36!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex37!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex38!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex39!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex40!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex41!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex42!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex43!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex44!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex45!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex46!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex47!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex48!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex49!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex50!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex51!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex52!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex53!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex54!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex55!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex56!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex57!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex58!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex59!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex60!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex61!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex62!=B.
% 94.70/94.07  0 [] range_63_0(B)|bitIndex63!=B.
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 94.70/94.07  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 94.70/94.08  0 [] -b0000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 94.70/94.08  0 [] v2035(VarCurr)|v1984(VarCurr).
% 94.70/94.08  0 [] -v2035(VarCurr)| -v1984(VarCurr).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)| -v2024(VarNext)|v2025(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)|v2024(VarNext)| -v2025(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)| -v2025(VarNext)|v2026(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)| -v2025(VarNext)|v2013(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)|v2025(VarNext)| -v2026(VarNext)| -v2013(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)|v2026(VarNext)|v2028(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)| -v2026(VarNext)| -v2028(VarNext).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)| -v2028(VarNext)|v2013(VarCurr).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)|v2028(VarNext)| -v2013(VarCurr).
% 94.70/94.08  0 [] -v1982(constB0,bitIndex1).
% 94.70/94.08  0 [] -v1982(constB0,bitIndex0).
% 94.70/94.08  0 [] -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex1).
% 94.70/94.08  0 [] -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex0).
% 94.70/94.08  0 [] -v2013(VarCurr)|v2015(VarCurr).
% 94.70/94.08  0 [] v2013(VarCurr)| -v2015(VarCurr).
% 94.70/94.08  0 [] -v2015(VarCurr)|v2017(VarCurr).
% 94.70/94.08  0 [] v2015(VarCurr)| -v2017(VarCurr).
% 94.70/94.08  0 [] -v2017(VarCurr)|v2019(VarCurr).
% 94.70/94.08  0 [] v2017(VarCurr)| -v2019(VarCurr).
% 94.70/94.08  0 [] -v2019(VarCurr)|v1(VarCurr).
% 94.70/94.08  0 [] v2019(VarCurr)| -v1(VarCurr).
% 94.70/94.08  0 [] -v1987(VarCurr,bitIndex0)|v1989(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1987(VarCurr,bitIndex0)| -v1989(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1989(VarCurr,bitIndex0)|v1991(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1989(VarCurr,bitIndex0)| -v1991(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1991(VarCurr,bitIndex0)|v1993(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1991(VarCurr,bitIndex0)| -v1993(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1993(VarCurr,bitIndex0)|v1995(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1993(VarCurr,bitIndex0)| -v1995(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1995(VarCurr,bitIndex0)|v1997(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1995(VarCurr,bitIndex0)| -v1997(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1997(VarCurr,bitIndex0)|v1999(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1997(VarCurr,bitIndex0)| -v1999(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v1999(VarCurr,bitIndex0)|v2001(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v1999(VarCurr,bitIndex0)| -v2001(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v2001(VarCurr,bitIndex0)|v2003(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v2001(VarCurr,bitIndex0)| -v2003(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v2003(VarCurr,bitIndex0)|v2005(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v2003(VarCurr,bitIndex0)| -v2005(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v2005(VarCurr,bitIndex0)|v2007(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v2005(VarCurr,bitIndex0)| -v2007(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v2007(VarCurr,bitIndex0)|v2009(VarCurr,bitIndex0).
% 94.70/94.08  0 [] v2007(VarCurr,bitIndex0)| -v2009(VarCurr,bitIndex0).
% 94.70/94.08  0 [] -v2009(constB0,bitIndex1).
% 94.70/94.08  0 [] -v2009(constB0,bitIndex0).
% 94.70/94.08  0 [] -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex1).
% 94.70/94.08  0 [] -bxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx00(bitIndex0).
% 94.70/94.08  0 [] -v1984(VarCurr)|v1948(VarCurr).
% 94.70/94.08  0 [] v1984(VarCurr)| -v1948(VarCurr).
% 94.70/94.08  0 [] -v1961(VarCurr)|$F.
% 94.70/94.08  0 [] v1961(VarCurr)| -$F.
% 94.70/94.08  0 [] -v1959(VarCurr)|$F.
% 94.70/94.08  0 [] v1959(VarCurr)| -$F.
% 94.70/94.08  0 [] -v1957(VarCurr)|$T.
% 94.70/94.08  0 [] v1957(VarCurr)| -$T.
% 94.70/94.08  0 [] -v1955(VarCurr)|$F.
% 94.70/94.08  0 [] v1955(VarCurr)| -$F.
% 94.70/94.08  0 [] -v1932(VarCurr)|v1934(VarCurr).
% 94.70/94.08  0 [] v1932(VarCurr)| -v1934(VarCurr).
% 94.70/94.08  0 [] v1934(VarCurr)|v1936(VarCurr).
% 94.70/94.08  0 [] -v1934(VarCurr)| -v1936(VarCurr).
% 94.70/94.08  0 [] -v1936(VarCurr)|v1938(VarCurr).
% 94.70/94.08  0 [] v1936(VarCurr)| -v1938(VarCurr).
% 94.70/94.08  0 [] -v1938(VarCurr)|v1940(VarCurr).
% 94.70/94.08  0 [] v1938(VarCurr)| -v1940(VarCurr).
% 94.70/94.08  0 [] -v1940(VarCurr)|v1942(VarCurr).
% 94.70/94.08  0 [] v1940(VarCurr)| -v1942(VarCurr).
% 94.70/94.08  0 [] -v1942(VarCurr)|v1944(VarCurr).
% 94.70/94.08  0 [] v1942(VarCurr)| -v1944(VarCurr).
% 94.70/94.08  0 [] -v1944(VarCurr)|v1946(VarCurr).
% 94.70/94.08  0 [] v1944(VarCurr)| -v1946(VarCurr).
% 94.70/94.08  0 [] -v1946(VarCurr)|v1948(VarCurr).
% 94.70/94.08  0 [] v1946(VarCurr)| -v1948(VarCurr).
% 94.70/94.08  0 [] -v1948(VarCurr)|v1950(VarCurr).
% 94.70/94.08  0 [] v1948(VarCurr)| -v1950(VarCurr).
% 94.70/94.08  0 [] -v1950(VarCurr)|v1952(VarCurr).
% 94.70/94.08  0 [] v1950(VarCurr)| -v1952(VarCurr).
% 94.70/94.08  0 [] -v1952(VarCurr)|v16(VarCurr).
% 94.70/94.08  0 [] v1952(VarCurr)| -v16(VarCurr).
% 94.70/94.08  0 [] -nextState(VarCurr,VarNext)|v1887(VarNext)| -v318(VarNext)|v318(VarCurr).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1887(VarNext)|v318(VarNext)| -v318(VarCurr).
% 94.70/94.09  0 [] -v1887(VarNext)| -v318(VarNext)|v1903(VarNext).
% 94.70/94.09  0 [] -v1887(VarNext)|v318(VarNext)| -v1903(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1903(VarNext)|v1901(VarCurr).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1903(VarNext)| -v1901(VarCurr).
% 94.70/94.09  0 [] v1900(VarCurr)| -v1901(VarCurr)|v1904(VarCurr).
% 94.70/94.09  0 [] v1900(VarCurr)|v1901(VarCurr)| -v1904(VarCurr).
% 94.70/94.09  0 [] -v1900(VarCurr)| -v1901(VarCurr)|$F.
% 94.70/94.09  0 [] -v1900(VarCurr)|v1901(VarCurr)| -$F.
% 94.70/94.09  0 [] v320(VarCurr)| -v1904(VarCurr)|$T.
% 94.70/94.09  0 [] v320(VarCurr)|v1904(VarCurr)| -$T.
% 94.70/94.09  0 [] -v320(VarCurr)| -v1904(VarCurr)|$F.
% 94.70/94.09  0 [] -v320(VarCurr)|v1904(VarCurr)| -$F.
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1887(VarNext)|v1888(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1887(VarNext)|v1897(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1887(VarNext)| -v1888(VarNext)| -v1897(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1897(VarNext)|v1895(VarCurr).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1897(VarNext)| -v1895(VarCurr).
% 94.70/94.09  0 [] -v1895(VarCurr)|v1898(VarCurr)|v1900(VarCurr).
% 94.70/94.09  0 [] v1895(VarCurr)| -v1898(VarCurr).
% 94.70/94.09  0 [] v1895(VarCurr)| -v1900(VarCurr).
% 94.70/94.09  0 [] v1900(VarCurr)|v12(VarCurr).
% 94.70/94.09  0 [] -v1900(VarCurr)| -v12(VarCurr).
% 94.70/94.09  0 [] -v1898(VarCurr)|v1899(VarCurr)|v320(VarCurr).
% 94.70/94.09  0 [] v1898(VarCurr)| -v1899(VarCurr).
% 94.70/94.09  0 [] v1898(VarCurr)| -v320(VarCurr).
% 94.70/94.09  0 [] -v1899(VarCurr)|v664(VarCurr).
% 94.70/94.09  0 [] -v1899(VarCurr)|v741(VarCurr).
% 94.70/94.09  0 [] v1899(VarCurr)| -v664(VarCurr)| -v741(VarCurr).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1888(VarNext)|v1889(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1888(VarNext)|v288(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1888(VarNext)| -v1889(VarNext)| -v288(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1889(VarNext)|v1891(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1889(VarNext)| -v1891(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1891(VarNext)|v288(VarCurr).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1891(VarNext)| -v288(VarCurr).
% 94.70/94.09  0 [] -v318(constB0)|$F.
% 94.70/94.09  0 [] v318(constB0)| -$F.
% 94.70/94.09  0 [] -v741(VarCurr)|v1882(VarCurr).
% 94.70/94.09  0 [] -v741(VarCurr)|v875(VarCurr).
% 94.70/94.09  0 [] v741(VarCurr)| -v1882(VarCurr)| -v875(VarCurr).
% 94.70/94.09  0 [] -v1882(VarCurr)|v1883(VarCurr)|v1884(VarCurr).
% 94.70/94.09  0 [] v1882(VarCurr)| -v1883(VarCurr).
% 94.70/94.09  0 [] v1882(VarCurr)| -v1884(VarCurr).
% 94.70/94.09  0 [] -v1884(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.70/94.09  0 [] -v1884(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.70/94.09  0 [] -v1884(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.70/94.09  0 [] -v1884(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.70/94.09  0 [] -v1884(VarCurr)| -v743(VarCurr,bitIndex1)|$T.
% 94.70/94.09  0 [] -v1884(VarCurr)|v743(VarCurr,bitIndex1)| -$T.
% 94.70/94.09  0 [] -v1884(VarCurr)| -v743(VarCurr,bitIndex0)|$T.
% 94.70/94.09  0 [] -v1884(VarCurr)|v743(VarCurr,bitIndex0)| -$T.
% 94.70/94.09  0 [] v1884(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1884(VarCurr)|v743(VarCurr,bitIndex3)|$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1884(VarCurr)| -v743(VarCurr,bitIndex3)| -$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1884(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -b0111(bitIndex3).
% 94.70/94.09  0 [] b0111(bitIndex2).
% 94.70/94.09  0 [] b0111(bitIndex1).
% 94.70/94.09  0 [] b0111(bitIndex0).
% 94.70/94.09  0 [] -v1883(VarCurr)| -v743(VarCurr,bitIndex3)|$F.
% 94.70/94.09  0 [] -v1883(VarCurr)|v743(VarCurr,bitIndex3)| -$F.
% 94.70/94.09  0 [] -v1883(VarCurr)| -v743(VarCurr,bitIndex2)|$T.
% 94.70/94.09  0 [] -v1883(VarCurr)|v743(VarCurr,bitIndex2)| -$T.
% 94.70/94.09  0 [] -v1883(VarCurr)| -v743(VarCurr,bitIndex1)|$T.
% 94.70/94.09  0 [] -v1883(VarCurr)|v743(VarCurr,bitIndex1)| -$T.
% 94.70/94.09  0 [] -v1883(VarCurr)| -v743(VarCurr,bitIndex0)|$F.
% 94.70/94.09  0 [] -v1883(VarCurr)|v743(VarCurr,bitIndex0)| -$F.
% 94.70/94.09  0 [] v1883(VarCurr)|v743(VarCurr,bitIndex3)|$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1883(VarCurr)|v743(VarCurr,bitIndex3)|$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)|v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1883(VarCurr)| -v743(VarCurr,bitIndex3)| -$F|v743(VarCurr,bitIndex2)|$T|v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1883(VarCurr)| -v743(VarCurr,bitIndex3)| -$F| -v743(VarCurr,bitIndex2)| -$T| -v743(VarCurr,bitIndex1)| -v743(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -v743(VarCurr,bitIndex3)|v745(VarCurr,bitIndex66).
% 94.70/94.09  0 [] v743(VarCurr,bitIndex3)| -v745(VarCurr,bitIndex66).
% 94.70/94.09  0 [] -v743(VarCurr,bitIndex2)|v745(VarCurr,bitIndex65).
% 94.70/94.09  0 [] v743(VarCurr,bitIndex2)| -v745(VarCurr,bitIndex65).
% 94.70/94.09  0 [] -v743(VarCurr,bitIndex1)|v745(VarCurr,bitIndex64).
% 94.70/94.09  0 [] v743(VarCurr,bitIndex1)| -v745(VarCurr,bitIndex64).
% 94.70/94.09  0 [] -v743(VarCurr,bitIndex0)|v745(VarCurr,bitIndex63).
% 94.70/94.09  0 [] v743(VarCurr,bitIndex0)| -v745(VarCurr,bitIndex63).
% 94.70/94.09  0 [] -range_66_63(B)| -v745(VarCurr,B)|v747(VarCurr,B).
% 94.70/94.09  0 [] -range_66_63(B)|v745(VarCurr,B)| -v747(VarCurr,B).
% 94.70/94.09  0 [] -range_66_63(B)| -v747(VarCurr,B)|v867(VarCurr,B).
% 94.70/94.09  0 [] -range_66_63(B)|v747(VarCurr,B)| -v867(VarCurr,B).
% 94.70/94.09  0 [] -range_66_63(B)|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B.
% 94.70/94.09  0 [] range_66_63(B)|bitIndex63!=B.
% 94.70/94.09  0 [] range_66_63(B)|bitIndex64!=B.
% 94.70/94.09  0 [] range_66_63(B)|bitIndex65!=B.
% 94.70/94.09  0 [] range_66_63(B)|bitIndex66!=B.
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1869(VarNext)| -range_3_0(B)| -v869(VarNext,B)|v869(VarCurr,B).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1869(VarNext)| -range_3_0(B)|v869(VarNext,B)| -v869(VarCurr,B).
% 94.70/94.09  0 [] -v1869(VarNext)| -range_3_0(B)| -v869(VarNext,B)|v1877(VarNext,B).
% 94.70/94.09  0 [] -v1869(VarNext)| -range_3_0(B)|v869(VarNext,B)| -v1877(VarNext,B).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v1877(VarNext,B)|v1875(VarCurr,B).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v1877(VarNext,B)| -v1875(VarCurr,B).
% 94.70/94.09  0 [] v830(VarCurr)| -range_3_0(B)| -v1875(VarCurr,B)|v871(VarCurr,B).
% 94.70/94.09  0 [] v830(VarCurr)| -range_3_0(B)|v1875(VarCurr,B)| -v871(VarCurr,B).
% 94.70/94.09  0 [] -v830(VarCurr)| -range_3_0(B)| -v1875(VarCurr,B)|$F.
% 94.70/94.09  0 [] -v830(VarCurr)| -range_3_0(B)|v1875(VarCurr,B)| -$F.
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1869(VarNext)|v1870(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1869(VarNext)| -v1870(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1870(VarNext)|v1872(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1870(VarNext)|v751(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1870(VarNext)| -v1872(VarNext)| -v751(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1872(VarNext)|v823(VarNext).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)| -v1872(VarNext)| -v823(VarNext).
% 94.70/94.09  0 [] v873(VarCurr)| -range_3_0(B)| -v871(VarCurr,B)|v869(VarCurr,B).
% 94.70/94.09  0 [] v873(VarCurr)| -range_3_0(B)|v871(VarCurr,B)| -v869(VarCurr,B).
% 94.70/94.09  0 [] -v873(VarCurr)| -range_3_0(B)| -v871(VarCurr,B)|v1846(VarCurr,B).
% 94.70/94.09  0 [] -v873(VarCurr)| -range_3_0(B)|v871(VarCurr,B)| -v1846(VarCurr,B).
% 94.70/94.09  0 [] v1847(VarCurr)| -range_3_0(B)| -v1846(VarCurr,B)|v1848(VarCurr,B).
% 94.70/94.09  0 [] v1847(VarCurr)| -range_3_0(B)|v1846(VarCurr,B)| -v1848(VarCurr,B).
% 94.70/94.09  0 [] -v1847(VarCurr)| -range_3_0(B)| -v1846(VarCurr,B)|$F.
% 94.70/94.09  0 [] -v1847(VarCurr)| -range_3_0(B)|v1846(VarCurr,B)| -$F.
% 94.70/94.09  0 [] -v1848(VarCurr,bitIndex0)|v1864(VarCurr).
% 94.70/94.09  0 [] v1848(VarCurr,bitIndex0)| -v1864(VarCurr).
% 94.70/94.09  0 [] -v1848(VarCurr,bitIndex1)|v1862(VarCurr).
% 94.70/94.09  0 [] v1848(VarCurr,bitIndex1)| -v1862(VarCurr).
% 94.70/94.09  0 [] -v1848(VarCurr,bitIndex2)|v1857(VarCurr).
% 94.70/94.09  0 [] v1848(VarCurr,bitIndex2)| -v1857(VarCurr).
% 94.70/94.09  0 [] -v1848(VarCurr,bitIndex3)|v1850(VarCurr).
% 94.70/94.09  0 [] v1848(VarCurr,bitIndex3)| -v1850(VarCurr).
% 94.70/94.09  0 [] -v1862(VarCurr)|v1863(VarCurr).
% 94.70/94.09  0 [] -v1862(VarCurr)|v1866(VarCurr).
% 94.70/94.09  0 [] v1862(VarCurr)| -v1863(VarCurr)| -v1866(VarCurr).
% 94.70/94.09  0 [] -v1866(VarCurr)|v869(VarCurr,bitIndex0)|v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] v1866(VarCurr)| -v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1866(VarCurr)| -v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] -v1863(VarCurr)|v1864(VarCurr)|v1865(VarCurr).
% 94.70/94.09  0 [] v1863(VarCurr)| -v1864(VarCurr).
% 94.70/94.09  0 [] v1863(VarCurr)| -v1865(VarCurr).
% 94.70/94.09  0 [] v1865(VarCurr)|v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] -v1865(VarCurr)| -v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] v1864(VarCurr)|v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -v1864(VarCurr)| -v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -v1857(VarCurr)|v1858(VarCurr).
% 94.70/94.09  0 [] -v1857(VarCurr)|v1861(VarCurr).
% 94.70/94.09  0 [] v1857(VarCurr)| -v1858(VarCurr)| -v1861(VarCurr).
% 94.70/94.09  0 [] -v1861(VarCurr)|v1854(VarCurr)|v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] v1861(VarCurr)| -v1854(VarCurr).
% 94.70/94.09  0 [] v1861(VarCurr)| -v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] -v1858(VarCurr)|v1859(VarCurr)|v1860(VarCurr).
% 94.70/94.09  0 [] v1858(VarCurr)| -v1859(VarCurr).
% 94.70/94.09  0 [] v1858(VarCurr)| -v1860(VarCurr).
% 94.70/94.09  0 [] v1860(VarCurr)|v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] -v1860(VarCurr)| -v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] v1859(VarCurr)|v1854(VarCurr).
% 94.70/94.09  0 [] -v1859(VarCurr)| -v1854(VarCurr).
% 94.70/94.09  0 [] -v1850(VarCurr)|v1851(VarCurr).
% 94.70/94.09  0 [] -v1850(VarCurr)|v1856(VarCurr).
% 94.70/94.09  0 [] v1850(VarCurr)| -v1851(VarCurr)| -v1856(VarCurr).
% 94.70/94.09  0 [] -v1856(VarCurr)|v1853(VarCurr)|v869(VarCurr,bitIndex3).
% 94.70/94.09  0 [] v1856(VarCurr)| -v1853(VarCurr).
% 94.70/94.09  0 [] v1856(VarCurr)| -v869(VarCurr,bitIndex3).
% 94.70/94.09  0 [] -v1851(VarCurr)|v1852(VarCurr)|v1855(VarCurr).
% 94.70/94.09  0 [] v1851(VarCurr)| -v1852(VarCurr).
% 94.70/94.09  0 [] v1851(VarCurr)| -v1855(VarCurr).
% 94.70/94.09  0 [] v1855(VarCurr)|v869(VarCurr,bitIndex3).
% 94.70/94.09  0 [] -v1855(VarCurr)| -v869(VarCurr,bitIndex3).
% 94.70/94.09  0 [] v1852(VarCurr)|v1853(VarCurr).
% 94.70/94.09  0 [] -v1852(VarCurr)| -v1853(VarCurr).
% 94.70/94.09  0 [] -v1853(VarCurr)|v1854(VarCurr).
% 94.70/94.09  0 [] -v1853(VarCurr)|v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] v1853(VarCurr)| -v1854(VarCurr)| -v869(VarCurr,bitIndex2).
% 94.70/94.09  0 [] -v1854(VarCurr)|v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -v1854(VarCurr)|v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] v1854(VarCurr)| -v869(VarCurr,bitIndex0)| -v869(VarCurr,bitIndex1).
% 94.70/94.09  0 [] -v1847(VarCurr)| -v869(VarCurr,bitIndex3)|$T.
% 94.70/94.09  0 [] -v1847(VarCurr)|v869(VarCurr,bitIndex3)| -$T.
% 94.70/94.09  0 [] -v1847(VarCurr)| -v869(VarCurr,bitIndex2)|$T.
% 94.70/94.09  0 [] -v1847(VarCurr)|v869(VarCurr,bitIndex2)| -$T.
% 94.70/94.09  0 [] -v1847(VarCurr)| -v869(VarCurr,bitIndex1)|$T.
% 94.70/94.09  0 [] -v1847(VarCurr)|v869(VarCurr,bitIndex1)| -$T.
% 94.70/94.09  0 [] -v1847(VarCurr)| -v869(VarCurr,bitIndex0)|$T.
% 94.70/94.09  0 [] -v1847(VarCurr)|v869(VarCurr,bitIndex0)| -$T.
% 94.70/94.09  0 [] v1847(VarCurr)|v869(VarCurr,bitIndex3)|$T|v869(VarCurr,bitIndex2)|v869(VarCurr,bitIndex1)|v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] v1847(VarCurr)| -v869(VarCurr,bitIndex3)| -$T| -v869(VarCurr,bitIndex2)| -v869(VarCurr,bitIndex1)| -v869(VarCurr,bitIndex0).
% 94.70/94.09  0 [] -v873(VarCurr)|v875(VarCurr).
% 94.70/94.09  0 [] v873(VarCurr)| -v875(VarCurr).
% 94.70/94.09  0 [] -v875(VarCurr)|v877(VarCurr).
% 94.70/94.09  0 [] v875(VarCurr)| -v877(VarCurr).
% 94.70/94.09  0 [] -v877(VarCurr)|v879(VarCurr)|v1843(VarCurr).
% 94.70/94.09  0 [] v877(VarCurr)| -v879(VarCurr).
% 94.70/94.09  0 [] v877(VarCurr)| -v1843(VarCurr).
% 94.70/94.09  0 [] -v1843(VarCurr)|v31(VarCurr,bitIndex4).
% 94.70/94.09  0 [] v1843(VarCurr)| -v31(VarCurr,bitIndex4).
% 94.70/94.09  0 [] -v879(VarCurr)|v36(VarCurr,bitIndex6).
% 94.70/94.09  0 [] v879(VarCurr)| -v36(VarCurr,bitIndex6).
% 94.70/94.09  0 [] v1831(VarCurr)| -v36(VarCurr,bitIndex6)|$F.
% 94.70/94.09  0 [] v1831(VarCurr)|v36(VarCurr,bitIndex6)| -$F.
% 94.70/94.09  0 [] -v1831(VarCurr)| -v36(VarCurr,bitIndex6)|$T.
% 94.70/94.09  0 [] -v1831(VarCurr)|v36(VarCurr,bitIndex6)| -$T.
% 94.70/94.09  0 [] -v1831(VarCurr)|v1832(VarCurr)|v1840(VarCurr).
% 94.70/94.09  0 [] v1831(VarCurr)| -v1832(VarCurr).
% 94.70/94.09  0 [] v1831(VarCurr)| -v1840(VarCurr).
% 94.70/94.09  0 [] -v1840(VarCurr)|v1841(VarCurr).
% 94.70/94.09  0 [] -v1840(VarCurr)|v1821(VarCurr).
% 94.70/94.09  0 [] v1840(VarCurr)| -v1841(VarCurr)| -v1821(VarCurr).
% 94.70/94.09  0 [] v1841(VarCurr)|v38(VarCurr).
% 94.70/94.09  0 [] -v1841(VarCurr)| -v38(VarCurr).
% 94.70/94.09  0 [] -v1832(VarCurr)|v1833(VarCurr)|v1838(VarCurr).
% 94.70/94.09  0 [] v1832(VarCurr)| -v1833(VarCurr).
% 94.70/94.09  0 [] v1832(VarCurr)| -v1838(VarCurr).
% 94.70/94.09  0 [] -v1838(VarCurr)|v1839(VarCurr).
% 94.70/94.09  0 [] -v1838(VarCurr)|v1360(VarCurr).
% 94.70/94.09  0 [] v1838(VarCurr)| -v1839(VarCurr)| -v1360(VarCurr).
% 94.70/94.09  0 [] -v1839(VarCurr)|v1342(VarCurr).
% 94.70/94.09  0 [] -v1839(VarCurr)|v1812(VarCurr).
% 94.70/94.09  0 [] v1839(VarCurr)| -v1342(VarCurr)| -v1812(VarCurr).
% 94.70/94.09  0 [] -v1833(VarCurr)|v1834(VarCurr)|v1836(VarCurr).
% 94.70/94.09  0 [] v1833(VarCurr)| -v1834(VarCurr).
% 94.70/94.09  0 [] v1833(VarCurr)| -v1836(VarCurr).
% 94.70/94.09  0 [] -v1836(VarCurr)|v1837(VarCurr).
% 94.70/94.09  0 [] -v1836(VarCurr)|v1355(VarCurr).
% 94.70/94.09  0 [] v1836(VarCurr)| -v1837(VarCurr)| -v1355(VarCurr).
% 94.70/94.09  0 [] -v1837(VarCurr)|v1342(VarCurr).
% 94.70/94.09  0 [] -v1837(VarCurr)|v1812(VarCurr).
% 94.70/94.09  0 [] v1837(VarCurr)| -v1342(VarCurr)| -v1812(VarCurr).
% 94.70/94.09  0 [] -v1834(VarCurr)|v1835(VarCurr).
% 94.70/94.09  0 [] -v1834(VarCurr)|v1348(VarCurr).
% 94.70/94.09  0 [] v1834(VarCurr)| -v1835(VarCurr)| -v1348(VarCurr).
% 94.70/94.09  0 [] -v1835(VarCurr)|v1342(VarCurr).
% 94.70/94.09  0 [] -v1835(VarCurr)|v1812(VarCurr).
% 94.70/94.09  0 [] v1835(VarCurr)| -v1342(VarCurr)| -v1812(VarCurr).
% 94.70/94.09  0 [] -v31(VarNext,bitIndex11)|v1823(VarNext,bitIndex10).
% 94.70/94.09  0 [] v31(VarNext,bitIndex11)| -v1823(VarNext,bitIndex10).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.70/94.09  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1823(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)|v1823(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.70/94.10  0 [] -v1824(VarNext)| -range_10_0(B)| -v1823(VarNext,B)|v1253(VarNext,B).
% 94.70/94.10  0 [] -v1824(VarNext)| -range_10_0(B)|v1823(VarNext,B)| -v1253(VarNext,B).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1824(VarNext)|v1825(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1824(VarNext)| -v1825(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1825(VarNext)|v1827(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1825(VarNext)|v1240(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1825(VarNext)| -v1827(VarNext)| -v1240(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1827(VarNext)|v1247(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1827(VarNext)| -v1247(VarNext).
% 94.70/94.10  0 [] v1805(VarCurr)| -v36(VarCurr,bitIndex11)|$F.
% 94.70/94.10  0 [] v1805(VarCurr)|v36(VarCurr,bitIndex11)| -$F.
% 94.70/94.10  0 [] -v1805(VarCurr)| -v36(VarCurr,bitIndex11)|$T.
% 94.70/94.10  0 [] -v1805(VarCurr)|v36(VarCurr,bitIndex11)| -$T.
% 94.70/94.10  0 [] -v1805(VarCurr)|v1806(VarCurr)|v1820(VarCurr).
% 94.70/94.10  0 [] v1805(VarCurr)| -v1806(VarCurr).
% 94.70/94.10  0 [] v1805(VarCurr)| -v1820(VarCurr).
% 94.70/94.10  0 [] -v1820(VarCurr)|v38(VarCurr).
% 94.70/94.10  0 [] -v1820(VarCurr)|v1821(VarCurr).
% 94.70/94.10  0 [] v1820(VarCurr)| -v38(VarCurr)| -v1821(VarCurr).
% 94.70/94.10  0 [] -v1821(VarCurr)| -$T|v31(VarCurr,bitIndex11).
% 94.70/94.10  0 [] -v1821(VarCurr)|$T| -v31(VarCurr,bitIndex11).
% 94.70/94.10  0 [] v1821(VarCurr)|$T|v31(VarCurr,bitIndex11).
% 94.70/94.10  0 [] v1821(VarCurr)| -$T| -v31(VarCurr,bitIndex11).
% 94.70/94.10  0 [] -v1806(VarCurr)|v1807(VarCurr)|v1817(VarCurr).
% 94.70/94.10  0 [] v1806(VarCurr)| -v1807(VarCurr).
% 94.70/94.10  0 [] v1806(VarCurr)| -v1817(VarCurr).
% 94.70/94.10  0 [] -v1817(VarCurr)|v1818(VarCurr).
% 94.70/94.10  0 [] -v1817(VarCurr)|v1323(VarCurr).
% 94.70/94.10  0 [] v1817(VarCurr)| -v1818(VarCurr)| -v1323(VarCurr).
% 94.70/94.10  0 [] -v1818(VarCurr)|v1342(VarCurr).
% 94.70/94.10  0 [] -v1818(VarCurr)|v1812(VarCurr).
% 94.70/94.10  0 [] v1818(VarCurr)| -v1342(VarCurr)| -v1812(VarCurr).
% 94.70/94.10  0 [] -v1807(VarCurr)|v1808(VarCurr)|v1815(VarCurr).
% 94.70/94.10  0 [] v1807(VarCurr)| -v1808(VarCurr).
% 94.70/94.10  0 [] v1807(VarCurr)| -v1815(VarCurr).
% 94.70/94.10  0 [] -v1815(VarCurr)|v1816(VarCurr).
% 94.70/94.10  0 [] -v1815(VarCurr)|v1300(VarCurr).
% 94.70/94.10  0 [] v1815(VarCurr)| -v1816(VarCurr)| -v1300(VarCurr).
% 94.70/94.10  0 [] -v1816(VarCurr)|v1352(VarCurr).
% 94.70/94.10  0 [] -v1816(VarCurr)|v1812(VarCurr).
% 94.70/94.10  0 [] v1816(VarCurr)| -v1352(VarCurr)| -v1812(VarCurr).
% 94.70/94.10  0 [] -v1808(VarCurr)|v1809(VarCurr)|v1813(VarCurr).
% 94.70/94.10  0 [] v1808(VarCurr)| -v1809(VarCurr).
% 94.70/94.10  0 [] v1808(VarCurr)| -v1813(VarCurr).
% 94.70/94.10  0 [] -v1813(VarCurr)|v1814(VarCurr).
% 94.70/94.10  0 [] -v1813(VarCurr)|v1278(VarCurr).
% 94.70/94.10  0 [] v1813(VarCurr)| -v1814(VarCurr)| -v1278(VarCurr).
% 94.70/94.10  0 [] -v1814(VarCurr)|v1352(VarCurr).
% 94.70/94.10  0 [] -v1814(VarCurr)|v1812(VarCurr).
% 94.70/94.10  0 [] v1814(VarCurr)| -v1352(VarCurr)| -v1812(VarCurr).
% 94.70/94.10  0 [] -v1809(VarCurr)|v1810(VarCurr).
% 94.70/94.10  0 [] -v1809(VarCurr)|v1238(VarCurr).
% 94.70/94.10  0 [] v1809(VarCurr)| -v1810(VarCurr)| -v1238(VarCurr).
% 94.70/94.10  0 [] -v1810(VarCurr)|v1352(VarCurr).
% 94.70/94.10  0 [] -v1810(VarCurr)|v1812(VarCurr).
% 94.70/94.10  0 [] v1810(VarCurr)| -v1352(VarCurr)| -v1812(VarCurr).
% 94.70/94.10  0 [] v1812(VarCurr)|v1168(VarCurr).
% 94.70/94.10  0 [] -v1812(VarCurr)| -v1168(VarCurr).
% 94.70/94.10  0 [] -v907(VarCurr)|v909(VarCurr).
% 94.70/94.10  0 [] -v907(VarCurr)|v1150(VarCurr).
% 94.70/94.10  0 [] v907(VarCurr)| -v909(VarCurr)| -v1150(VarCurr).
% 94.70/94.10  0 [] -v909(VarCurr)|v911(VarCurr).
% 94.70/94.10  0 [] v909(VarCurr)| -v911(VarCurr).
% 94.70/94.10  0 [] -v911(VarCurr)|v913(VarCurr).
% 94.70/94.10  0 [] v911(VarCurr)| -v913(VarCurr).
% 94.70/94.10  0 [] -v913(VarCurr)|v1799(VarCurr).
% 94.70/94.10  0 [] -v913(VarCurr)|v1800(VarCurr).
% 94.70/94.10  0 [] v913(VarCurr)| -v1799(VarCurr)| -v1800(VarCurr).
% 94.70/94.10  0 [] v1800(VarCurr)|v1138(VarCurr).
% 94.70/94.10  0 [] -v1800(VarCurr)| -v1138(VarCurr).
% 94.70/94.10  0 [] v1799(VarCurr)|v915(VarCurr,bitIndex1).
% 94.70/94.10  0 [] -v1799(VarCurr)| -v915(VarCurr,bitIndex1).
% 94.70/94.10  0 [] -v915(VarCurr,bitIndex1)|v917(VarCurr,bitIndex1).
% 94.70/94.10  0 [] v915(VarCurr,bitIndex1)| -v917(VarCurr,bitIndex1).
% 94.70/94.10  0 [] -v917(VarCurr,bitIndex1)|v919(VarCurr,bitIndex17).
% 94.70/94.10  0 [] v917(VarCurr,bitIndex1)| -v919(VarCurr,bitIndex17).
% 94.70/94.10  0 [] -v919(VarCurr,bitIndex17)|v921(VarCurr,bitIndex17).
% 94.70/94.10  0 [] v919(VarCurr,bitIndex17)| -v921(VarCurr,bitIndex17).
% 94.70/94.10  0 [] -v921(VarCurr,bitIndex17)|v1017(VarCurr,bitIndex17).
% 94.70/94.10  0 [] v921(VarCurr,bitIndex17)| -v1017(VarCurr,bitIndex17).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1787(VarNext)| -range_3_0(B)| -v1019(VarNext,B)|v1019(VarCurr,B).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1787(VarNext)| -range_3_0(B)|v1019(VarNext,B)| -v1019(VarCurr,B).
% 94.70/94.10  0 [] -v1787(VarNext)| -range_3_0(B)| -v1019(VarNext,B)|v1795(VarNext,B).
% 94.70/94.10  0 [] -v1787(VarNext)| -range_3_0(B)|v1019(VarNext,B)| -v1795(VarNext,B).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v1795(VarNext,B)|v1793(VarCurr,B).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v1795(VarNext,B)| -v1793(VarCurr,B).
% 94.70/94.10  0 [] v991(VarCurr)| -range_3_0(B)| -v1793(VarCurr,B)|v1021(VarCurr,B).
% 94.70/94.10  0 [] v991(VarCurr)| -range_3_0(B)|v1793(VarCurr,B)| -v1021(VarCurr,B).
% 94.70/94.10  0 [] -v991(VarCurr)| -range_3_0(B)| -v1793(VarCurr,B)|$F.
% 94.70/94.10  0 [] -v991(VarCurr)| -range_3_0(B)|v1793(VarCurr,B)| -$F.
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1787(VarNext)|v1788(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1787(VarNext)| -v1788(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1788(VarNext)|v1790(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1788(VarNext)|v925(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1788(VarNext)| -v1790(VarNext)| -v925(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)|v1790(VarNext)|v984(VarNext).
% 94.70/94.10  0 [] -nextState(VarCurr,VarNext)| -v1790(VarNext)| -v984(VarNext).
% 94.70/94.10  0 [] v1023(VarCurr)| -range_3_0(B)| -v1021(VarCurr,B)|v1019(VarCurr,B).
% 94.70/94.10  0 [] v1023(VarCurr)| -range_3_0(B)|v1021(VarCurr,B)| -v1019(VarCurr,B).
% 94.70/94.10  0 [] -v1023(VarCurr)| -range_3_0(B)| -v1021(VarCurr,B)|v1764(VarCurr,B).
% 94.70/94.10  0 [] -v1023(VarCurr)| -range_3_0(B)|v1021(VarCurr,B)| -v1764(VarCurr,B).
% 94.70/94.10  0 [] v1765(VarCurr)| -range_3_0(B)| -v1764(VarCurr,B)|v1766(VarCurr,B).
% 94.70/94.10  0 [] v1765(VarCurr)| -range_3_0(B)|v1764(VarCurr,B)| -v1766(VarCurr,B).
% 94.70/94.10  0 [] -v1765(VarCurr)| -range_3_0(B)| -v1764(VarCurr,B)|$F.
% 94.70/94.10  0 [] -v1765(VarCurr)| -range_3_0(B)|v1764(VarCurr,B)| -$F.
% 94.70/94.10  0 [] -v1766(VarCurr,bitIndex0)|v1782(VarCurr).
% 94.70/94.10  0 [] v1766(VarCurr,bitIndex0)| -v1782(VarCurr).
% 94.70/94.10  0 [] -v1766(VarCurr,bitIndex1)|v1780(VarCurr).
% 94.70/94.10  0 [] v1766(VarCurr,bitIndex1)| -v1780(VarCurr).
% 94.70/94.10  0 [] -v1766(VarCurr,bitIndex2)|v1775(VarCurr).
% 94.70/94.10  0 [] v1766(VarCurr,bitIndex2)| -v1775(VarCurr).
% 94.70/94.10  0 [] -v1766(VarCurr,bitIndex3)|v1768(VarCurr).
% 94.70/94.10  0 [] v1766(VarCurr,bitIndex3)| -v1768(VarCurr).
% 94.70/94.10  0 [] -v1780(VarCurr)|v1781(VarCurr).
% 94.70/94.10  0 [] -v1780(VarCurr)|v1784(VarCurr).
% 94.70/94.10  0 [] v1780(VarCurr)| -v1781(VarCurr)| -v1784(VarCurr).
% 94.70/94.10  0 [] -v1784(VarCurr)|v1019(VarCurr,bitIndex0)|v1019(VarCurr,bitIndex1).
% 94.70/94.10  0 [] v1784(VarCurr)| -v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] v1784(VarCurr)| -v1019(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -v1781(VarCurr)|v1782(VarCurr)|v1783(VarCurr).
% 94.70/94.11  0 [] v1781(VarCurr)| -v1782(VarCurr).
% 94.70/94.11  0 [] v1781(VarCurr)| -v1783(VarCurr).
% 94.70/94.11  0 [] v1783(VarCurr)|v1019(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -v1783(VarCurr)| -v1019(VarCurr,bitIndex1).
% 94.70/94.11  0 [] v1782(VarCurr)|v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1782(VarCurr)| -v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1775(VarCurr)|v1776(VarCurr).
% 94.70/94.11  0 [] -v1775(VarCurr)|v1779(VarCurr).
% 94.70/94.11  0 [] v1775(VarCurr)| -v1776(VarCurr)| -v1779(VarCurr).
% 94.70/94.11  0 [] -v1779(VarCurr)|v1772(VarCurr)|v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] v1779(VarCurr)| -v1772(VarCurr).
% 94.70/94.11  0 [] v1779(VarCurr)| -v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] -v1776(VarCurr)|v1777(VarCurr)|v1778(VarCurr).
% 94.70/94.11  0 [] v1776(VarCurr)| -v1777(VarCurr).
% 94.70/94.11  0 [] v1776(VarCurr)| -v1778(VarCurr).
% 94.70/94.11  0 [] v1778(VarCurr)|v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] -v1778(VarCurr)| -v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] v1777(VarCurr)|v1772(VarCurr).
% 94.70/94.11  0 [] -v1777(VarCurr)| -v1772(VarCurr).
% 94.70/94.11  0 [] -v1768(VarCurr)|v1769(VarCurr).
% 94.70/94.11  0 [] -v1768(VarCurr)|v1774(VarCurr).
% 94.70/94.11  0 [] v1768(VarCurr)| -v1769(VarCurr)| -v1774(VarCurr).
% 94.70/94.11  0 [] -v1774(VarCurr)|v1771(VarCurr)|v1019(VarCurr,bitIndex3).
% 94.70/94.11  0 [] v1774(VarCurr)| -v1771(VarCurr).
% 94.70/94.11  0 [] v1774(VarCurr)| -v1019(VarCurr,bitIndex3).
% 94.70/94.11  0 [] -v1769(VarCurr)|v1770(VarCurr)|v1773(VarCurr).
% 94.70/94.11  0 [] v1769(VarCurr)| -v1770(VarCurr).
% 94.70/94.11  0 [] v1769(VarCurr)| -v1773(VarCurr).
% 94.70/94.11  0 [] v1773(VarCurr)|v1019(VarCurr,bitIndex3).
% 94.70/94.11  0 [] -v1773(VarCurr)| -v1019(VarCurr,bitIndex3).
% 94.70/94.11  0 [] v1770(VarCurr)|v1771(VarCurr).
% 94.70/94.11  0 [] -v1770(VarCurr)| -v1771(VarCurr).
% 94.70/94.11  0 [] -v1771(VarCurr)|v1772(VarCurr).
% 94.70/94.11  0 [] -v1771(VarCurr)|v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] v1771(VarCurr)| -v1772(VarCurr)| -v1019(VarCurr,bitIndex2).
% 94.70/94.11  0 [] -v1772(VarCurr)|v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1772(VarCurr)|v1019(VarCurr,bitIndex1).
% 94.70/94.11  0 [] v1772(VarCurr)| -v1019(VarCurr,bitIndex0)| -v1019(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -v1765(VarCurr)| -v1019(VarCurr,bitIndex3)|$T.
% 94.70/94.11  0 [] -v1765(VarCurr)|v1019(VarCurr,bitIndex3)| -$T.
% 94.70/94.11  0 [] -v1765(VarCurr)| -v1019(VarCurr,bitIndex2)|$T.
% 94.70/94.11  0 [] -v1765(VarCurr)|v1019(VarCurr,bitIndex2)| -$T.
% 94.70/94.11  0 [] -v1765(VarCurr)| -v1019(VarCurr,bitIndex1)|$T.
% 94.70/94.11  0 [] -v1765(VarCurr)|v1019(VarCurr,bitIndex1)| -$T.
% 94.70/94.11  0 [] -v1765(VarCurr)| -v1019(VarCurr,bitIndex0)|$T.
% 94.70/94.11  0 [] -v1765(VarCurr)|v1019(VarCurr,bitIndex0)| -$T.
% 94.70/94.11  0 [] v1765(VarCurr)|v1019(VarCurr,bitIndex3)|$T|v1019(VarCurr,bitIndex2)|v1019(VarCurr,bitIndex1)|v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] v1765(VarCurr)| -v1019(VarCurr,bitIndex3)| -$T| -v1019(VarCurr,bitIndex2)| -v1019(VarCurr,bitIndex1)| -v1019(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1023(VarCurr)|v1025(VarCurr).
% 94.70/94.11  0 [] v1023(VarCurr)| -v1025(VarCurr).
% 94.70/94.11  0 [] -v1025(VarCurr)|v1027(VarCurr).
% 94.70/94.11  0 [] v1025(VarCurr)| -v1027(VarCurr).
% 94.70/94.11  0 [] -v1027(VarCurr)|v1761(VarCurr)|v1160(VarCurr).
% 94.70/94.11  0 [] v1027(VarCurr)| -v1761(VarCurr).
% 94.70/94.11  0 [] v1027(VarCurr)| -v1160(VarCurr).
% 94.70/94.11  0 [] -v1761(VarCurr)|v1762(VarCurr)|v85(VarCurr).
% 94.70/94.11  0 [] v1761(VarCurr)| -v1762(VarCurr).
% 94.70/94.11  0 [] v1761(VarCurr)| -v85(VarCurr).
% 94.70/94.11  0 [] -v1762(VarCurr)|v1029(VarCurr)|v1148(VarCurr).
% 94.70/94.11  0 [] v1762(VarCurr)| -v1029(VarCurr).
% 94.70/94.11  0 [] v1762(VarCurr)| -v1148(VarCurr).
% 94.70/94.11  0 [] -v1160(VarCurr)|v31(VarCurr,bitIndex1).
% 94.70/94.11  0 [] v1160(VarCurr)| -v31(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -v31(VarNext,bitIndex1)|v1753(VarNext,bitIndex0).
% 94.70/94.11  0 [] v31(VarNext,bitIndex1)| -v1753(VarNext,bitIndex0).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1753(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)|v1753(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.70/94.11  0 [] -v1754(VarNext)| -range_10_0(B)| -v1753(VarNext,B)|v1253(VarNext,B).
% 94.70/94.11  0 [] -v1754(VarNext)| -range_10_0(B)|v1753(VarNext,B)| -v1253(VarNext,B).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1754(VarNext)|v1755(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1754(VarNext)| -v1755(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1755(VarNext)|v1757(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1755(VarNext)|v1240(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1755(VarNext)| -v1757(VarNext)| -v1240(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1757(VarNext)|v1247(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1757(VarNext)| -v1247(VarNext).
% 94.70/94.11  0 [] v1730(VarCurr)| -v36(VarCurr,bitIndex1)|$F.
% 94.70/94.11  0 [] v1730(VarCurr)|v36(VarCurr,bitIndex1)| -$F.
% 94.70/94.11  0 [] -v1730(VarCurr)| -v36(VarCurr,bitIndex1)|$T.
% 94.70/94.11  0 [] -v1730(VarCurr)|v36(VarCurr,bitIndex1)| -$T.
% 94.70/94.11  0 [] -v1730(VarCurr)|v1731(VarCurr)|v1750(VarCurr).
% 94.70/94.11  0 [] v1730(VarCurr)| -v1731(VarCurr).
% 94.70/94.11  0 [] v1730(VarCurr)| -v1750(VarCurr).
% 94.70/94.11  0 [] -v1750(VarCurr)|v1751(VarCurr).
% 94.70/94.11  0 [] -v1750(VarCurr)|v1323(VarCurr).
% 94.70/94.11  0 [] v1750(VarCurr)| -v1751(VarCurr)| -v1323(VarCurr).
% 94.70/94.11  0 [] -v1751(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1751(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1751(VarCurr)| -v1677(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1731(VarCurr)|v1732(VarCurr)|v1748(VarCurr).
% 94.70/94.11  0 [] v1731(VarCurr)| -v1732(VarCurr).
% 94.70/94.11  0 [] v1731(VarCurr)| -v1748(VarCurr).
% 94.70/94.11  0 [] -v1748(VarCurr)|v1749(VarCurr).
% 94.70/94.11  0 [] -v1748(VarCurr)|v1300(VarCurr).
% 94.70/94.11  0 [] v1748(VarCurr)| -v1749(VarCurr)| -v1300(VarCurr).
% 94.70/94.11  0 [] -v1749(VarCurr)|v1689(VarCurr).
% 94.70/94.11  0 [] -v1749(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1749(VarCurr)| -v1689(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1732(VarCurr)|v1733(VarCurr)|v1746(VarCurr).
% 94.70/94.11  0 [] v1732(VarCurr)| -v1733(VarCurr).
% 94.70/94.11  0 [] v1732(VarCurr)| -v1746(VarCurr).
% 94.70/94.11  0 [] -v1746(VarCurr)|v1747(VarCurr).
% 94.70/94.11  0 [] -v1746(VarCurr)|v1360(VarCurr).
% 94.70/94.11  0 [] v1746(VarCurr)| -v1747(VarCurr)| -v1360(VarCurr).
% 94.70/94.11  0 [] -v1747(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1747(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1747(VarCurr)| -v1677(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1733(VarCurr)|v1734(VarCurr)|v1744(VarCurr).
% 94.70/94.11  0 [] v1733(VarCurr)| -v1734(VarCurr).
% 94.70/94.11  0 [] v1733(VarCurr)| -v1744(VarCurr).
% 94.70/94.11  0 [] -v1744(VarCurr)|v1745(VarCurr).
% 94.70/94.11  0 [] -v1744(VarCurr)|v1278(VarCurr).
% 94.70/94.11  0 [] v1744(VarCurr)| -v1745(VarCurr)| -v1278(VarCurr).
% 94.70/94.11  0 [] -v1745(VarCurr)|v1689(VarCurr).
% 94.70/94.11  0 [] -v1745(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1745(VarCurr)| -v1689(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1734(VarCurr)|v1735(VarCurr)|v1742(VarCurr).
% 94.70/94.11  0 [] v1734(VarCurr)| -v1735(VarCurr).
% 94.70/94.11  0 [] v1734(VarCurr)| -v1742(VarCurr).
% 94.70/94.11  0 [] -v1742(VarCurr)|v1743(VarCurr).
% 94.70/94.11  0 [] -v1742(VarCurr)|v1355(VarCurr).
% 94.70/94.11  0 [] v1742(VarCurr)| -v1743(VarCurr)| -v1355(VarCurr).
% 94.70/94.11  0 [] -v1743(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1743(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1743(VarCurr)| -v1677(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1735(VarCurr)|v1736(VarCurr)|v1739(VarCurr).
% 94.70/94.11  0 [] v1735(VarCurr)| -v1736(VarCurr).
% 94.70/94.11  0 [] v1735(VarCurr)| -v1739(VarCurr).
% 94.70/94.11  0 [] -v1739(VarCurr)|v1740(VarCurr).
% 94.70/94.11  0 [] -v1739(VarCurr)|v1238(VarCurr).
% 94.70/94.11  0 [] v1739(VarCurr)| -v1740(VarCurr)| -v1238(VarCurr).
% 94.70/94.11  0 [] -v1740(VarCurr)|v1689(VarCurr).
% 94.70/94.11  0 [] -v1740(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1740(VarCurr)| -v1689(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -v1736(VarCurr)|v1737(VarCurr).
% 94.70/94.11  0 [] -v1736(VarCurr)|v1348(VarCurr).
% 94.70/94.11  0 [] v1736(VarCurr)| -v1737(VarCurr)| -v1348(VarCurr).
% 94.70/94.11  0 [] -v1737(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1737(VarCurr)|v907(VarCurr).
% 94.70/94.11  0 [] v1737(VarCurr)| -v1677(VarCurr)| -v907(VarCurr).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1717(VarNext)| -v31(VarNext,bitIndex0)|v31(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1717(VarNext)|v31(VarNext,bitIndex0)| -v31(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1717(VarNext)| -v31(VarNext,bitIndex0)|v1725(VarNext).
% 94.70/94.11  0 [] -v1717(VarNext)|v31(VarNext,bitIndex0)| -v1725(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1725(VarNext)|v1723(VarCurr).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1725(VarNext)| -v1723(VarCurr).
% 94.70/94.11  0 [] v1254(VarCurr)| -v1723(VarCurr)|v36(VarCurr,bitIndex0).
% 94.70/94.11  0 [] v1254(VarCurr)|v1723(VarCurr)| -v36(VarCurr,bitIndex0).
% 94.70/94.11  0 [] -v1254(VarCurr)| -v1723(VarCurr)|$T.
% 94.70/94.11  0 [] -v1254(VarCurr)|v1723(VarCurr)| -$T.
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1717(VarNext)|v1718(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1717(VarNext)| -v1718(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1718(VarNext)|v1720(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1718(VarNext)|v1240(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1718(VarNext)| -v1720(VarNext)| -v1240(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)|v1720(VarNext)|v1247(VarNext).
% 94.70/94.11  0 [] -nextState(VarCurr,VarNext)| -v1720(VarNext)| -v1247(VarNext).
% 94.70/94.11  0 [] v1660(VarCurr)| -v36(VarCurr,bitIndex0)|$F.
% 94.70/94.11  0 [] v1660(VarCurr)|v36(VarCurr,bitIndex0)| -$F.
% 94.70/94.11  0 [] -v1660(VarCurr)| -v36(VarCurr,bitIndex0)|$T.
% 94.70/94.11  0 [] -v1660(VarCurr)|v36(VarCurr,bitIndex0)| -$T.
% 94.70/94.11  0 [] -v1660(VarCurr)|v1661(VarCurr)|v1711(VarCurr).
% 94.70/94.11  0 [] v1660(VarCurr)| -v1661(VarCurr).
% 94.70/94.11  0 [] v1660(VarCurr)| -v1711(VarCurr).
% 94.70/94.11  0 [] -v1711(VarCurr)|v1712(VarCurr).
% 94.70/94.11  0 [] -v1711(VarCurr)|v1323(VarCurr).
% 94.70/94.11  0 [] v1711(VarCurr)| -v1712(VarCurr)| -v1323(VarCurr).
% 94.70/94.11  0 [] -v1712(VarCurr)|v1713(VarCurr)|v1714(VarCurr).
% 94.70/94.11  0 [] v1712(VarCurr)| -v1713(VarCurr).
% 94.70/94.11  0 [] v1712(VarCurr)| -v1714(VarCurr).
% 94.70/94.11  0 [] -v1714(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1714(VarCurr)|v1682(VarCurr).
% 94.70/94.11  0 [] v1714(VarCurr)| -v1677(VarCurr)| -v1682(VarCurr).
% 94.70/94.11  0 [] -v1713(VarCurr)|v1671(VarCurr).
% 94.70/94.11  0 [] v1713(VarCurr)| -v1671(VarCurr).
% 94.70/94.11  0 [] -v1661(VarCurr)|v1662(VarCurr)|v1707(VarCurr).
% 94.70/94.11  0 [] v1661(VarCurr)| -v1662(VarCurr).
% 94.70/94.11  0 [] v1661(VarCurr)| -v1707(VarCurr).
% 94.70/94.11  0 [] -v1707(VarCurr)|v1708(VarCurr).
% 94.70/94.11  0 [] -v1707(VarCurr)|v1300(VarCurr).
% 94.70/94.11  0 [] v1707(VarCurr)| -v1708(VarCurr)| -v1300(VarCurr).
% 94.70/94.11  0 [] -v1708(VarCurr)|v1709(VarCurr)|v1710(VarCurr).
% 94.70/94.11  0 [] v1708(VarCurr)| -v1709(VarCurr).
% 94.70/94.11  0 [] v1708(VarCurr)| -v1710(VarCurr).
% 94.70/94.11  0 [] -v1710(VarCurr)|v1689(VarCurr).
% 94.70/94.11  0 [] -v1710(VarCurr)|v1682(VarCurr).
% 94.70/94.11  0 [] v1710(VarCurr)| -v1689(VarCurr)| -v1682(VarCurr).
% 94.70/94.11  0 [] -v1709(VarCurr)|v1671(VarCurr).
% 94.70/94.11  0 [] -v1709(VarCurr)|v1180(VarCurr).
% 94.70/94.11  0 [] v1709(VarCurr)| -v1671(VarCurr)| -v1180(VarCurr).
% 94.70/94.11  0 [] -v1662(VarCurr)|v1663(VarCurr)|v1701(VarCurr).
% 94.70/94.11  0 [] v1662(VarCurr)| -v1663(VarCurr).
% 94.70/94.11  0 [] v1662(VarCurr)| -v1701(VarCurr).
% 94.70/94.11  0 [] -v1701(VarCurr)|v1702(VarCurr).
% 94.70/94.11  0 [] -v1701(VarCurr)|v1360(VarCurr).
% 94.70/94.11  0 [] v1701(VarCurr)| -v1702(VarCurr)| -v1360(VarCurr).
% 94.70/94.11  0 [] -v1702(VarCurr)|v1703(VarCurr)|v1706(VarCurr).
% 94.70/94.11  0 [] v1702(VarCurr)| -v1703(VarCurr).
% 94.70/94.11  0 [] v1702(VarCurr)| -v1706(VarCurr).
% 94.70/94.11  0 [] -v1706(VarCurr)|v1677(VarCurr).
% 94.70/94.11  0 [] -v1706(VarCurr)|v1682(VarCurr).
% 94.70/94.11  0 [] v1706(VarCurr)| -v1677(VarCurr)| -v1682(VarCurr).
% 94.70/94.11  0 [] -v1703(VarCurr)|v1704(VarCurr)|v1705(VarCurr).
% 94.70/94.11  0 [] v1703(VarCurr)| -v1704(VarCurr).
% 94.70/94.11  0 [] v1703(VarCurr)| -v1705(VarCurr).
% 94.70/94.11  0 [] -v1705(VarCurr)|v1671(VarCurr).
% 94.70/94.11  0 [] v1705(VarCurr)| -v1671(VarCurr).
% 94.70/94.11  0 [] -v1704(VarCurr)|v38(VarCurr).
% 94.70/94.11  0 [] v1704(VarCurr)| -v38(VarCurr).
% 94.70/94.11  0 [] -v1663(VarCurr)|v1664(VarCurr)|v1697(VarCurr).
% 94.70/94.11  0 [] v1663(VarCurr)| -v1664(VarCurr).
% 94.70/94.11  0 [] v1663(VarCurr)| -v1697(VarCurr).
% 94.70/94.12  0 [] -v1697(VarCurr)|v1698(VarCurr).
% 94.70/94.12  0 [] -v1697(VarCurr)|v1278(VarCurr).
% 94.70/94.12  0 [] v1697(VarCurr)| -v1698(VarCurr)| -v1278(VarCurr).
% 94.70/94.12  0 [] -v1698(VarCurr)|v1699(VarCurr)|v1700(VarCurr).
% 94.70/94.12  0 [] v1698(VarCurr)| -v1699(VarCurr).
% 94.70/94.12  0 [] v1698(VarCurr)| -v1700(VarCurr).
% 94.70/94.12  0 [] -v1700(VarCurr)|v1689(VarCurr).
% 94.70/94.12  0 [] -v1700(VarCurr)|v1682(VarCurr).
% 94.70/94.12  0 [] v1700(VarCurr)| -v1689(VarCurr)| -v1682(VarCurr).
% 94.70/94.12  0 [] -v1699(VarCurr)|v1671(VarCurr).
% 94.70/94.12  0 [] -v1699(VarCurr)|v1180(VarCurr).
% 94.70/94.12  0 [] v1699(VarCurr)| -v1671(VarCurr)| -v1180(VarCurr).
% 94.70/94.12  0 [] -v1664(VarCurr)|v1665(VarCurr)|v1691(VarCurr).
% 94.70/94.12  0 [] v1664(VarCurr)| -v1665(VarCurr).
% 94.70/94.12  0 [] v1664(VarCurr)| -v1691(VarCurr).
% 94.70/94.12  0 [] -v1691(VarCurr)|v1692(VarCurr).
% 94.70/94.12  0 [] -v1691(VarCurr)|v1355(VarCurr).
% 94.70/94.12  0 [] v1691(VarCurr)| -v1692(VarCurr)| -v1355(VarCurr).
% 94.70/94.12  0 [] -v1692(VarCurr)|v1693(VarCurr)|v1696(VarCurr).
% 94.70/94.12  0 [] v1692(VarCurr)| -v1693(VarCurr).
% 94.70/94.12  0 [] v1692(VarCurr)| -v1696(VarCurr).
% 94.70/94.12  0 [] -v1696(VarCurr)|v1677(VarCurr).
% 94.70/94.12  0 [] -v1696(VarCurr)|v1682(VarCurr).
% 94.70/94.12  0 [] v1696(VarCurr)| -v1677(VarCurr)| -v1682(VarCurr).
% 94.70/94.12  0 [] -v1693(VarCurr)|v1694(VarCurr)|v1695(VarCurr).
% 94.70/94.12  0 [] v1693(VarCurr)| -v1694(VarCurr).
% 94.70/94.12  0 [] v1693(VarCurr)| -v1695(VarCurr).
% 94.70/94.12  0 [] -v1695(VarCurr)|v1671(VarCurr).
% 94.70/94.12  0 [] v1695(VarCurr)| -v1671(VarCurr).
% 94.70/94.12  0 [] -v1694(VarCurr)|v38(VarCurr).
% 94.70/94.12  0 [] v1694(VarCurr)| -v38(VarCurr).
% 94.70/94.12  0 [] -v1665(VarCurr)|v1666(VarCurr)|v1683(VarCurr).
% 94.70/94.12  0 [] v1665(VarCurr)| -v1666(VarCurr).
% 94.70/94.12  0 [] v1665(VarCurr)| -v1683(VarCurr).
% 94.70/94.12  0 [] -v1683(VarCurr)|v1684(VarCurr).
% 94.70/94.12  0 [] -v1683(VarCurr)|v1238(VarCurr).
% 94.70/94.12  0 [] v1683(VarCurr)| -v1684(VarCurr)| -v1238(VarCurr).
% 94.70/94.12  0 [] -v1684(VarCurr)|v1685(VarCurr)|v1687(VarCurr).
% 94.70/94.12  0 [] v1684(VarCurr)| -v1685(VarCurr).
% 94.70/94.12  0 [] v1684(VarCurr)| -v1687(VarCurr).
% 94.70/94.12  0 [] -v1687(VarCurr)|v1689(VarCurr).
% 94.70/94.12  0 [] -v1687(VarCurr)|v1682(VarCurr).
% 94.70/94.12  0 [] v1687(VarCurr)| -v1689(VarCurr)| -v1682(VarCurr).
% 94.70/94.12  0 [] -v1689(VarCurr)|v1690(VarCurr).
% 94.70/94.12  0 [] -v1689(VarCurr)|v1681(VarCurr).
% 94.70/94.12  0 [] v1689(VarCurr)| -v1690(VarCurr)| -v1681(VarCurr).
% 94.70/94.12  0 [] -v1690(VarCurr)|v1678(VarCurr).
% 94.70/94.12  0 [] -v1690(VarCurr)|v1180(VarCurr).
% 94.70/94.12  0 [] v1690(VarCurr)| -v1678(VarCurr)| -v1180(VarCurr).
% 94.70/94.12  0 [] -v1685(VarCurr)|v1671(VarCurr).
% 94.70/94.12  0 [] -v1685(VarCurr)|v1180(VarCurr).
% 94.70/94.12  0 [] v1685(VarCurr)| -v1671(VarCurr)| -v1180(VarCurr).
% 94.70/94.12  0 [] -v1671(VarCurr)|v1672(VarCurr).
% 94.70/94.12  0 [] -v1671(VarCurr)|v1347(VarCurr).
% 94.70/94.12  0 [] v1671(VarCurr)| -v1672(VarCurr)| -v1347(VarCurr).
% 94.70/94.12  0 [] -v1666(VarCurr)|v1667(VarCurr).
% 94.70/94.12  0 [] -v1666(VarCurr)|v1348(VarCurr).
% 94.70/94.12  0 [] v1666(VarCurr)| -v1667(VarCurr)| -v1348(VarCurr).
% 94.70/94.12  0 [] -v1667(VarCurr)|v1668(VarCurr)|v1675(VarCurr).
% 94.70/94.12  0 [] v1667(VarCurr)| -v1668(VarCurr).
% 94.70/94.12  0 [] v1667(VarCurr)| -v1675(VarCurr).
% 94.70/94.12  0 [] -v1675(VarCurr)|v1677(VarCurr).
% 94.70/94.12  0 [] -v1675(VarCurr)|v1682(VarCurr).
% 94.70/94.12  0 [] v1675(VarCurr)| -v1677(VarCurr)| -v1682(VarCurr).
% 94.70/94.12  0 [] v1682(VarCurr)|v907(VarCurr).
% 94.70/94.12  0 [] -v1682(VarCurr)| -v907(VarCurr).
% 94.70/94.12  0 [] -v1677(VarCurr)|v1678(VarCurr).
% 94.70/94.12  0 [] -v1677(VarCurr)|v1681(VarCurr).
% 94.70/94.12  0 [] v1677(VarCurr)| -v1678(VarCurr)| -v1681(VarCurr).
% 94.70/94.12  0 [] v1681(VarCurr)|v1162(VarCurr).
% 94.70/94.12  0 [] -v1681(VarCurr)| -v1162(VarCurr).
% 94.70/94.12  0 [] -v1678(VarCurr)|v1679(VarCurr).
% 94.70/94.12  0 [] -v1678(VarCurr)|v1347(VarCurr).
% 94.70/94.12  0 [] v1678(VarCurr)| -v1679(VarCurr)| -v1347(VarCurr).
% 94.70/94.12  0 [] -v1679(VarCurr)|v1680(VarCurr).
% 94.70/94.12  0 [] -v1679(VarCurr)|v1346(VarCurr).
% 94.70/94.12  0 [] v1679(VarCurr)| -v1680(VarCurr)| -v1346(VarCurr).
% 94.70/94.12  0 [] -v1680(VarCurr)|v87(VarCurr).
% 94.70/94.12  0 [] -v1680(VarCurr)|v1674(VarCurr).
% 94.70/94.12  0 [] v1680(VarCurr)| -v87(VarCurr)| -v1674(VarCurr).
% 94.70/94.12  0 [] -v1668(VarCurr)|v1669(VarCurr)|v1670(VarCurr).
% 94.70/94.12  0 [] v1668(VarCurr)| -v1669(VarCurr).
% 94.70/94.12  0 [] v1668(VarCurr)| -v1670(VarCurr).
% 94.70/94.12  0 [] -v1670(VarCurr)|v1672(VarCurr).
% 94.70/94.12  0 [] -v1670(VarCurr)|v1347(VarCurr).
% 94.70/94.12  0 [] v1670(VarCurr)| -v1672(VarCurr)| -v1347(VarCurr).
% 94.70/94.12  0 [] -v1672(VarCurr)|v1673(VarCurr).
% 94.70/94.12  0 [] -v1672(VarCurr)|v1346(VarCurr).
% 94.70/94.12  0 [] v1672(VarCurr)| -v1673(VarCurr)| -v1346(VarCurr).
% 94.70/94.12  0 [] -v1673(VarCurr)|v1345(VarCurr).
% 94.70/94.12  0 [] -v1673(VarCurr)|v1674(VarCurr).
% 94.70/94.12  0 [] v1673(VarCurr)| -v1345(VarCurr)| -v1674(VarCurr).
% 94.70/94.12  0 [] v1674(VarCurr)|v881(VarCurr).
% 94.70/94.12  0 [] -v1674(VarCurr)| -v881(VarCurr).
% 94.70/94.12  0 [] -v1669(VarCurr)|v38(VarCurr).
% 94.70/94.12  0 [] v1669(VarCurr)| -v38(VarCurr).
% 94.70/94.12  0 [] -v1180(VarCurr)|v1182(VarCurr).
% 94.70/94.12  0 [] v1180(VarCurr)| -v1182(VarCurr).
% 94.70/94.12  0 [] -v1182(VarCurr)|v1184(VarCurr).
% 94.70/94.12  0 [] v1182(VarCurr)| -v1184(VarCurr).
% 94.70/94.12  0 [] -v1184(VarCurr)|v1186(VarCurr).
% 94.70/94.12  0 [] -v1184(VarCurr)|v1656(VarCurr).
% 94.70/94.12  0 [] v1184(VarCurr)| -v1186(VarCurr)| -v1656(VarCurr).
% 94.70/94.12  0 [] -v1656(VarCurr)|v1377(VarCurr,bitIndex2)|v1377(VarCurr,bitIndex4).
% 94.70/94.12  0 [] v1656(VarCurr)| -v1377(VarCurr,bitIndex2).
% 94.70/94.12  0 [] v1656(VarCurr)| -v1377(VarCurr,bitIndex4).
% 94.70/94.12  0 [] -v1186(VarCurr)|v1188(VarCurr).
% 94.70/94.12  0 [] v1186(VarCurr)| -v1188(VarCurr).
% 94.70/94.12  0 [] -v1188(VarCurr)|v1190(VarCurr).
% 94.70/94.12  0 [] v1188(VarCurr)| -v1190(VarCurr).
% 94.70/94.12  0 [] -v1190(VarCurr)|v1192(VarCurr).
% 94.70/94.12  0 [] v1190(VarCurr)| -v1192(VarCurr).
% 94.70/94.12  0 [] -v1192(VarCurr)|v1194(VarCurr).
% 94.70/94.12  0 [] v1192(VarCurr)| -v1194(VarCurr).
% 94.70/94.12  0 [] -v1194(VarCurr)|v1196(VarCurr).
% 94.70/94.12  0 [] v1194(VarCurr)| -v1196(VarCurr).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1643(VarNext)| -v1196(VarNext)|v1196(VarCurr).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1643(VarNext)|v1196(VarNext)| -v1196(VarCurr).
% 94.70/94.12  0 [] -v1643(VarNext)| -v1196(VarNext)|v1651(VarNext).
% 94.70/94.12  0 [] -v1643(VarNext)|v1196(VarNext)| -v1651(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)| -v1651(VarNext)|v1649(VarCurr).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1651(VarNext)| -v1649(VarCurr).
% 94.70/94.12  0 [] v1652(VarCurr)| -v1649(VarCurr)|v1202(VarCurr).
% 94.70/94.12  0 [] v1652(VarCurr)|v1649(VarCurr)| -v1202(VarCurr).
% 94.70/94.12  0 [] -v1652(VarCurr)| -v1649(VarCurr)|$F.
% 94.70/94.12  0 [] -v1652(VarCurr)|v1649(VarCurr)| -$F.
% 94.70/94.12  0 [] v1652(VarCurr)|v1198(VarCurr).
% 94.70/94.12  0 [] -v1652(VarCurr)| -v1198(VarCurr).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)| -v1643(VarNext)|v1644(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1643(VarNext)| -v1644(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)| -v1644(VarNext)|v1645(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)| -v1644(VarNext)|v1540(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1644(VarNext)| -v1645(VarNext)| -v1540(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)|v1645(VarNext)|v1549(VarNext).
% 94.70/94.12  0 [] -nextState(VarCurr,VarNext)| -v1645(VarNext)| -v1549(VarNext).
% 94.70/94.12  0 [] v1602(VarCurr)| -v1202(VarCurr)|$F.
% 94.70/94.12  0 [] v1602(VarCurr)|v1202(VarCurr)| -$F.
% 94.70/94.12  0 [] -v1602(VarCurr)| -v1202(VarCurr)|v1626(VarCurr).
% 94.70/94.12  0 [] -v1602(VarCurr)|v1202(VarCurr)| -v1626(VarCurr).
% 94.70/94.12  0 [] v1563(VarCurr)| -v1626(VarCurr)|$F.
% 94.70/94.12  0 [] v1563(VarCurr)|v1626(VarCurr)| -$F.
% 94.70/94.12  0 [] -v1563(VarCurr)| -v1626(VarCurr)|v1627(VarCurr).
% 94.70/94.12  0 [] -v1563(VarCurr)|v1626(VarCurr)| -v1627(VarCurr).
% 94.70/94.12  0 [] -v1633(VarCurr)|v1635(VarCurr)|v1615(VarCurr).
% 94.70/94.12  0 [] v1633(VarCurr)| -v1635(VarCurr).
% 94.70/94.12  0 [] v1633(VarCurr)| -v1615(VarCurr).
% 94.70/94.12  0 [] -v1635(VarCurr)|v1636(VarCurr)|v1614(VarCurr).
% 94.70/94.12  0 [] v1635(VarCurr)| -v1636(VarCurr).
% 94.70/94.12  0 [] v1635(VarCurr)| -v1614(VarCurr).
% 94.70/94.12  0 [] -v1636(VarCurr)|v1637(VarCurr)|v1613(VarCurr).
% 94.70/94.12  0 [] v1636(VarCurr)| -v1637(VarCurr).
% 94.70/94.12  0 [] v1636(VarCurr)| -v1613(VarCurr).
% 94.70/94.12  0 [] -v1637(VarCurr)|v1638(VarCurr)|v1583(VarCurr).
% 94.70/94.12  0 [] v1637(VarCurr)| -v1638(VarCurr).
% 94.70/94.12  0 [] v1637(VarCurr)| -v1583(VarCurr).
% 94.70/94.12  0 [] -v1638(VarCurr)|v1639(VarCurr)|v1582(VarCurr).
% 94.70/94.12  0 [] v1638(VarCurr)| -v1639(VarCurr).
% 94.70/94.12  0 [] v1638(VarCurr)| -v1582(VarCurr).
% 94.70/94.12  0 [] -v1639(VarCurr)|v1640(VarCurr)|v1581(VarCurr).
% 94.70/94.12  0 [] v1639(VarCurr)| -v1640(VarCurr).
% 94.70/94.12  0 [] v1639(VarCurr)| -v1581(VarCurr).
% 94.70/94.12  0 [] -v1640(VarCurr)|v1566(VarCurr)|v1580(VarCurr).
% 94.70/94.12  0 [] v1640(VarCurr)| -v1566(VarCurr).
% 94.70/94.12  0 [] v1640(VarCurr)| -v1580(VarCurr).
% 94.70/94.12  0 [] -v1566(VarCurr)|v1567(VarCurr)|v1572(VarCurr).
% 94.70/94.12  0 [] v1566(VarCurr)| -v1567(VarCurr).
% 94.70/94.12  0 [] v1566(VarCurr)| -v1572(VarCurr).
% 94.70/94.12  0 [] v1208(VarCurr)| -v1627(VarCurr)|$F.
% 94.70/94.12  0 [] v1208(VarCurr)|v1627(VarCurr)| -$F.
% 94.70/94.12  0 [] -v1208(VarCurr)| -v1627(VarCurr)|v1628(VarCurr).
% 94.70/94.12  0 [] -v1208(VarCurr)|v1627(VarCurr)| -v1628(VarCurr).
% 94.70/94.12  0 [] v1629(VarCurr)| -v1628(VarCurr)|$T.
% 94.70/94.12  0 [] v1629(VarCurr)|v1628(VarCurr)| -$T.
% 94.70/94.12  0 [] -v1629(VarCurr)| -v1628(VarCurr)|$F.
% 94.70/94.12  0 [] -v1629(VarCurr)|v1628(VarCurr)| -$F.
% 94.70/94.12  0 [] -v1629(VarCurr)|v1630(VarCurr).
% 94.70/94.12  0 [] -v1629(VarCurr)|v1538(VarCurr).
% 94.70/94.12  0 [] v1629(VarCurr)| -v1630(VarCurr)| -v1538(VarCurr).
% 94.70/94.12  0 [] -v1630(VarCurr)|v1631(VarCurr)|v1632(VarCurr).
% 94.70/94.12  0 [] v1630(VarCurr)| -v1631(VarCurr).
% 94.70/94.12  0 [] v1630(VarCurr)| -v1632(VarCurr).
% 94.70/94.12  0 [] -v1632(VarCurr)| -v1497(VarCurr,bitIndex3)|$T.
% 94.70/94.12  0 [] -v1632(VarCurr)|v1497(VarCurr,bitIndex3)| -$T.
% 94.70/94.12  0 [] -v1632(VarCurr)| -v1497(VarCurr,bitIndex2)|$T.
% 94.70/94.12  0 [] -v1632(VarCurr)|v1497(VarCurr,bitIndex2)| -$T.
% 94.70/94.13  0 [] -v1632(VarCurr)| -v1497(VarCurr,bitIndex1)|$F.
% 94.70/94.13  0 [] -v1632(VarCurr)|v1497(VarCurr,bitIndex1)| -$F.
% 94.70/94.13  0 [] -v1632(VarCurr)| -v1497(VarCurr,bitIndex0)|$T.
% 94.70/94.13  0 [] -v1632(VarCurr)|v1497(VarCurr,bitIndex0)| -$T.
% 94.70/94.13  0 [] v1632(VarCurr)|v1497(VarCurr,bitIndex3)|$T|v1497(VarCurr,bitIndex2)|v1497(VarCurr,bitIndex1)|$F|v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1632(VarCurr)|v1497(VarCurr,bitIndex3)|$T|v1497(VarCurr,bitIndex2)| -v1497(VarCurr,bitIndex1)| -$F|v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1632(VarCurr)| -v1497(VarCurr,bitIndex3)| -$T| -v1497(VarCurr,bitIndex2)|v1497(VarCurr,bitIndex1)|$F| -v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1632(VarCurr)| -v1497(VarCurr,bitIndex3)| -$T| -v1497(VarCurr,bitIndex2)| -v1497(VarCurr,bitIndex1)| -$F| -v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] -v1631(VarCurr)| -v1497(VarCurr,bitIndex3)|$F.
% 94.70/94.13  0 [] -v1631(VarCurr)|v1497(VarCurr,bitIndex3)| -$F.
% 94.70/94.13  0 [] -v1631(VarCurr)| -v1497(VarCurr,bitIndex2)|$T.
% 94.70/94.13  0 [] -v1631(VarCurr)|v1497(VarCurr,bitIndex2)| -$T.
% 94.70/94.13  0 [] -v1631(VarCurr)| -v1497(VarCurr,bitIndex1)|$F.
% 94.70/94.13  0 [] -v1631(VarCurr)|v1497(VarCurr,bitIndex1)| -$F.
% 94.70/94.13  0 [] -v1631(VarCurr)| -v1497(VarCurr,bitIndex0)|$T.
% 94.70/94.13  0 [] -v1631(VarCurr)|v1497(VarCurr,bitIndex0)| -$T.
% 94.70/94.13  0 [] v1631(VarCurr)|v1497(VarCurr,bitIndex3)|$F|v1497(VarCurr,bitIndex2)|$T|v1497(VarCurr,bitIndex1)|v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1631(VarCurr)|v1497(VarCurr,bitIndex3)|$F| -v1497(VarCurr,bitIndex2)| -$T|v1497(VarCurr,bitIndex1)| -v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1631(VarCurr)| -v1497(VarCurr,bitIndex3)| -$F|v1497(VarCurr,bitIndex2)|$T| -v1497(VarCurr,bitIndex1)|v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1631(VarCurr)| -v1497(VarCurr,bitIndex3)| -$F| -v1497(VarCurr,bitIndex2)| -$T| -v1497(VarCurr,bitIndex1)| -v1497(VarCurr,bitIndex0).
% 94.70/94.13  0 [] -v1602(VarCurr)|v1603(VarCurr)|v1615(VarCurr).
% 94.70/94.13  0 [] v1602(VarCurr)| -v1603(VarCurr).
% 94.70/94.13  0 [] v1602(VarCurr)| -v1615(VarCurr).
% 94.70/94.13  0 [] v1615(VarCurr)|v1616(VarCurr).
% 94.70/94.13  0 [] -v1615(VarCurr)| -v1616(VarCurr).
% 94.70/94.13  0 [] -v1616(VarCurr)|v1617(VarCurr)|v1584(VarCurr).
% 94.70/94.13  0 [] v1616(VarCurr)| -v1617(VarCurr).
% 94.70/94.13  0 [] v1616(VarCurr)| -v1584(VarCurr).
% 94.70/94.13  0 [] -v1617(VarCurr)|v1618(VarCurr)|v1583(VarCurr).
% 94.70/94.13  0 [] v1617(VarCurr)| -v1618(VarCurr).
% 94.70/94.13  0 [] v1617(VarCurr)| -v1583(VarCurr).
% 94.70/94.13  0 [] -v1618(VarCurr)|v1619(VarCurr)|v1582(VarCurr).
% 94.70/94.13  0 [] v1618(VarCurr)| -v1619(VarCurr).
% 94.70/94.13  0 [] v1618(VarCurr)| -v1582(VarCurr).
% 94.70/94.13  0 [] -v1619(VarCurr)|v1620(VarCurr)|v1581(VarCurr).
% 94.70/94.13  0 [] v1619(VarCurr)| -v1620(VarCurr).
% 94.70/94.13  0 [] v1619(VarCurr)| -v1581(VarCurr).
% 94.70/94.13  0 [] -v1620(VarCurr)|v1621(VarCurr)|v1580(VarCurr).
% 94.70/94.13  0 [] v1620(VarCurr)| -v1621(VarCurr).
% 94.70/94.13  0 [] v1620(VarCurr)| -v1580(VarCurr).
% 94.70/94.13  0 [] -v1621(VarCurr)|v1622(VarCurr)|v1573(VarCurr).
% 94.70/94.13  0 [] v1621(VarCurr)| -v1622(VarCurr).
% 94.70/94.13  0 [] v1621(VarCurr)| -v1573(VarCurr).
% 94.70/94.13  0 [] -v1622(VarCurr)|v1623(VarCurr)|v1572(VarCurr).
% 94.70/94.13  0 [] v1622(VarCurr)| -v1623(VarCurr).
% 94.70/94.13  0 [] v1622(VarCurr)| -v1572(VarCurr).
% 94.70/94.13  0 [] -v1623(VarCurr)|v1624(VarCurr)|v1571(VarCurr).
% 94.70/94.13  0 [] v1623(VarCurr)| -v1624(VarCurr).
% 94.70/94.13  0 [] v1623(VarCurr)| -v1571(VarCurr).
% 94.70/94.13  0 [] -v1624(VarCurr)|v1625(VarCurr)|v1570(VarCurr).
% 94.70/94.13  0 [] v1624(VarCurr)| -v1625(VarCurr).
% 94.70/94.13  0 [] v1624(VarCurr)| -v1570(VarCurr).
% 94.70/94.13  0 [] -v1625(VarCurr)|v1563(VarCurr)|v1569(VarCurr).
% 94.70/94.13  0 [] v1625(VarCurr)| -v1563(VarCurr).
% 94.70/94.13  0 [] v1625(VarCurr)| -v1569(VarCurr).
% 94.70/94.13  0 [] -v1603(VarCurr)|v1604(VarCurr)|v1614(VarCurr).
% 94.70/94.13  0 [] v1603(VarCurr)| -v1604(VarCurr).
% 94.70/94.13  0 [] v1603(VarCurr)| -v1614(VarCurr).
% 94.70/94.13  0 [] -v1614(VarCurr)|v1586(VarCurr).
% 94.70/94.13  0 [] -v1614(VarCurr)|v1584(VarCurr).
% 94.70/94.13  0 [] v1614(VarCurr)| -v1586(VarCurr)| -v1584(VarCurr).
% 94.70/94.13  0 [] -v1604(VarCurr)|v1605(VarCurr)|v1583(VarCurr).
% 94.70/94.13  0 [] v1604(VarCurr)| -v1605(VarCurr).
% 94.70/94.13  0 [] v1604(VarCurr)| -v1583(VarCurr).
% 94.70/94.13  0 [] -v1605(VarCurr)|v1606(VarCurr)|v1582(VarCurr).
% 94.70/94.13  0 [] v1605(VarCurr)| -v1606(VarCurr).
% 94.70/94.13  0 [] v1605(VarCurr)| -v1582(VarCurr).
% 94.70/94.13  0 [] -v1606(VarCurr)|v1607(VarCurr)|v1581(VarCurr).
% 94.70/94.13  0 [] v1606(VarCurr)| -v1607(VarCurr).
% 94.70/94.13  0 [] v1606(VarCurr)| -v1581(VarCurr).
% 94.70/94.13  0 [] -v1607(VarCurr)|v1608(VarCurr)|v1580(VarCurr).
% 94.70/94.13  0 [] v1607(VarCurr)| -v1608(VarCurr).
% 94.70/94.13  0 [] v1607(VarCurr)| -v1580(VarCurr).
% 94.70/94.13  0 [] -v1608(VarCurr)|v1609(VarCurr)|v1613(VarCurr).
% 94.70/94.13  0 [] v1608(VarCurr)| -v1609(VarCurr).
% 94.70/94.13  0 [] v1608(VarCurr)| -v1613(VarCurr).
% 94.70/94.13  0 [] -v1613(VarCurr)|v1575(VarCurr).
% 94.70/94.13  0 [] -v1613(VarCurr)|v1573(VarCurr).
% 94.70/94.13  0 [] v1613(VarCurr)| -v1575(VarCurr)| -v1573(VarCurr).
% 94.70/94.13  0 [] -v1609(VarCurr)|v1610(VarCurr)|v1572(VarCurr).
% 94.70/94.13  0 [] v1609(VarCurr)| -v1610(VarCurr).
% 94.70/94.13  0 [] v1609(VarCurr)| -v1572(VarCurr).
% 94.70/94.13  0 [] -v1610(VarCurr)|v1611(VarCurr)|v1571(VarCurr).
% 94.70/94.13  0 [] v1610(VarCurr)| -v1611(VarCurr).
% 94.70/94.13  0 [] v1610(VarCurr)| -v1571(VarCurr).
% 94.70/94.13  0 [] -v1611(VarCurr)|v1612(VarCurr)|v1570(VarCurr).
% 94.70/94.13  0 [] v1611(VarCurr)| -v1612(VarCurr).
% 94.70/94.13  0 [] v1611(VarCurr)| -v1570(VarCurr).
% 94.70/94.13  0 [] -v1612(VarCurr)|v1563(VarCurr)|v1569(VarCurr).
% 94.70/94.13  0 [] v1612(VarCurr)| -v1563(VarCurr).
% 94.70/94.13  0 [] v1612(VarCurr)| -v1569(VarCurr).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)|v1589(VarNext)| -range_3_0(B)| -v1204(VarNext,B)|v1204(VarCurr,B).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)|v1589(VarNext)| -range_3_0(B)|v1204(VarNext,B)| -v1204(VarCurr,B).
% 94.70/94.13  0 [] -v1589(VarNext)| -range_3_0(B)| -v1204(VarNext,B)|v1597(VarNext,B).
% 94.70/94.13  0 [] -v1589(VarNext)| -range_3_0(B)|v1204(VarNext,B)| -v1597(VarNext,B).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v1597(VarNext,B)|v1595(VarCurr,B).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v1597(VarNext,B)| -v1595(VarCurr,B).
% 94.70/94.13  0 [] v1598(VarCurr)| -range_3_0(B)| -v1595(VarCurr,B)|v1206(VarCurr,B).
% 94.70/94.13  0 [] v1598(VarCurr)| -range_3_0(B)|v1595(VarCurr,B)| -v1206(VarCurr,B).
% 94.70/94.13  0 [] -v1598(VarCurr)| -range_3_0(B)| -v1595(VarCurr,B)|$F.
% 94.70/94.13  0 [] -v1598(VarCurr)| -range_3_0(B)|v1595(VarCurr,B)| -$F.
% 94.70/94.13  0 [] v1598(VarCurr)|v1198(VarCurr).
% 94.70/94.13  0 [] -v1598(VarCurr)| -v1198(VarCurr).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -v1589(VarNext)|v1590(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)|v1589(VarNext)| -v1590(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -v1590(VarNext)|v1591(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -v1590(VarNext)|v1540(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)|v1590(VarNext)| -v1591(VarNext)| -v1540(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)|v1591(VarNext)|v1549(VarNext).
% 94.70/94.13  0 [] -nextState(VarCurr,VarNext)| -v1591(VarNext)| -v1549(VarNext).
% 94.70/94.13  0 [] v1563(VarCurr)|v1565(VarCurr)|v1573(VarCurr)|v1576(VarCurr)|v1584(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|$F.
% 94.70/94.13  0 [] v1563(VarCurr)|v1565(VarCurr)|v1573(VarCurr)|v1576(VarCurr)|v1584(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -$F.
% 94.70/94.13  0 [] -v1584(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|v1585(VarCurr,B).
% 94.70/94.13  0 [] -v1584(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -v1585(VarCurr,B).
% 94.70/94.13  0 [] -v1576(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|$F.
% 94.70/94.13  0 [] -v1576(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -$F.
% 94.70/94.13  0 [] -v1573(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|v1574(VarCurr,B).
% 94.70/94.13  0 [] -v1573(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -v1574(VarCurr,B).
% 94.70/94.13  0 [] -v1565(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|$F.
% 94.70/94.13  0 [] -v1565(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -$F.
% 94.70/94.13  0 [] -v1563(VarCurr)| -range_3_0(B)| -v1206(VarCurr,B)|v1564(VarCurr,B).
% 94.70/94.13  0 [] -v1563(VarCurr)| -range_3_0(B)|v1206(VarCurr,B)| -v1564(VarCurr,B).
% 94.70/94.13  0 [] v1586(VarCurr)| -range_3_0(B)| -v1585(VarCurr,B)|$F.
% 94.70/94.13  0 [] v1586(VarCurr)| -range_3_0(B)|v1585(VarCurr,B)| -$F.
% 94.70/94.13  0 [] -v1586(VarCurr)| -range_3_0(B)| -v1585(VarCurr,B)|$F.
% 94.70/94.13  0 [] -v1586(VarCurr)| -range_3_0(B)|v1585(VarCurr,B)| -$F.
% 94.70/94.13  0 [] v1586(VarCurr)|v1536(VarCurr).
% 94.70/94.13  0 [] -v1586(VarCurr)| -v1536(VarCurr).
% 94.70/94.13  0 [] -v1584(VarCurr)| -v1204(VarCurr,bitIndex3)|$T.
% 94.70/94.13  0 [] -v1584(VarCurr)|v1204(VarCurr,bitIndex3)| -$T.
% 94.70/94.13  0 [] -v1584(VarCurr)| -v1204(VarCurr,bitIndex2)|$T.
% 94.70/94.13  0 [] -v1584(VarCurr)|v1204(VarCurr,bitIndex2)| -$T.
% 94.70/94.13  0 [] -v1584(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.70/94.13  0 [] -v1584(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.70/94.13  0 [] -v1584(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.70/94.13  0 [] -v1584(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.70/94.13  0 [] v1584(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$F|v1204(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1584(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$F|v1204(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1584(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$F| -v1204(VarCurr,bitIndex0).
% 94.70/94.13  0 [] v1584(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$F| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] b1101(bitIndex3).
% 94.70/94.14  0 [] b1101(bitIndex2).
% 94.70/94.14  0 [] -b1101(bitIndex1).
% 94.70/94.14  0 [] b1101(bitIndex0).
% 94.70/94.14  0 [] -v1576(VarCurr)|v1578(VarCurr)|v1583(VarCurr).
% 94.70/94.14  0 [] v1576(VarCurr)| -v1578(VarCurr).
% 94.70/94.14  0 [] v1576(VarCurr)| -v1583(VarCurr).
% 94.70/94.14  0 [] -v1583(VarCurr)| -v1204(VarCurr,bitIndex3)|$T.
% 94.70/94.14  0 [] -v1583(VarCurr)|v1204(VarCurr,bitIndex3)| -$T.
% 94.70/94.14  0 [] -v1583(VarCurr)| -v1204(VarCurr,bitIndex2)|$T.
% 94.70/94.14  0 [] -v1583(VarCurr)|v1204(VarCurr,bitIndex2)| -$T.
% 94.70/94.14  0 [] -v1583(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.70/94.14  0 [] -v1583(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.70/94.14  0 [] -v1583(VarCurr)| -v1204(VarCurr,bitIndex0)|$F.
% 94.70/94.14  0 [] -v1583(VarCurr)|v1204(VarCurr,bitIndex0)| -$F.
% 94.70/94.14  0 [] v1583(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$F|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1583(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$F| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1583(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$F|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1583(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$F| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] -v1578(VarCurr)|v1579(VarCurr)|v1582(VarCurr).
% 94.70/94.14  0 [] v1578(VarCurr)| -v1579(VarCurr).
% 94.70/94.14  0 [] v1578(VarCurr)| -v1582(VarCurr).
% 94.70/94.14  0 [] -v1582(VarCurr)| -v1204(VarCurr,bitIndex3)|$T.
% 94.70/94.14  0 [] -v1582(VarCurr)|v1204(VarCurr,bitIndex3)| -$T.
% 94.70/94.14  0 [] -v1582(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.70/94.14  0 [] -v1582(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.70/94.14  0 [] -v1582(VarCurr)| -v1204(VarCurr,bitIndex1)|$T.
% 94.70/94.14  0 [] -v1582(VarCurr)|v1204(VarCurr,bitIndex1)| -$T.
% 94.70/94.14  0 [] -v1582(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.70/94.14  0 [] -v1582(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.70/94.14  0 [] v1582(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)|$F|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1582(VarCurr)|v1204(VarCurr,bitIndex3)|$T| -v1204(VarCurr,bitIndex2)| -$F|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1582(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T|v1204(VarCurr,bitIndex2)|$F| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1582(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)| -$F| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] b1011(bitIndex3).
% 94.70/94.14  0 [] -b1011(bitIndex2).
% 94.70/94.14  0 [] b1011(bitIndex1).
% 94.70/94.14  0 [] b1011(bitIndex0).
% 94.70/94.14  0 [] -v1579(VarCurr)|v1580(VarCurr)|v1581(VarCurr).
% 94.70/94.14  0 [] v1579(VarCurr)| -v1580(VarCurr).
% 94.70/94.14  0 [] v1579(VarCurr)| -v1581(VarCurr).
% 94.70/94.14  0 [] -v1581(VarCurr)| -v1204(VarCurr,bitIndex3)|$T.
% 94.70/94.14  0 [] -v1581(VarCurr)|v1204(VarCurr,bitIndex3)| -$T.
% 94.70/94.14  0 [] -v1581(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.70/94.14  0 [] -v1581(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.70/94.14  0 [] -v1581(VarCurr)| -v1204(VarCurr,bitIndex1)|$T.
% 94.70/94.14  0 [] -v1581(VarCurr)|v1204(VarCurr,bitIndex1)| -$T.
% 94.70/94.14  0 [] -v1581(VarCurr)| -v1204(VarCurr,bitIndex0)|$F.
% 94.70/94.14  0 [] -v1581(VarCurr)|v1204(VarCurr,bitIndex0)| -$F.
% 94.70/94.14  0 [] v1581(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)|$F|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1581(VarCurr)|v1204(VarCurr,bitIndex3)|$T| -v1204(VarCurr,bitIndex2)| -$F|v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1581(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T|v1204(VarCurr,bitIndex2)|$F| -v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] v1581(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)| -$F| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.70/94.14  0 [] b1010(bitIndex3).
% 94.70/94.14  0 [] -b1010(bitIndex2).
% 94.70/94.14  0 [] b1010(bitIndex1).
% 94.70/94.14  0 [] -b1010(bitIndex0).
% 94.70/94.14  0 [] -v1580(VarCurr)| -v1204(VarCurr,bitIndex3)|$T.
% 94.70/94.14  0 [] -v1580(VarCurr)|v1204(VarCurr,bitIndex3)| -$T.
% 94.70/94.14  0 [] -v1580(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.70/94.14  0 [] -v1580(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.70/94.14  0 [] -v1580(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.70/94.14  0 [] -v1580(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.70/94.14  0 [] -v1580(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.70/94.14  0 [] -v1580(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.70/94.14  0 [] v1580(VarCurr)|v1204(VarCurr,bitIndex3)|$T|v1204(VarCurr,bitIndex2)|$F|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1580(VarCurr)|v1204(VarCurr,bitIndex3)|$T| -v1204(VarCurr,bitIndex2)| -$F| -v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1580(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T|v1204(VarCurr,bitIndex2)|$F|v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1580(VarCurr)| -v1204(VarCurr,bitIndex3)| -$T| -v1204(VarCurr,bitIndex2)| -$F| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] b1001(bitIndex3).
% 94.80/94.14  0 [] -b1001(bitIndex2).
% 94.80/94.14  0 [] -b1001(bitIndex1).
% 94.80/94.14  0 [] b1001(bitIndex0).
% 94.80/94.14  0 [] v1575(VarCurr)| -range_3_0(B)| -v1574(VarCurr,B)|$F.
% 94.80/94.14  0 [] v1575(VarCurr)| -range_3_0(B)|v1574(VarCurr,B)| -$F.
% 94.80/94.14  0 [] -v1575(VarCurr)| -range_3_0(B)| -v1574(VarCurr,B)|$F.
% 94.80/94.14  0 [] -v1575(VarCurr)| -range_3_0(B)|v1574(VarCurr,B)| -$F.
% 94.80/94.14  0 [] v1575(VarCurr)|v1536(VarCurr).
% 94.80/94.14  0 [] -v1575(VarCurr)| -v1536(VarCurr).
% 94.80/94.14  0 [] -v1573(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.14  0 [] -v1573(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.14  0 [] -v1573(VarCurr)| -v1204(VarCurr,bitIndex2)|$T.
% 94.80/94.14  0 [] -v1573(VarCurr)|v1204(VarCurr,bitIndex2)| -$T.
% 94.80/94.14  0 [] -v1573(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.80/94.14  0 [] -v1573(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.80/94.14  0 [] -v1573(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.80/94.14  0 [] -v1573(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.80/94.14  0 [] v1573(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|$T|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1573(VarCurr)|v1204(VarCurr,bitIndex3)|$F| -v1204(VarCurr,bitIndex2)| -$T|v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1573(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F|v1204(VarCurr,bitIndex2)|$T| -v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1573(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -$T| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] -v1565(VarCurr)|v1567(VarCurr)|v1572(VarCurr).
% 94.80/94.14  0 [] v1565(VarCurr)| -v1567(VarCurr).
% 94.80/94.14  0 [] v1565(VarCurr)| -v1572(VarCurr).
% 94.80/94.14  0 [] -v1572(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.14  0 [] -v1572(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.14  0 [] -v1572(VarCurr)| -v1204(VarCurr,bitIndex2)|$T.
% 94.80/94.14  0 [] -v1572(VarCurr)|v1204(VarCurr,bitIndex2)| -$T.
% 94.80/94.14  0 [] -v1572(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.80/94.14  0 [] -v1572(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.80/94.14  0 [] -v1572(VarCurr)| -v1204(VarCurr,bitIndex0)|$F.
% 94.80/94.14  0 [] -v1572(VarCurr)|v1204(VarCurr,bitIndex0)| -$F.
% 94.80/94.14  0 [] v1572(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|$T|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1572(VarCurr)|v1204(VarCurr,bitIndex3)|$F| -v1204(VarCurr,bitIndex2)| -$T|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1572(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F|v1204(VarCurr,bitIndex2)|$T| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1572(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -$T| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] -b0100(bitIndex3).
% 94.80/94.14  0 [] b0100(bitIndex2).
% 94.80/94.14  0 [] -b0100(bitIndex1).
% 94.80/94.14  0 [] -b0100(bitIndex0).
% 94.80/94.14  0 [] -v1567(VarCurr)|v1568(VarCurr)|v1571(VarCurr).
% 94.80/94.14  0 [] v1567(VarCurr)| -v1568(VarCurr).
% 94.80/94.14  0 [] v1567(VarCurr)| -v1571(VarCurr).
% 94.80/94.14  0 [] -v1571(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.14  0 [] -v1571(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.14  0 [] -v1571(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.80/94.14  0 [] -v1571(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.80/94.14  0 [] -v1571(VarCurr)| -v1204(VarCurr,bitIndex1)|$T.
% 94.80/94.14  0 [] -v1571(VarCurr)|v1204(VarCurr,bitIndex1)| -$T.
% 94.80/94.14  0 [] -v1571(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.80/94.14  0 [] -v1571(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.80/94.14  0 [] v1571(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$T|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1571(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$T| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1571(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$T|v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] v1571(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$T| -v1204(VarCurr,bitIndex0).
% 94.80/94.14  0 [] -v1568(VarCurr)|v1569(VarCurr)|v1570(VarCurr).
% 94.80/94.14  0 [] v1568(VarCurr)| -v1569(VarCurr).
% 94.80/94.15  0 [] v1568(VarCurr)| -v1570(VarCurr).
% 94.80/94.15  0 [] -v1570(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.15  0 [] -v1570(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.15  0 [] -v1570(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.80/94.15  0 [] -v1570(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.80/94.15  0 [] -v1570(VarCurr)| -v1204(VarCurr,bitIndex1)|$T.
% 94.80/94.15  0 [] -v1570(VarCurr)|v1204(VarCurr,bitIndex1)| -$T.
% 94.80/94.15  0 [] -v1570(VarCurr)| -v1204(VarCurr,bitIndex0)|$F.
% 94.80/94.15  0 [] -v1570(VarCurr)|v1204(VarCurr,bitIndex0)| -$F.
% 94.80/94.15  0 [] v1570(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$T|v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] v1570(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$T|v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] v1570(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|$T| -v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] v1570(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -$T| -v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] -b0010(bitIndex3).
% 94.80/94.15  0 [] -b0010(bitIndex2).
% 94.80/94.15  0 [] b0010(bitIndex1).
% 94.80/94.15  0 [] -b0010(bitIndex0).
% 94.80/94.15  0 [] -v1569(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.15  0 [] -v1569(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.15  0 [] -v1569(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.80/94.15  0 [] -v1569(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.80/94.15  0 [] -v1569(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.80/94.15  0 [] -v1569(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.80/94.15  0 [] -v1569(VarCurr)| -v1204(VarCurr,bitIndex0)|$T.
% 94.80/94.15  0 [] -v1569(VarCurr)|v1204(VarCurr,bitIndex0)| -$T.
% 94.80/94.15  0 [] v1569(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0)|$T.
% 94.80/94.15  0 [] v1569(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0)| -$T.
% 94.80/94.15  0 [] v1569(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0)|$T.
% 94.80/94.15  0 [] v1569(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0)| -$T.
% 94.80/94.15  0 [] v1208(VarCurr)| -range_3_0(B)| -v1564(VarCurr,B)|$F.
% 94.80/94.15  0 [] v1208(VarCurr)| -range_3_0(B)|v1564(VarCurr,B)| -$F.
% 94.80/94.15  0 [] -v1208(VarCurr)| -range_3_0(B)| -v1564(VarCurr,B)|v1497(VarCurr,B).
% 94.80/94.15  0 [] -v1208(VarCurr)| -range_3_0(B)|v1564(VarCurr,B)| -v1497(VarCurr,B).
% 94.80/94.15  0 [] -v1563(VarCurr)| -v1204(VarCurr,bitIndex3)|$F.
% 94.80/94.15  0 [] -v1563(VarCurr)|v1204(VarCurr,bitIndex3)| -$F.
% 94.80/94.15  0 [] -v1563(VarCurr)| -v1204(VarCurr,bitIndex2)|$F.
% 94.80/94.15  0 [] -v1563(VarCurr)|v1204(VarCurr,bitIndex2)| -$F.
% 94.80/94.15  0 [] -v1563(VarCurr)| -v1204(VarCurr,bitIndex1)|$F.
% 94.80/94.15  0 [] -v1563(VarCurr)|v1204(VarCurr,bitIndex1)| -$F.
% 94.80/94.15  0 [] -v1563(VarCurr)| -v1204(VarCurr,bitIndex0)|$F.
% 94.80/94.15  0 [] -v1563(VarCurr)|v1204(VarCurr,bitIndex0)| -$F.
% 94.80/94.15  0 [] v1563(VarCurr)|v1204(VarCurr,bitIndex3)|$F|v1204(VarCurr,bitIndex2)|v1204(VarCurr,bitIndex1)|v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] v1563(VarCurr)| -v1204(VarCurr,bitIndex3)| -$F| -v1204(VarCurr,bitIndex2)| -v1204(VarCurr,bitIndex1)| -v1204(VarCurr,bitIndex0).
% 94.80/94.15  0 [] -range_3_0(B)| -v1204(constB0,B)|$F.
% 94.80/94.15  0 [] -range_3_0(B)|v1204(constB0,B)| -$F.
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1545(VarNext)| -v1536(VarNext)|v1536(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1545(VarNext)|v1536(VarNext)| -v1536(VarCurr).
% 94.80/94.15  0 [] -v1545(VarNext)| -v1536(VarNext)|v1555(VarNext).
% 94.80/94.15  0 [] -v1545(VarNext)|v1536(VarNext)| -v1555(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1555(VarNext)|v1553(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1555(VarNext)| -v1553(VarCurr).
% 94.80/94.15  0 [] v1556(VarCurr)| -v1553(VarCurr)|v1538(VarCurr).
% 94.80/94.15  0 [] v1556(VarCurr)|v1553(VarCurr)| -v1538(VarCurr).
% 94.80/94.15  0 [] -v1556(VarCurr)| -v1553(VarCurr)|$F.
% 94.80/94.15  0 [] -v1556(VarCurr)|v1553(VarCurr)| -$F.
% 94.80/94.15  0 [] v1556(VarCurr)|v1198(VarCurr).
% 94.80/94.15  0 [] -v1556(VarCurr)| -v1198(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1545(VarNext)|v1546(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1545(VarNext)| -v1546(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1546(VarNext)|v1547(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1546(VarNext)|v1540(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1546(VarNext)| -v1547(VarNext)| -v1540(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1547(VarNext)|v1549(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1547(VarNext)| -v1549(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1549(VarNext)|v1540(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1549(VarNext)| -v1540(VarCurr).
% 94.80/94.15  0 [] -v1536(constB0)|$F.
% 94.80/94.15  0 [] v1536(constB0)| -$F.
% 94.80/94.15  0 [] -v1540(VarCurr)|v1542(VarCurr).
% 94.80/94.15  0 [] v1540(VarCurr)| -v1542(VarCurr).
% 94.80/94.15  0 [] -v1542(VarCurr)|v1(VarCurr).
% 94.80/94.15  0 [] v1542(VarCurr)| -v1(VarCurr).
% 94.80/94.15  0 [] -v1538(VarCurr)|$F.
% 94.80/94.15  0 [] v1538(VarCurr)| -$F.
% 94.80/94.15  0 [] -range_3_0(B)| -v1497(VarCurr,B)|v1499(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)|v1497(VarCurr,B)| -v1499(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)| -v1499(VarCurr,B)|v1501(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)|v1499(VarCurr,B)| -v1501(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)| -v1501(VarCurr,B)|v1503(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)|v1501(VarCurr,B)| -v1503(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)| -v1503(VarCurr,B)|v1505(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)|v1503(VarCurr,B)| -v1505(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)| -v1505(VarCurr,B)|v1507(VarCurr,B).
% 94.80/94.15  0 [] -range_3_0(B)|v1505(VarCurr,B)| -v1507(VarCurr,B).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1512(VarNext)| -range_3_0(B)| -v1507(VarNext,B)|v1507(VarCurr,B).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1512(VarNext)| -range_3_0(B)|v1507(VarNext,B)| -v1507(VarCurr,B).
% 94.80/94.15  0 [] -v1512(VarNext)| -range_3_0(B)| -v1507(VarNext,B)|v1529(VarNext,B).
% 94.80/94.15  0 [] -v1512(VarNext)| -range_3_0(B)|v1507(VarNext,B)| -v1529(VarNext,B).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v1529(VarNext,B)|v1527(VarCurr,B).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v1529(VarNext,B)| -v1527(VarCurr,B).
% 94.80/94.15  0 [] v1521(VarCurr)| -range_3_0(B)| -v1527(VarCurr,B)|v1530(VarCurr,B).
% 94.80/94.15  0 [] v1521(VarCurr)| -range_3_0(B)|v1527(VarCurr,B)| -v1530(VarCurr,B).
% 94.80/94.15  0 [] -v1521(VarCurr)| -range_3_0(B)| -v1527(VarCurr,B)|$F.
% 94.80/94.15  0 [] -v1521(VarCurr)| -range_3_0(B)|v1527(VarCurr,B)| -$F.
% 94.80/94.15  0 [] v1222(VarCurr,bitIndex3)| -range_3_0(B)| -v1530(VarCurr,B)|b0011(B).
% 94.80/94.15  0 [] v1222(VarCurr,bitIndex3)| -range_3_0(B)|v1530(VarCurr,B)| -b0011(B).
% 94.80/94.15  0 [] -b0011(bitIndex3).
% 94.80/94.15  0 [] -b0011(bitIndex2).
% 94.80/94.15  0 [] b0011(bitIndex1).
% 94.80/94.15  0 [] b0011(bitIndex0).
% 94.80/94.15  0 [] -v1222(VarCurr,bitIndex3)| -range_3_0(B)| -v1530(VarCurr,B)|b1100(B).
% 94.80/94.15  0 [] -v1222(VarCurr,bitIndex3)| -range_3_0(B)|v1530(VarCurr,B)| -b1100(B).
% 94.80/94.15  0 [] b1100(bitIndex3).
% 94.80/94.15  0 [] b1100(bitIndex2).
% 94.80/94.15  0 [] -b1100(bitIndex1).
% 94.80/94.15  0 [] -b1100(bitIndex0).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1512(VarNext)|v1513(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1512(VarNext)|v1520(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1512(VarNext)| -v1513(VarNext)| -v1520(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1520(VarNext)|v1518(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1520(VarNext)| -v1518(VarCurr).
% 94.80/94.15  0 [] -v1518(VarCurr)|v1521(VarCurr)|v1522(VarCurr).
% 94.80/94.15  0 [] v1518(VarCurr)| -v1521(VarCurr).
% 94.80/94.15  0 [] v1518(VarCurr)| -v1522(VarCurr).
% 94.80/94.15  0 [] -v1522(VarCurr)|v1523(VarCurr).
% 94.80/94.15  0 [] -v1522(VarCurr)|v1526(VarCurr).
% 94.80/94.15  0 [] v1522(VarCurr)| -v1523(VarCurr)| -v1526(VarCurr).
% 94.80/94.15  0 [] v1526(VarCurr)|v1521(VarCurr).
% 94.80/94.15  0 [] -v1526(VarCurr)| -v1521(VarCurr).
% 94.80/94.15  0 [] -v1523(VarCurr)|v1222(VarCurr,bitIndex3)|v1524(VarCurr).
% 94.80/94.15  0 [] v1523(VarCurr)| -v1222(VarCurr,bitIndex3).
% 94.80/94.15  0 [] v1523(VarCurr)| -v1524(VarCurr).
% 94.80/94.15  0 [] -v1524(VarCurr)|v1222(VarCurr,bitIndex1).
% 94.80/94.15  0 [] -v1524(VarCurr)|v1525(VarCurr).
% 94.80/94.15  0 [] v1524(VarCurr)| -v1222(VarCurr,bitIndex1)| -v1525(VarCurr).
% 94.80/94.15  0 [] v1525(VarCurr)|v1222(VarCurr,bitIndex3).
% 94.80/94.15  0 [] -v1525(VarCurr)| -v1222(VarCurr,bitIndex3).
% 94.80/94.15  0 [] v1521(VarCurr)|v1220(VarCurr).
% 94.80/94.15  0 [] -v1521(VarCurr)| -v1220(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1513(VarNext)|v1514(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1513(VarNext)|v1402(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1513(VarNext)| -v1514(VarNext)| -v1402(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1514(VarNext)|v1409(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1514(VarNext)| -v1409(VarNext).
% 94.80/94.15  0 [] -v1208(VarCurr)|v1210(VarCurr).
% 94.80/94.15  0 [] v1208(VarCurr)| -v1210(VarCurr).
% 94.80/94.15  0 [] -v1210(VarCurr)|v1212(VarCurr).
% 94.80/94.15  0 [] v1210(VarCurr)| -v1212(VarCurr).
% 94.80/94.15  0 [] -v1212(VarCurr)|v1214(VarCurr).
% 94.80/94.15  0 [] v1212(VarCurr)| -v1214(VarCurr).
% 94.80/94.15  0 [] -v1214(VarCurr)|v1216(VarCurr).
% 94.80/94.15  0 [] v1214(VarCurr)| -v1216(VarCurr).
% 94.80/94.15  0 [] -v1216(VarCurr)|v1218(VarCurr).
% 94.80/94.15  0 [] v1216(VarCurr)| -v1218(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1482(VarNext)| -v1218(VarNext)|v1218(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1482(VarNext)|v1218(VarNext)| -v1218(VarCurr).
% 94.80/94.15  0 [] -v1482(VarNext)| -v1218(VarNext)|v1490(VarNext).
% 94.80/94.15  0 [] -v1482(VarNext)|v1218(VarNext)| -v1490(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1490(VarNext)|v1488(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1490(VarNext)| -v1488(VarCurr).
% 94.80/94.15  0 [] v1491(VarCurr)| -v1488(VarCurr)|v1492(VarCurr).
% 94.80/94.15  0 [] v1491(VarCurr)|v1488(VarCurr)| -v1492(VarCurr).
% 94.80/94.15  0 [] -v1491(VarCurr)| -v1488(VarCurr)|$F.
% 94.80/94.15  0 [] -v1491(VarCurr)|v1488(VarCurr)| -$F.
% 94.80/94.15  0 [] v1493(VarCurr)| -v1492(VarCurr)|$F.
% 94.80/94.15  0 [] v1493(VarCurr)|v1492(VarCurr)| -$F.
% 94.80/94.15  0 [] -v1493(VarCurr)| -v1492(VarCurr)|$T.
% 94.80/94.15  0 [] -v1493(VarCurr)|v1492(VarCurr)| -$T.
% 94.80/94.15  0 [] v1493(VarCurr)|v1222(VarCurr,bitIndex0).
% 94.80/94.15  0 [] -v1493(VarCurr)| -v1222(VarCurr,bitIndex0).
% 94.80/94.15  0 [] v1491(VarCurr)|v1220(VarCurr).
% 94.80/94.15  0 [] -v1491(VarCurr)| -v1220(VarCurr).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1482(VarNext)|v1483(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1482(VarNext)| -v1483(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1483(VarNext)|v1484(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1483(VarNext)|v1402(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1483(VarNext)| -v1484(VarNext)| -v1402(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1484(VarNext)|v1409(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1484(VarNext)| -v1409(VarNext).
% 94.80/94.15  0 [] v1470(VarCurr)| -v1222(VarCurr,bitIndex0)|$F.
% 94.80/94.15  0 [] v1470(VarCurr)|v1222(VarCurr,bitIndex0)| -$F.
% 94.80/94.15  0 [] -v1470(VarCurr)| -v1222(VarCurr,bitIndex0)|$T.
% 94.80/94.15  0 [] -v1470(VarCurr)|v1222(VarCurr,bitIndex0)| -$T.
% 94.80/94.15  0 [] -v1470(VarCurr)|v1471(VarCurr)|v1478(VarCurr).
% 94.80/94.15  0 [] v1470(VarCurr)| -v1471(VarCurr).
% 94.80/94.15  0 [] v1470(VarCurr)| -v1478(VarCurr).
% 94.80/94.15  0 [] -v1478(VarCurr)|v1479(VarCurr).
% 94.80/94.15  0 [] -v1478(VarCurr)|v1400(VarCurr).
% 94.80/94.15  0 [] v1478(VarCurr)| -v1479(VarCurr)| -v1400(VarCurr).
% 94.80/94.15  0 [] -v1479(VarCurr)|v1474(VarCurr).
% 94.80/94.15  0 [] -v1479(VarCurr)|v1186(VarCurr).
% 94.80/94.15  0 [] v1479(VarCurr)| -v1474(VarCurr)| -v1186(VarCurr).
% 94.80/94.15  0 [] -v1471(VarCurr)|v1472(VarCurr)|v1475(VarCurr).
% 94.80/94.15  0 [] v1471(VarCurr)| -v1472(VarCurr).
% 94.80/94.15  0 [] v1471(VarCurr)| -v1475(VarCurr).
% 94.80/94.15  0 [] -v1475(VarCurr)|v1476(VarCurr).
% 94.80/94.15  0 [] -v1475(VarCurr)|v1397(VarCurr).
% 94.80/94.15  0 [] v1475(VarCurr)| -v1476(VarCurr)| -v1397(VarCurr).
% 94.80/94.15  0 [] -v1476(VarCurr)|v1474(VarCurr).
% 94.80/94.15  0 [] -v1476(VarCurr)|v1186(VarCurr).
% 94.80/94.15  0 [] v1476(VarCurr)| -v1474(VarCurr)| -v1186(VarCurr).
% 94.80/94.15  0 [] v1474(VarCurr)|v1224(VarCurr).
% 94.80/94.15  0 [] -v1474(VarCurr)| -v1224(VarCurr).
% 94.80/94.15  0 [] -v1472(VarCurr)|v1473(VarCurr).
% 94.80/94.15  0 [] -v1472(VarCurr)|v1391(VarCurr).
% 94.80/94.15  0 [] v1472(VarCurr)| -v1473(VarCurr)| -v1391(VarCurr).
% 94.80/94.15  0 [] v1473(VarCurr)|v1224(VarCurr).
% 94.80/94.15  0 [] -v1473(VarCurr)| -v1224(VarCurr).
% 94.80/94.15  0 [] -v1377(VarNext,bitIndex2)|v1462(VarNext,bitIndex1).
% 94.80/94.15  0 [] v1377(VarNext,bitIndex2)| -v1462(VarNext,bitIndex1).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)| -v1462(VarNext,bitIndex3)|v1377(VarCurr,bitIndex4).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)|v1462(VarNext,bitIndex3)| -v1377(VarCurr,bitIndex4).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)| -v1462(VarNext,bitIndex2)|v1377(VarCurr,bitIndex3).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)|v1462(VarNext,bitIndex2)| -v1377(VarCurr,bitIndex3).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)| -v1462(VarNext,bitIndex1)|v1377(VarCurr,bitIndex2).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)|v1462(VarNext,bitIndex1)| -v1377(VarCurr,bitIndex2).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)| -v1462(VarNext,bitIndex0)|v1377(VarCurr,bitIndex1).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)|v1462(VarNext,bitIndex0)| -v1377(VarCurr,bitIndex1).
% 94.80/94.15  0 [] -v1463(VarNext)| -range_3_0(B)| -v1462(VarNext,B)|v1415(VarNext,B).
% 94.80/94.15  0 [] -v1463(VarNext)| -range_3_0(B)|v1462(VarNext,B)| -v1415(VarNext,B).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1463(VarNext)|v1464(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1463(VarNext)| -v1464(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1464(VarNext)|v1466(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)| -v1464(VarNext)|v1402(VarNext).
% 94.80/94.15  0 [] -nextState(VarCurr,VarNext)|v1464(VarNext)| -v1466(VarNext)| -v1402(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1466(VarNext)|v1409(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1466(VarNext)| -v1409(VarNext).
% 94.80/94.16  0 [] v1457(VarCurr)| -v1222(VarCurr,bitIndex2)|$F.
% 94.80/94.16  0 [] v1457(VarCurr)|v1222(VarCurr,bitIndex2)| -$F.
% 94.80/94.16  0 [] -v1457(VarCurr)| -v1222(VarCurr,bitIndex2)|$T.
% 94.80/94.16  0 [] -v1457(VarCurr)|v1222(VarCurr,bitIndex2)| -$T.
% 94.80/94.16  0 [] -v1457(VarCurr)|v1458(VarCurr)|v1459(VarCurr).
% 94.80/94.16  0 [] v1457(VarCurr)| -v1458(VarCurr).
% 94.80/94.16  0 [] v1457(VarCurr)| -v1459(VarCurr).
% 94.80/94.16  0 [] -v1459(VarCurr)|v1460(VarCurr).
% 94.80/94.16  0 [] -v1459(VarCurr)|v1397(VarCurr).
% 94.80/94.16  0 [] v1459(VarCurr)| -v1460(VarCurr)| -v1397(VarCurr).
% 94.80/94.16  0 [] v1460(VarCurr)|v1186(VarCurr).
% 94.80/94.16  0 [] -v1460(VarCurr)| -v1186(VarCurr).
% 94.80/94.16  0 [] -v1458(VarCurr)| -$T|v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1458(VarCurr)|$T| -v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1458(VarCurr)|$T|v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1458(VarCurr)| -$T| -v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1377(VarNext,bitIndex1)|v1449(VarNext,bitIndex0).
% 94.80/94.16  0 [] v1377(VarNext,bitIndex1)| -v1449(VarNext,bitIndex0).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)| -v1449(VarNext,bitIndex3)|v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)|v1449(VarNext,bitIndex3)| -v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)| -v1449(VarNext,bitIndex2)|v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)|v1449(VarNext,bitIndex2)| -v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)| -v1449(VarNext,bitIndex1)|v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)|v1449(VarNext,bitIndex1)| -v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)| -v1449(VarNext,bitIndex0)|v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)|v1449(VarNext,bitIndex0)| -v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1450(VarNext)| -range_3_0(B)| -v1449(VarNext,B)|v1415(VarNext,B).
% 94.80/94.16  0 [] -v1450(VarNext)| -range_3_0(B)|v1449(VarNext,B)| -v1415(VarNext,B).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1450(VarNext)|v1451(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1450(VarNext)| -v1451(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1451(VarNext)|v1453(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1451(VarNext)|v1402(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1451(VarNext)| -v1453(VarNext)| -v1402(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1453(VarNext)|v1409(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1453(VarNext)| -v1409(VarNext).
% 94.80/94.16  0 [] v1435(VarCurr)| -v1222(VarCurr,bitIndex1)|$F.
% 94.80/94.16  0 [] v1435(VarCurr)|v1222(VarCurr,bitIndex1)| -$F.
% 94.80/94.16  0 [] -v1435(VarCurr)| -v1222(VarCurr,bitIndex1)|$T.
% 94.80/94.16  0 [] -v1435(VarCurr)|v1222(VarCurr,bitIndex1)| -$T.
% 94.80/94.16  0 [] -v1435(VarCurr)|v1436(VarCurr)|v1446(VarCurr).
% 94.80/94.16  0 [] v1435(VarCurr)| -v1436(VarCurr).
% 94.80/94.16  0 [] v1435(VarCurr)| -v1446(VarCurr).
% 94.80/94.16  0 [] -v1446(VarCurr)|v1447(VarCurr).
% 94.80/94.16  0 [] -v1446(VarCurr)|v1400(VarCurr).
% 94.80/94.16  0 [] v1446(VarCurr)| -v1447(VarCurr)| -v1400(VarCurr).
% 94.80/94.16  0 [] -v1447(VarCurr)|v1445(VarCurr).
% 94.80/94.16  0 [] -v1447(VarCurr)|v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1447(VarCurr)| -v1445(VarCurr)| -v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1436(VarCurr)|v1437(VarCurr)|v1442(VarCurr).
% 94.80/94.16  0 [] v1436(VarCurr)| -v1437(VarCurr).
% 94.80/94.16  0 [] v1436(VarCurr)| -v1442(VarCurr).
% 94.80/94.16  0 [] -v1442(VarCurr)|v1443(VarCurr).
% 94.80/94.16  0 [] -v1442(VarCurr)|v1397(VarCurr).
% 94.80/94.16  0 [] v1442(VarCurr)| -v1443(VarCurr)| -v1397(VarCurr).
% 94.80/94.16  0 [] -v1443(VarCurr)|v1445(VarCurr).
% 94.80/94.16  0 [] -v1443(VarCurr)|v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1443(VarCurr)| -v1445(VarCurr)| -v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1445(VarCurr)|v1396(VarCurr).
% 94.80/94.16  0 [] -v1445(VarCurr)|v1441(VarCurr).
% 94.80/94.16  0 [] v1445(VarCurr)| -v1396(VarCurr)| -v1441(VarCurr).
% 94.80/94.16  0 [] -v1437(VarCurr)|v1438(VarCurr).
% 94.80/94.16  0 [] -v1437(VarCurr)|v1391(VarCurr).
% 94.80/94.16  0 [] v1437(VarCurr)| -v1438(VarCurr)| -v1391(VarCurr).
% 94.80/94.16  0 [] -v1438(VarCurr)|v1440(VarCurr).
% 94.80/94.16  0 [] -v1438(VarCurr)|v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1438(VarCurr)| -v1440(VarCurr)| -v1368(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1440(VarCurr)|v1224(VarCurr).
% 94.80/94.16  0 [] -v1440(VarCurr)|v1441(VarCurr).
% 94.80/94.16  0 [] v1440(VarCurr)| -v1224(VarCurr)| -v1441(VarCurr).
% 94.80/94.16  0 [] v1441(VarCurr)|v1368(VarCurr,bitIndex0).
% 94.80/94.16  0 [] -v1441(VarCurr)| -v1368(VarCurr,bitIndex0).
% 94.80/94.16  0 [] -v1377(VarNext,bitIndex4)|v1427(VarNext,bitIndex3).
% 94.80/94.16  0 [] v1377(VarNext,bitIndex4)| -v1427(VarNext,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)| -v1427(VarNext,bitIndex3)|v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)|v1427(VarNext,bitIndex3)| -v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)| -v1427(VarNext,bitIndex2)|v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)|v1427(VarNext,bitIndex2)| -v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)| -v1427(VarNext,bitIndex1)|v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)|v1427(VarNext,bitIndex1)| -v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)| -v1427(VarNext,bitIndex0)|v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)|v1427(VarNext,bitIndex0)| -v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1428(VarNext)| -range_3_0(B)| -v1427(VarNext,B)|v1415(VarNext,B).
% 94.80/94.16  0 [] -v1428(VarNext)| -range_3_0(B)|v1427(VarNext,B)| -v1415(VarNext,B).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1428(VarNext)|v1429(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1428(VarNext)| -v1429(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1429(VarNext)|v1431(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1429(VarNext)|v1402(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1429(VarNext)| -v1431(VarNext)| -v1402(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1431(VarNext)|v1409(VarNext).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -v1431(VarNext)| -v1409(VarNext).
% 94.80/94.16  0 [] v1421(VarCurr)| -v1222(VarCurr,bitIndex4)|$F.
% 94.80/94.16  0 [] v1421(VarCurr)|v1222(VarCurr,bitIndex4)| -$F.
% 94.80/94.16  0 [] -v1421(VarCurr)| -v1222(VarCurr,bitIndex4)|$T.
% 94.80/94.16  0 [] -v1421(VarCurr)|v1222(VarCurr,bitIndex4)| -$T.
% 94.80/94.16  0 [] -v1421(VarCurr)|v1422(VarCurr)|v1423(VarCurr).
% 94.80/94.16  0 [] v1421(VarCurr)| -v1422(VarCurr).
% 94.80/94.16  0 [] v1421(VarCurr)| -v1423(VarCurr).
% 94.80/94.16  0 [] -v1423(VarCurr)|v1424(VarCurr).
% 94.80/94.16  0 [] -v1423(VarCurr)|v1400(VarCurr).
% 94.80/94.16  0 [] v1423(VarCurr)| -v1424(VarCurr)| -v1400(VarCurr).
% 94.80/94.16  0 [] v1424(VarCurr)|v1186(VarCurr).
% 94.80/94.16  0 [] -v1424(VarCurr)| -v1186(VarCurr).
% 94.80/94.16  0 [] -v1422(VarCurr)| -$T|v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -v1422(VarCurr)|$T| -v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] v1422(VarCurr)|$T|v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] v1422(VarCurr)| -$T| -v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -v1377(VarNext,bitIndex3)|v1404(VarNext,bitIndex2).
% 94.80/94.16  0 [] v1377(VarNext,bitIndex3)| -v1404(VarNext,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)| -v1404(VarNext,bitIndex3)|v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)|v1404(VarNext,bitIndex3)| -v1377(VarCurr,bitIndex4).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)| -v1404(VarNext,bitIndex2)|v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)|v1404(VarNext,bitIndex2)| -v1377(VarCurr,bitIndex3).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)| -v1404(VarNext,bitIndex1)|v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)|v1404(VarNext,bitIndex1)| -v1377(VarCurr,bitIndex2).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)| -v1404(VarNext,bitIndex0)|v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)|v1404(VarNext,bitIndex0)| -v1377(VarCurr,bitIndex1).
% 94.80/94.16  0 [] -v1405(VarNext)| -range_3_0(B)| -v1404(VarNext,B)|v1415(VarNext,B).
% 94.80/94.16  0 [] -v1405(VarNext)| -range_3_0(B)|v1404(VarNext,B)| -v1415(VarNext,B).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v1415(VarNext,B)|v1413(VarCurr,B).
% 94.80/94.16  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v1415(VarNext,B)| -v1413(VarCurr,B).
% 94.80/94.16  0 [] v1416(VarCurr)| -v1413(VarCurr,bitIndex3)|v1222(VarCurr,bitIndex4).
% 94.80/94.16  0 [] v1416(VarCurr)|v1413(VarCurr,bitIndex3)| -v1222(VarCurr,bitIndex4).
% 94.80/94.16  0 [] v1416(VarCurr)| -v1413(VarCurr,bitIndex2)|v1222(VarCurr,bitIndex3).
% 94.80/94.16  0 [] v1416(VarCurr)|v1413(VarCurr,bitIndex2)| -v1222(VarCurr,bitIndex3).
% 94.80/94.16  0 [] v1416(VarCurr)| -v1413(VarCurr,bitIndex1)|v1222(VarCurr,bitIndex2).
% 94.80/94.16  0 [] v1416(VarCurr)|v1413(VarCurr,bitIndex1)| -v1222(VarCurr,bitIndex2).
% 94.80/94.16  0 [] v1416(VarCurr)| -v1413(VarCurr,bitIndex0)|v1222(VarCurr,bitIndex1).
% 94.80/94.16  0 [] v1416(VarCurr)|v1413(VarCurr,bitIndex0)| -v1222(VarCurr,bitIndex1).
% 94.80/94.17  0 [] -v1416(VarCurr)| -range_3_0(B)| -v1413(VarCurr,B)|$F.
% 94.80/94.17  0 [] -v1416(VarCurr)| -range_3_0(B)|v1413(VarCurr,B)| -$F.
% 94.80/94.17  0 [] v1416(VarCurr)|v1220(VarCurr).
% 94.80/94.17  0 [] -v1416(VarCurr)| -v1220(VarCurr).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1405(VarNext)|v1406(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1405(VarNext)| -v1406(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1406(VarNext)|v1407(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1406(VarNext)|v1402(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1406(VarNext)| -v1407(VarNext)| -v1402(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1407(VarNext)|v1409(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1407(VarNext)| -v1409(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1409(VarNext)|v1402(VarCurr).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1409(VarNext)| -v1402(VarCurr).
% 94.80/94.17  0 [] -v1402(VarCurr)|v288(VarCurr).
% 94.80/94.17  0 [] v1402(VarCurr)| -v288(VarCurr).
% 94.80/94.17  0 [] v1384(VarCurr)| -v1222(VarCurr,bitIndex3)|$F.
% 94.80/94.17  0 [] v1384(VarCurr)|v1222(VarCurr,bitIndex3)| -$F.
% 94.80/94.17  0 [] -v1384(VarCurr)| -v1222(VarCurr,bitIndex3)|$T.
% 94.80/94.17  0 [] -v1384(VarCurr)|v1222(VarCurr,bitIndex3)| -$T.
% 94.80/94.17  0 [] -v1384(VarCurr)|v1385(VarCurr)|v1398(VarCurr).
% 94.80/94.17  0 [] v1384(VarCurr)| -v1385(VarCurr).
% 94.80/94.17  0 [] v1384(VarCurr)| -v1398(VarCurr).
% 94.80/94.17  0 [] -v1398(VarCurr)|v1399(VarCurr).
% 94.80/94.17  0 [] -v1398(VarCurr)|v1400(VarCurr).
% 94.80/94.17  0 [] v1398(VarCurr)| -v1399(VarCurr)| -v1400(VarCurr).
% 94.80/94.17  0 [] -v1400(VarCurr)| -$T|v1377(VarCurr,bitIndex4).
% 94.80/94.17  0 [] -v1400(VarCurr)|$T| -v1377(VarCurr,bitIndex4).
% 94.80/94.17  0 [] v1400(VarCurr)|$T|v1377(VarCurr,bitIndex4).
% 94.80/94.17  0 [] v1400(VarCurr)| -$T| -v1377(VarCurr,bitIndex4).
% 94.80/94.17  0 [] -v1399(VarCurr)|v1395(VarCurr).
% 94.80/94.17  0 [] -v1399(VarCurr)|v1390(VarCurr).
% 94.80/94.17  0 [] v1399(VarCurr)| -v1395(VarCurr)| -v1390(VarCurr).
% 94.80/94.17  0 [] -v1385(VarCurr)|v1386(VarCurr)|v1392(VarCurr).
% 94.80/94.17  0 [] v1385(VarCurr)| -v1386(VarCurr).
% 94.80/94.17  0 [] v1385(VarCurr)| -v1392(VarCurr).
% 94.80/94.17  0 [] -v1392(VarCurr)|v1393(VarCurr).
% 94.80/94.17  0 [] -v1392(VarCurr)|v1397(VarCurr).
% 94.80/94.17  0 [] v1392(VarCurr)| -v1393(VarCurr)| -v1397(VarCurr).
% 94.80/94.17  0 [] -v1397(VarCurr)| -$T|v1377(VarCurr,bitIndex2).
% 94.80/94.17  0 [] -v1397(VarCurr)|$T| -v1377(VarCurr,bitIndex2).
% 94.80/94.17  0 [] v1397(VarCurr)|$T|v1377(VarCurr,bitIndex2).
% 94.80/94.17  0 [] v1397(VarCurr)| -$T| -v1377(VarCurr,bitIndex2).
% 94.80/94.17  0 [] -v1393(VarCurr)|v1395(VarCurr).
% 94.80/94.17  0 [] -v1393(VarCurr)|v1390(VarCurr).
% 94.80/94.17  0 [] v1393(VarCurr)| -v1395(VarCurr)| -v1390(VarCurr).
% 94.80/94.17  0 [] -v1395(VarCurr)|v1396(VarCurr).
% 94.80/94.17  0 [] -v1395(VarCurr)|v1368(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1395(VarCurr)| -v1396(VarCurr)| -v1368(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -v1396(VarCurr)|v1224(VarCurr).
% 94.80/94.17  0 [] -v1396(VarCurr)|v1186(VarCurr).
% 94.80/94.17  0 [] v1396(VarCurr)| -v1224(VarCurr)| -v1186(VarCurr).
% 94.80/94.17  0 [] -v1386(VarCurr)|v1387(VarCurr).
% 94.80/94.17  0 [] -v1386(VarCurr)|v1391(VarCurr).
% 94.80/94.17  0 [] v1386(VarCurr)| -v1387(VarCurr)| -v1391(VarCurr).
% 94.80/94.17  0 [] -v1391(VarCurr)| -$T|v1377(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -v1391(VarCurr)|$T| -v1377(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1391(VarCurr)|$T|v1377(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1391(VarCurr)| -$T| -v1377(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -v1377(constB0,bitIndex4)|$F.
% 94.80/94.17  0 [] v1377(constB0,bitIndex4)| -$F.
% 94.80/94.17  0 [] -v1377(constB0,bitIndex3)|$F.
% 94.80/94.17  0 [] v1377(constB0,bitIndex3)| -$F.
% 94.80/94.17  0 [] -v1377(constB0,bitIndex2)|$F.
% 94.80/94.17  0 [] v1377(constB0,bitIndex2)| -$F.
% 94.80/94.17  0 [] -v1377(constB0,bitIndex1)|$F.
% 94.80/94.17  0 [] v1377(constB0,bitIndex1)| -$F.
% 94.80/94.17  0 [] -v1387(VarCurr)|v1389(VarCurr).
% 94.80/94.17  0 [] -v1387(VarCurr)|v1390(VarCurr).
% 94.80/94.17  0 [] v1387(VarCurr)| -v1389(VarCurr)| -v1390(VarCurr).
% 94.80/94.17  0 [] v1390(VarCurr)|v1368(VarCurr,bitIndex1).
% 94.80/94.17  0 [] -v1390(VarCurr)| -v1368(VarCurr,bitIndex1).
% 94.80/94.17  0 [] -v1389(VarCurr)|v1224(VarCurr).
% 94.80/94.17  0 [] -v1389(VarCurr)|v1368(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1389(VarCurr)| -v1224(VarCurr)| -v1368(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -range_1_0(B)| -v1368(VarCurr,B)|v1370(VarCurr,B).
% 94.80/94.17  0 [] -range_1_0(B)|v1368(VarCurr,B)| -v1370(VarCurr,B).
% 94.80/94.17  0 [] -range_1_0(B)| -v1370(VarCurr,B)|v1372(VarCurr,B).
% 94.80/94.17  0 [] -range_1_0(B)|v1370(VarCurr,B)| -v1372(VarCurr,B).
% 94.80/94.17  0 [] -v1372(VarCurr,bitIndex0)|v36(VarCurr,bitIndex4).
% 94.80/94.17  0 [] v1372(VarCurr,bitIndex0)| -v36(VarCurr,bitIndex4).
% 94.80/94.17  0 [] -v1372(VarCurr,bitIndex1)|v1374(VarCurr).
% 94.80/94.17  0 [] v1372(VarCurr,bitIndex1)| -v1374(VarCurr).
% 94.80/94.17  0 [] -v1374(VarCurr)|v36(VarCurr,bitIndex1)|v36(VarCurr,bitIndex7).
% 94.80/94.17  0 [] v1374(VarCurr)| -v36(VarCurr,bitIndex1).
% 94.80/94.17  0 [] v1374(VarCurr)| -v36(VarCurr,bitIndex7).
% 94.80/94.17  0 [] -v1224(VarCurr)|v1226(VarCurr).
% 94.80/94.17  0 [] v1224(VarCurr)| -v1226(VarCurr).
% 94.80/94.17  0 [] -v1226(VarCurr)|v1228(VarCurr).
% 94.80/94.17  0 [] v1226(VarCurr)| -v1228(VarCurr).
% 94.80/94.17  0 [] -v1228(VarCurr)|v1366(VarCurr)|v36(VarCurr,bitIndex7).
% 94.80/94.17  0 [] v1228(VarCurr)| -v1366(VarCurr).
% 94.80/94.17  0 [] v1228(VarCurr)| -v36(VarCurr,bitIndex7).
% 94.80/94.17  0 [] -v1366(VarCurr)|v36(VarCurr,bitIndex1)|v36(VarCurr,bitIndex4).
% 94.80/94.17  0 [] v1366(VarCurr)| -v36(VarCurr,bitIndex1).
% 94.80/94.17  0 [] v1366(VarCurr)| -v36(VarCurr,bitIndex4).
% 94.80/94.17  0 [] v1333(VarCurr)| -v36(VarCurr,bitIndex4)|$F.
% 94.80/94.17  0 [] v1333(VarCurr)|v36(VarCurr,bitIndex4)| -$F.
% 94.80/94.17  0 [] -v1333(VarCurr)| -v36(VarCurr,bitIndex4)|$T.
% 94.80/94.17  0 [] -v1333(VarCurr)|v36(VarCurr,bitIndex4)| -$T.
% 94.80/94.17  0 [] -v1333(VarCurr)|v1334(VarCurr)|v1363(VarCurr).
% 94.80/94.17  0 [] v1333(VarCurr)| -v1334(VarCurr).
% 94.80/94.17  0 [] v1333(VarCurr)| -v1363(VarCurr).
% 94.80/94.17  0 [] -v1363(VarCurr)|v1364(VarCurr).
% 94.80/94.17  0 [] -v1363(VarCurr)|v1323(VarCurr).
% 94.80/94.17  0 [] v1363(VarCurr)| -v1364(VarCurr)| -v1323(VarCurr).
% 94.80/94.17  0 [] -v1364(VarCurr)|v1342(VarCurr).
% 94.80/94.17  0 [] -v1364(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1364(VarCurr)| -v1342(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1334(VarCurr)|v1335(VarCurr)|v1361(VarCurr).
% 94.80/94.17  0 [] v1334(VarCurr)| -v1335(VarCurr).
% 94.80/94.17  0 [] v1334(VarCurr)| -v1361(VarCurr).
% 94.80/94.17  0 [] -v1361(VarCurr)|v1362(VarCurr).
% 94.80/94.17  0 [] -v1361(VarCurr)|v1300(VarCurr).
% 94.80/94.17  0 [] v1361(VarCurr)| -v1362(VarCurr)| -v1300(VarCurr).
% 94.80/94.17  0 [] -v1362(VarCurr)|v1352(VarCurr).
% 94.80/94.17  0 [] -v1362(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1362(VarCurr)| -v1352(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1335(VarCurr)|v1336(VarCurr)|v1358(VarCurr).
% 94.80/94.17  0 [] v1335(VarCurr)| -v1336(VarCurr).
% 94.80/94.17  0 [] v1335(VarCurr)| -v1358(VarCurr).
% 94.80/94.17  0 [] -v1358(VarCurr)|v1359(VarCurr).
% 94.80/94.17  0 [] -v1358(VarCurr)|v1360(VarCurr).
% 94.80/94.17  0 [] v1358(VarCurr)| -v1359(VarCurr)| -v1360(VarCurr).
% 94.80/94.17  0 [] -v1360(VarCurr)| -$T|v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] -v1360(VarCurr)|$T| -v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] v1360(VarCurr)|$T|v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] v1360(VarCurr)| -$T| -v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] -v1359(VarCurr)|v1342(VarCurr).
% 94.80/94.17  0 [] -v1359(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1359(VarCurr)| -v1342(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1336(VarCurr)|v1337(VarCurr)|v1356(VarCurr).
% 94.80/94.17  0 [] v1336(VarCurr)| -v1337(VarCurr).
% 94.80/94.17  0 [] v1336(VarCurr)| -v1356(VarCurr).
% 94.80/94.17  0 [] -v1356(VarCurr)|v1357(VarCurr).
% 94.80/94.17  0 [] -v1356(VarCurr)|v1278(VarCurr).
% 94.80/94.17  0 [] v1356(VarCurr)| -v1357(VarCurr)| -v1278(VarCurr).
% 94.80/94.17  0 [] -v1357(VarCurr)|v1352(VarCurr).
% 94.80/94.17  0 [] -v1357(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1357(VarCurr)| -v1352(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1337(VarCurr)|v1338(VarCurr)|v1353(VarCurr).
% 94.80/94.17  0 [] v1337(VarCurr)| -v1338(VarCurr).
% 94.80/94.17  0 [] v1337(VarCurr)| -v1353(VarCurr).
% 94.80/94.17  0 [] -v1353(VarCurr)|v1354(VarCurr).
% 94.80/94.17  0 [] -v1353(VarCurr)|v1355(VarCurr).
% 94.80/94.17  0 [] v1353(VarCurr)| -v1354(VarCurr)| -v1355(VarCurr).
% 94.80/94.17  0 [] -v1355(VarCurr)| -$T|v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] -v1355(VarCurr)|$T| -v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] v1355(VarCurr)|$T|v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] v1355(VarCurr)| -$T| -v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] -v1354(VarCurr)|v1342(VarCurr).
% 94.80/94.17  0 [] -v1354(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1354(VarCurr)| -v1342(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1338(VarCurr)|v1339(VarCurr)|v1349(VarCurr).
% 94.80/94.17  0 [] v1338(VarCurr)| -v1339(VarCurr).
% 94.80/94.17  0 [] v1338(VarCurr)| -v1349(VarCurr).
% 94.80/94.17  0 [] -v1349(VarCurr)|v1350(VarCurr).
% 94.80/94.17  0 [] -v1349(VarCurr)|v1238(VarCurr).
% 94.80/94.17  0 [] v1349(VarCurr)| -v1350(VarCurr)| -v1238(VarCurr).
% 94.80/94.17  0 [] -v1350(VarCurr)|v1352(VarCurr).
% 94.80/94.17  0 [] -v1350(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1350(VarCurr)| -v1352(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1352(VarCurr)|v1342(VarCurr).
% 94.80/94.17  0 [] -v1352(VarCurr)|v1180(VarCurr).
% 94.80/94.17  0 [] v1352(VarCurr)| -v1342(VarCurr)| -v1180(VarCurr).
% 94.80/94.17  0 [] -v1339(VarCurr)|v1340(VarCurr).
% 94.80/94.17  0 [] -v1339(VarCurr)|v1348(VarCurr).
% 94.80/94.17  0 [] v1339(VarCurr)| -v1340(VarCurr)| -v1348(VarCurr).
% 94.80/94.17  0 [] -v1348(VarCurr)| -$T|v31(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -v1348(VarCurr)|$T| -v31(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1348(VarCurr)|$T|v31(VarCurr,bitIndex0).
% 94.80/94.17  0 [] v1348(VarCurr)| -$T| -v31(VarCurr,bitIndex0).
% 94.80/94.17  0 [] -v1340(VarCurr)|v1342(VarCurr).
% 94.80/94.17  0 [] -v1340(VarCurr)|v1168(VarCurr).
% 94.80/94.17  0 [] v1340(VarCurr)| -v1342(VarCurr)| -v1168(VarCurr).
% 94.80/94.17  0 [] -v1342(VarCurr)|v1343(VarCurr).
% 94.80/94.17  0 [] -v1342(VarCurr)|v1347(VarCurr).
% 94.80/94.17  0 [] v1342(VarCurr)| -v1343(VarCurr)| -v1347(VarCurr).
% 94.80/94.17  0 [] v1347(VarCurr)|v38(VarCurr).
% 94.80/94.17  0 [] -v1347(VarCurr)| -v38(VarCurr).
% 94.80/94.17  0 [] -v1343(VarCurr)|v1344(VarCurr).
% 94.80/94.17  0 [] -v1343(VarCurr)|v1346(VarCurr).
% 94.80/94.17  0 [] v1343(VarCurr)| -v1344(VarCurr)| -v1346(VarCurr).
% 94.80/94.17  0 [] v1346(VarCurr)|v903(VarCurr).
% 94.80/94.17  0 [] -v1346(VarCurr)| -v903(VarCurr).
% 94.80/94.17  0 [] -v1344(VarCurr)|v1345(VarCurr).
% 94.80/94.17  0 [] -v1344(VarCurr)|v881(VarCurr).
% 94.80/94.17  0 [] v1344(VarCurr)| -v1345(VarCurr)| -v881(VarCurr).
% 94.80/94.17  0 [] v1345(VarCurr)|v87(VarCurr).
% 94.80/94.17  0 [] -v1345(VarCurr)| -v87(VarCurr).
% 94.80/94.17  0 [] -v31(VarNext,bitIndex9)|v1325(VarNext,bitIndex8).
% 94.80/94.17  0 [] v31(VarNext,bitIndex9)| -v1325(VarNext,bitIndex8).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1325(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)|v1325(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.80/94.17  0 [] -v1326(VarNext)| -range_10_0(B)| -v1325(VarNext,B)|v1253(VarNext,B).
% 94.80/94.17  0 [] -v1326(VarNext)| -range_10_0(B)|v1325(VarNext,B)| -v1253(VarNext,B).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1326(VarNext)|v1327(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1326(VarNext)| -v1327(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1327(VarNext)|v1329(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1327(VarNext)|v1240(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1327(VarNext)| -v1329(VarNext)| -v1240(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)|v1329(VarNext)|v1247(VarNext).
% 94.80/94.17  0 [] -nextState(VarCurr,VarNext)| -v1329(VarNext)| -v1247(VarNext).
% 94.80/94.17  0 [] v1311(VarCurr)| -v36(VarCurr,bitIndex9)|$F.
% 94.80/94.17  0 [] v1311(VarCurr)|v36(VarCurr,bitIndex9)| -$F.
% 94.80/94.17  0 [] -v1311(VarCurr)| -v36(VarCurr,bitIndex9)|$T.
% 94.80/94.17  0 [] -v1311(VarCurr)|v36(VarCurr,bitIndex9)| -$T.
% 94.80/94.17  0 [] -v1311(VarCurr)|v1312(VarCurr)|v1321(VarCurr).
% 94.80/94.17  0 [] v1311(VarCurr)| -v1312(VarCurr).
% 94.80/94.17  0 [] v1311(VarCurr)| -v1321(VarCurr).
% 94.80/94.17  0 [] -v1321(VarCurr)|v1322(VarCurr).
% 94.80/94.17  0 [] -v1321(VarCurr)|v1323(VarCurr).
% 94.80/94.17  0 [] v1321(VarCurr)| -v1322(VarCurr)| -v1323(VarCurr).
% 94.80/94.17  0 [] -v1323(VarCurr)| -$T|v31(VarCurr,bitIndex9).
% 94.80/94.17  0 [] -v1323(VarCurr)|$T| -v31(VarCurr,bitIndex9).
% 94.80/94.17  0 [] v1323(VarCurr)|$T|v31(VarCurr,bitIndex9).
% 94.80/94.17  0 [] v1323(VarCurr)| -$T| -v31(VarCurr,bitIndex9).
% 94.80/94.18  0 [] -v1322(VarCurr)|v38(VarCurr).
% 94.80/94.18  0 [] v1322(VarCurr)| -v38(VarCurr).
% 94.80/94.18  0 [] -v1312(VarCurr)|v1313(VarCurr)|v1319(VarCurr).
% 94.80/94.18  0 [] v1312(VarCurr)| -v1313(VarCurr).
% 94.80/94.18  0 [] v1312(VarCurr)| -v1319(VarCurr).
% 94.80/94.18  0 [] -v1319(VarCurr)|v1320(VarCurr).
% 94.80/94.18  0 [] -v1319(VarCurr)|v1300(VarCurr).
% 94.80/94.18  0 [] v1319(VarCurr)| -v1320(VarCurr)| -v1300(VarCurr).
% 94.80/94.18  0 [] -v1320(VarCurr)|v38(VarCurr).
% 94.80/94.18  0 [] -v1320(VarCurr)|v1180(VarCurr).
% 94.80/94.18  0 [] v1320(VarCurr)| -v38(VarCurr)| -v1180(VarCurr).
% 94.80/94.18  0 [] -v1313(VarCurr)|v1314(VarCurr)|v1317(VarCurr).
% 94.80/94.18  0 [] v1313(VarCurr)| -v1314(VarCurr).
% 94.80/94.18  0 [] v1313(VarCurr)| -v1317(VarCurr).
% 94.80/94.18  0 [] -v1317(VarCurr)|v1318(VarCurr).
% 94.80/94.18  0 [] -v1317(VarCurr)|v1278(VarCurr).
% 94.80/94.18  0 [] v1317(VarCurr)| -v1318(VarCurr)| -v1278(VarCurr).
% 94.80/94.18  0 [] -v1318(VarCurr)|v38(VarCurr).
% 94.80/94.18  0 [] -v1318(VarCurr)|v1180(VarCurr).
% 94.80/94.18  0 [] v1318(VarCurr)| -v38(VarCurr)| -v1180(VarCurr).
% 94.80/94.18  0 [] -v1314(VarCurr)|v1315(VarCurr).
% 94.80/94.18  0 [] -v1314(VarCurr)|v1238(VarCurr).
% 94.80/94.18  0 [] v1314(VarCurr)| -v1315(VarCurr)| -v1238(VarCurr).
% 94.80/94.18  0 [] -v1315(VarCurr)|v38(VarCurr).
% 94.80/94.18  0 [] -v1315(VarCurr)|v1180(VarCurr).
% 94.80/94.18  0 [] v1315(VarCurr)| -v38(VarCurr)| -v1180(VarCurr).
% 94.80/94.18  0 [] -v31(VarNext,bitIndex8)|v1302(VarNext,bitIndex7).
% 94.80/94.18  0 [] v31(VarNext,bitIndex8)| -v1302(VarNext,bitIndex7).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1302(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)|v1302(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.80/94.18  0 [] -v1303(VarNext)| -range_10_0(B)| -v1302(VarNext,B)|v1253(VarNext,B).
% 94.80/94.18  0 [] -v1303(VarNext)| -range_10_0(B)|v1302(VarNext,B)| -v1253(VarNext,B).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1303(VarNext)|v1304(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1303(VarNext)| -v1304(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1304(VarNext)|v1306(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1304(VarNext)|v1240(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1304(VarNext)| -v1306(VarNext)| -v1240(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1306(VarNext)|v1247(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1306(VarNext)| -v1247(VarNext).
% 94.80/94.18  0 [] v1296(VarCurr)| -v36(VarCurr,bitIndex8)|$F.
% 94.80/94.18  0 [] v1296(VarCurr)|v36(VarCurr,bitIndex8)| -$F.
% 94.80/94.18  0 [] -v1296(VarCurr)| -v36(VarCurr,bitIndex8)|$T.
% 94.80/94.18  0 [] -v1296(VarCurr)|v36(VarCurr,bitIndex8)| -$T.
% 94.80/94.18  0 [] -v1296(VarCurr)|v1297(VarCurr)|v1298(VarCurr).
% 94.80/94.18  0 [] v1296(VarCurr)| -v1297(VarCurr).
% 94.80/94.18  0 [] v1296(VarCurr)| -v1298(VarCurr).
% 94.80/94.18  0 [] -v1298(VarCurr)|v1299(VarCurr).
% 94.80/94.18  0 [] -v1298(VarCurr)|v1300(VarCurr).
% 94.80/94.18  0 [] v1298(VarCurr)| -v1299(VarCurr)| -v1300(VarCurr).
% 94.80/94.18  0 [] -v1300(VarCurr)| -$T|v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] -v1300(VarCurr)|$T| -v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] v1300(VarCurr)|$T|v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] v1300(VarCurr)| -$T| -v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] v1299(VarCurr)|v1180(VarCurr).
% 94.80/94.18  0 [] -v1299(VarCurr)| -v1180(VarCurr).
% 94.80/94.18  0 [] -v1297(VarCurr)| -$T|v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -v1297(VarCurr)|$T| -v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] v1297(VarCurr)|$T|v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] v1297(VarCurr)| -$T| -v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -v31(VarNext,bitIndex6)|v1288(VarNext,bitIndex5).
% 94.80/94.18  0 [] v31(VarNext,bitIndex6)| -v1288(VarNext,bitIndex5).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1288(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)|v1288(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.80/94.18  0 [] -v1289(VarNext)| -range_10_0(B)| -v1288(VarNext,B)|v1253(VarNext,B).
% 94.80/94.18  0 [] -v1289(VarNext)| -range_10_0(B)|v1288(VarNext,B)| -v1253(VarNext,B).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1289(VarNext)|v1290(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1289(VarNext)| -v1290(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1290(VarNext)|v1292(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1290(VarNext)|v1240(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1290(VarNext)| -v1292(VarNext)| -v1240(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1292(VarNext)|v1247(VarNext).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)| -v1292(VarNext)| -v1247(VarNext).
% 94.80/94.18  0 [] -v31(VarNext,bitIndex5)|v1280(VarNext,bitIndex4).
% 94.80/94.18  0 [] v31(VarNext,bitIndex5)| -v1280(VarNext,bitIndex4).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.80/94.18  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1280(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)|v1280(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -v1281(VarNext)| -range_10_0(B)| -v1280(VarNext,B)|v1253(VarNext,B).
% 94.85/94.19  0 [] -v1281(VarNext)| -range_10_0(B)|v1280(VarNext,B)| -v1253(VarNext,B).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1281(VarNext)|v1282(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1281(VarNext)| -v1282(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1282(VarNext)|v1284(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1282(VarNext)|v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1282(VarNext)| -v1284(VarNext)| -v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1284(VarNext)|v1247(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1284(VarNext)| -v1247(VarNext).
% 94.85/94.19  0 [] v1274(VarCurr)| -v36(VarCurr,bitIndex5)|$F.
% 94.85/94.19  0 [] v1274(VarCurr)|v36(VarCurr,bitIndex5)| -$F.
% 94.85/94.19  0 [] -v1274(VarCurr)| -v36(VarCurr,bitIndex5)|$T.
% 94.85/94.19  0 [] -v1274(VarCurr)|v36(VarCurr,bitIndex5)| -$T.
% 94.85/94.19  0 [] -v1274(VarCurr)|v1275(VarCurr)|v1276(VarCurr).
% 94.85/94.19  0 [] v1274(VarCurr)| -v1275(VarCurr).
% 94.85/94.19  0 [] v1274(VarCurr)| -v1276(VarCurr).
% 94.85/94.19  0 [] -v1276(VarCurr)|v1277(VarCurr).
% 94.85/94.19  0 [] -v1276(VarCurr)|v1278(VarCurr).
% 94.85/94.19  0 [] v1276(VarCurr)| -v1277(VarCurr)| -v1278(VarCurr).
% 94.85/94.19  0 [] -v1278(VarCurr)| -$T|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -v1278(VarCurr)|$T| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] v1278(VarCurr)|$T|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] v1278(VarCurr)| -$T| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] v1277(VarCurr)|v1180(VarCurr).
% 94.85/94.19  0 [] -v1277(VarCurr)| -v1180(VarCurr).
% 94.85/94.19  0 [] -v1275(VarCurr)| -$T|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -v1275(VarCurr)|$T| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] v1275(VarCurr)|$T|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] v1275(VarCurr)| -$T| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -v31(VarNext,bitIndex4)|v1266(VarNext,bitIndex3).
% 94.85/94.19  0 [] v31(VarNext,bitIndex4)| -v1266(VarNext,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1266(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)|v1266(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -v1267(VarNext)| -range_10_0(B)| -v1266(VarNext,B)|v1253(VarNext,B).
% 94.85/94.19  0 [] -v1267(VarNext)| -range_10_0(B)|v1266(VarNext,B)| -v1253(VarNext,B).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1267(VarNext)|v1268(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1267(VarNext)| -v1268(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1268(VarNext)|v1270(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1268(VarNext)|v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1268(VarNext)| -v1270(VarNext)| -v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1270(VarNext)|v1247(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1270(VarNext)| -v1247(VarNext).
% 94.85/94.19  0 [] -v31(VarNext,bitIndex3)|v1258(VarNext,bitIndex2).
% 94.85/94.19  0 [] v31(VarNext,bitIndex3)| -v1258(VarNext,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1258(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)|v1258(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -v1259(VarNext)| -range_10_0(B)| -v1258(VarNext,B)|v1253(VarNext,B).
% 94.85/94.19  0 [] -v1259(VarNext)| -range_10_0(B)|v1258(VarNext,B)| -v1253(VarNext,B).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1259(VarNext)|v1260(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1259(VarNext)| -v1260(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1260(VarNext)|v1262(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1260(VarNext)|v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1260(VarNext)| -v1262(VarNext)| -v1240(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1262(VarNext)|v1247(VarNext).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -v1262(VarNext)| -v1247(VarNext).
% 94.85/94.19  0 [] -v31(VarNext,bitIndex2)|v1242(VarNext,bitIndex1).
% 94.85/94.19  0 [] v31(VarNext,bitIndex2)| -v1242(VarNext,bitIndex1).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex10)|v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex10)| -v31(VarCurr,bitIndex11).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex9)|v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex9)| -v31(VarCurr,bitIndex10).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex8)|v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex8)| -v31(VarCurr,bitIndex9).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex7)|v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex7)| -v31(VarCurr,bitIndex8).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex6)|v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex6)| -v31(VarCurr,bitIndex7).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex5)|v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex5)| -v31(VarCurr,bitIndex6).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex4)|v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex4)| -v31(VarCurr,bitIndex5).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex3)|v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex3)| -v31(VarCurr,bitIndex4).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex2)|v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex2)| -v31(VarCurr,bitIndex3).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex1)|v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex1)| -v31(VarCurr,bitIndex2).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1242(VarNext,bitIndex0)|v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)|v1242(VarNext,bitIndex0)| -v31(VarCurr,bitIndex1).
% 94.85/94.19  0 [] -v1243(VarNext)| -range_10_0(B)| -v1242(VarNext,B)|v1253(VarNext,B).
% 94.85/94.19  0 [] -v1243(VarNext)| -range_10_0(B)|v1242(VarNext,B)| -v1253(VarNext,B).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -range_10_0(B)| -v1253(VarNext,B)|v1251(VarCurr,B).
% 94.85/94.19  0 [] -nextState(VarCurr,VarNext)| -range_10_0(B)|v1253(VarNext,B)| -v1251(VarCurr,B).
% 94.85/94.19  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex10)|v36(VarCurr,bitIndex11).
% 94.85/94.19  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex10)| -v36(VarCurr,bitIndex11).
% 94.85/94.19  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex9)|v36(VarCurr,bitIndex10).
% 94.85/94.19  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex9)| -v36(VarCurr,bitIndex10).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex8)|v36(VarCurr,bitIndex9).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex8)| -v36(VarCurr,bitIndex9).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex7)|v36(VarCurr,bitIndex8).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex7)| -v36(VarCurr,bitIndex8).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex6)|v36(VarCurr,bitIndex7).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex6)| -v36(VarCurr,bitIndex7).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex5)|v36(VarCurr,bitIndex6).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex5)| -v36(VarCurr,bitIndex6).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex4)|v36(VarCurr,bitIndex5).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex4)| -v36(VarCurr,bitIndex5).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex3)|v36(VarCurr,bitIndex4).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex3)| -v36(VarCurr,bitIndex4).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex2)|v36(VarCurr,bitIndex3).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex2)| -v36(VarCurr,bitIndex3).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex1)|v36(VarCurr,bitIndex2).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex1)| -v36(VarCurr,bitIndex2).
% 94.85/94.20  0 [] v1254(VarCurr)| -v1251(VarCurr,bitIndex0)|v36(VarCurr,bitIndex1).
% 94.85/94.20  0 [] v1254(VarCurr)|v1251(VarCurr,bitIndex0)| -v36(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v1254(VarCurr)| -range_10_0(B)| -v1251(VarCurr,B)|$F.
% 94.85/94.20  0 [] -v1254(VarCurr)| -range_10_0(B)|v1251(VarCurr,B)| -$F.
% 94.85/94.20  0 [] v1254(VarCurr)|v33(VarCurr).
% 94.85/94.20  0 [] -v1254(VarCurr)| -v33(VarCurr).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)| -v1243(VarNext)|v1244(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)|v1243(VarNext)| -v1244(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)| -v1244(VarNext)|v1245(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)| -v1244(VarNext)|v1240(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)|v1244(VarNext)| -v1245(VarNext)| -v1240(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)|v1245(VarNext)|v1247(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)| -v1245(VarNext)| -v1247(VarNext).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)| -v1247(VarNext)|v1240(VarCurr).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)|v1247(VarNext)| -v1240(VarCurr).
% 94.85/94.20  0 [] -v1240(VarCurr)|v288(VarCurr).
% 94.85/94.20  0 [] v1240(VarCurr)| -v288(VarCurr).
% 94.85/94.20  0 [] v1233(VarCurr)| -v36(VarCurr,bitIndex2)|$F.
% 94.85/94.20  0 [] v1233(VarCurr)|v36(VarCurr,bitIndex2)| -$F.
% 94.85/94.20  0 [] -v1233(VarCurr)| -v36(VarCurr,bitIndex2)|$T.
% 94.85/94.20  0 [] -v1233(VarCurr)|v36(VarCurr,bitIndex2)| -$T.
% 94.85/94.20  0 [] -v1233(VarCurr)|v1234(VarCurr)|v1235(VarCurr).
% 94.85/94.20  0 [] v1233(VarCurr)| -v1234(VarCurr).
% 94.85/94.20  0 [] v1233(VarCurr)| -v1235(VarCurr).
% 94.85/94.20  0 [] -v1235(VarCurr)|v1236(VarCurr).
% 94.85/94.20  0 [] -v1235(VarCurr)|v1238(VarCurr).
% 94.85/94.20  0 [] v1235(VarCurr)| -v1236(VarCurr)| -v1238(VarCurr).
% 94.85/94.20  0 [] -v1238(VarCurr)| -$T|v31(VarCurr,bitIndex2).
% 94.85/94.20  0 [] -v1238(VarCurr)|$T| -v31(VarCurr,bitIndex2).
% 94.85/94.20  0 [] v1238(VarCurr)|$T|v31(VarCurr,bitIndex2).
% 94.85/94.20  0 [] v1238(VarCurr)| -$T| -v31(VarCurr,bitIndex2).
% 94.85/94.20  0 [] v1236(VarCurr)|v1180(VarCurr).
% 94.85/94.20  0 [] -v1236(VarCurr)| -v1180(VarCurr).
% 94.85/94.20  0 [] -v1234(VarCurr)| -$T|v31(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v1234(VarCurr)|$T| -v31(VarCurr,bitIndex1).
% 94.85/94.20  0 [] v1234(VarCurr)|$T|v31(VarCurr,bitIndex1).
% 94.85/94.20  0 [] v1234(VarCurr)| -$T| -v31(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v31(constB0,bitIndex11)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex11)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex10)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex10)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex9)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex9)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex8)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex8)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex7)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex7)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex6)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex6)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex5)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex5)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex4)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex4)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex3)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex3)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex2)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex2)| -$F.
% 94.85/94.20  0 [] -v31(constB0,bitIndex1)|$F.
% 94.85/94.20  0 [] v31(constB0,bitIndex1)| -$F.
% 94.85/94.20  0 [] -b00000000000(bitIndex10).
% 94.85/94.20  0 [] -b00000000000(bitIndex9).
% 94.85/94.20  0 [] -b00000000000(bitIndex8).
% 94.85/94.20  0 [] -b00000000000(bitIndex7).
% 94.85/94.20  0 [] -b00000000000(bitIndex6).
% 94.85/94.20  0 [] -b00000000000(bitIndex5).
% 94.85/94.20  0 [] -b00000000000(bitIndex4).
% 94.85/94.20  0 [] -b00000000000(bitIndex3).
% 94.85/94.20  0 [] -b00000000000(bitIndex2).
% 94.85/94.20  0 [] -b00000000000(bitIndex1).
% 94.85/94.20  0 [] -b00000000000(bitIndex0).
% 94.85/94.20  0 [] -v31(constB0,bitIndex0)|$T.
% 94.85/94.20  0 [] v31(constB0,bitIndex0)| -$T.
% 94.85/94.20  0 [] -v1220(VarCurr)|v12(VarCurr).
% 94.85/94.20  0 [] v1220(VarCurr)| -v12(VarCurr).
% 94.85/94.20  0 [] -v1198(VarCurr)|v1200(VarCurr).
% 94.85/94.20  0 [] v1198(VarCurr)| -v1200(VarCurr).
% 94.85/94.20  0 [] -v1200(VarCurr)|v16(VarCurr).
% 94.85/94.20  0 [] v1200(VarCurr)| -v16(VarCurr).
% 94.85/94.20  0 [] -v1168(VarCurr)|v1170(VarCurr).
% 94.85/94.20  0 [] v1168(VarCurr)| -v1170(VarCurr).
% 94.85/94.20  0 [] -v1170(VarCurr)|v1172(VarCurr).
% 94.85/94.20  0 [] v1170(VarCurr)| -v1172(VarCurr).
% 94.85/94.20  0 [] -v1172(VarCurr)|v1174(VarCurr,bitIndex3).
% 94.85/94.20  0 [] v1172(VarCurr)| -v1174(VarCurr,bitIndex3).
% 94.85/94.20  0 [] -v1174(VarCurr,bitIndex3)|v743(VarCurr,bitIndex3).
% 94.85/94.20  0 [] v1174(VarCurr,bitIndex3)| -v743(VarCurr,bitIndex3).
% 94.85/94.20  0 [] -v1162(VarCurr)|v1164(VarCurr).
% 94.85/94.20  0 [] v1162(VarCurr)| -v1164(VarCurr).
% 94.85/94.20  0 [] -v1164(VarCurr)|v1166(VarCurr).
% 94.85/94.20  0 [] v1164(VarCurr)| -v1166(VarCurr).
% 94.85/94.20  0 [] -v1166(VarCurr)|v915(VarCurr,bitIndex1).
% 94.85/94.20  0 [] v1166(VarCurr)| -v915(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v1148(VarCurr)|v1156(VarCurr).
% 94.85/94.20  0 [] -v1148(VarCurr)|v1158(VarCurr).
% 94.85/94.20  0 [] v1148(VarCurr)| -v1156(VarCurr)| -v1158(VarCurr).
% 94.85/94.20  0 [] v1158(VarCurr)|v1150(VarCurr).
% 94.85/94.20  0 [] -v1158(VarCurr)| -v1150(VarCurr).
% 94.85/94.20  0 [] -v1156(VarCurr)|v1157(VarCurr).
% 94.85/94.20  0 [] -v1156(VarCurr)|v909(VarCurr).
% 94.85/94.20  0 [] v1156(VarCurr)| -v1157(VarCurr)| -v909(VarCurr).
% 94.85/94.20  0 [] v1157(VarCurr)|v1031(VarCurr).
% 94.85/94.20  0 [] -v1157(VarCurr)| -v1031(VarCurr).
% 94.85/94.20  0 [] -v1150(VarCurr)|v1152(VarCurr).
% 94.85/94.20  0 [] v1150(VarCurr)| -v1152(VarCurr).
% 94.85/94.20  0 [] -v1152(VarCurr)|v1154(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v1152(VarCurr)| -v1154(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -v1154(VarCurr,bitIndex0)|v1142(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v1154(VarCurr,bitIndex0)| -v1142(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -v1142(VarCurr,bitIndex0)|v919(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v1142(VarCurr,bitIndex0)| -v919(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -v919(VarCurr,bitIndex0)|v921(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v919(VarCurr,bitIndex0)| -v921(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -v921(VarCurr,bitIndex0)|v1017(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v921(VarCurr,bitIndex0)| -v1017(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -v1029(VarCurr)|v1146(VarCurr).
% 94.85/94.20  0 [] -v1029(VarCurr)|v1132(VarCurr).
% 94.85/94.20  0 [] v1029(VarCurr)| -v1146(VarCurr)| -v1132(VarCurr).
% 94.85/94.20  0 [] v1146(VarCurr)|v1031(VarCurr).
% 94.85/94.20  0 [] -v1146(VarCurr)| -v1031(VarCurr).
% 94.85/94.20  0 [] -v1132(VarCurr)|v1134(VarCurr).
% 94.85/94.20  0 [] v1132(VarCurr)| -v1134(VarCurr).
% 94.85/94.20  0 [] -v1134(VarCurr)|v1136(VarCurr).
% 94.85/94.20  0 [] v1134(VarCurr)| -v1136(VarCurr).
% 94.85/94.20  0 [] -v1136(VarCurr)|v1144(VarCurr).
% 94.85/94.20  0 [] -v1136(VarCurr)|v1138(VarCurr).
% 94.85/94.20  0 [] v1136(VarCurr)| -v1144(VarCurr)| -v1138(VarCurr).
% 94.85/94.20  0 [] v1144(VarCurr)|v915(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v1144(VarCurr)| -v915(VarCurr,bitIndex1).
% 94.85/94.20  0 [] -v1138(VarCurr)|v1140(VarCurr).
% 94.85/94.20  0 [] v1138(VarCurr)| -v1140(VarCurr).
% 94.85/94.20  0 [] -v1140(VarCurr)|v1142(VarCurr,bitIndex15).
% 94.85/94.20  0 [] v1140(VarCurr)| -v1142(VarCurr,bitIndex15).
% 94.85/94.20  0 [] -v1142(VarCurr,bitIndex15)|v919(VarCurr,bitIndex15).
% 94.85/94.20  0 [] v1142(VarCurr,bitIndex15)| -v919(VarCurr,bitIndex15).
% 94.85/94.20  0 [] -v919(VarCurr,bitIndex15)|v921(VarCurr,bitIndex15).
% 94.85/94.20  0 [] v919(VarCurr,bitIndex15)| -v921(VarCurr,bitIndex15).
% 94.85/94.20  0 [] -v921(VarCurr,bitIndex15)|v1017(VarCurr,bitIndex15).
% 94.85/94.20  0 [] v921(VarCurr,bitIndex15)| -v1017(VarCurr,bitIndex15).
% 94.85/94.20  0 [] -v1031(VarCurr)|v1033(VarCurr).
% 94.85/94.20  0 [] v1031(VarCurr)| -v1033(VarCurr).
% 94.85/94.20  0 [] -v1033(VarCurr)|v1035(VarCurr).
% 94.85/94.20  0 [] v1033(VarCurr)| -v1035(VarCurr).
% 94.85/94.20  0 [] -v1035(VarCurr)| -v1037(VarCurr,bitIndex4)|$F.
% 94.85/94.20  0 [] -v1035(VarCurr)|v1037(VarCurr,bitIndex4)| -$F.
% 94.85/94.20  0 [] -v1035(VarCurr)| -v1037(VarCurr,bitIndex3)|$F.
% 94.85/94.20  0 [] -v1035(VarCurr)|v1037(VarCurr,bitIndex3)| -$F.
% 94.85/94.20  0 [] -v1035(VarCurr)| -v1037(VarCurr,bitIndex2)|$F.
% 94.85/94.20  0 [] -v1035(VarCurr)|v1037(VarCurr,bitIndex2)| -$F.
% 94.85/94.20  0 [] -v1035(VarCurr)| -v1037(VarCurr,bitIndex1)|$F.
% 94.85/94.20  0 [] -v1035(VarCurr)|v1037(VarCurr,bitIndex1)| -$F.
% 94.85/94.20  0 [] -v1035(VarCurr)| -v1037(VarCurr,bitIndex0)|$F.
% 94.85/94.20  0 [] -v1035(VarCurr)|v1037(VarCurr,bitIndex0)| -$F.
% 94.85/94.20  0 [] v1035(VarCurr)|v1037(VarCurr,bitIndex4)|$F|v1037(VarCurr,bitIndex3)|v1037(VarCurr,bitIndex2)|v1037(VarCurr,bitIndex1)|v1037(VarCurr,bitIndex0).
% 94.85/94.20  0 [] v1035(VarCurr)| -v1037(VarCurr,bitIndex4)| -$F| -v1037(VarCurr,bitIndex3)| -v1037(VarCurr,bitIndex2)| -v1037(VarCurr,bitIndex1)| -v1037(VarCurr,bitIndex0).
% 94.85/94.20  0 [] -nextState(VarCurr,VarNext)|v1118(VarNext)| -range_4_0(B)| -v1037(VarNext,B)|v1037(VarCurr,B).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)|v1118(VarNext)| -range_4_0(B)|v1037(VarNext,B)| -v1037(VarCurr,B).
% 94.85/94.21  0 [] -v1118(VarNext)| -range_4_0(B)| -v1037(VarNext,B)|v1126(VarNext,B).
% 94.85/94.21  0 [] -v1118(VarNext)| -range_4_0(B)|v1037(VarNext,B)| -v1126(VarNext,B).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v1126(VarNext,B)|v1124(VarCurr,B).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v1126(VarNext,B)| -v1124(VarCurr,B).
% 94.85/94.21  0 [] v1127(VarCurr)| -range_4_0(B)| -v1124(VarCurr,B)|v1039(VarCurr,B).
% 94.85/94.21  0 [] v1127(VarCurr)| -range_4_0(B)|v1124(VarCurr,B)| -v1039(VarCurr,B).
% 94.85/94.21  0 [] -v1127(VarCurr)| -range_4_0(B)| -v1124(VarCurr,B)|$F.
% 94.85/94.21  0 [] -v1127(VarCurr)| -range_4_0(B)|v1124(VarCurr,B)| -$F.
% 94.85/94.21  0 [] v1127(VarCurr)|v928(VarCurr).
% 94.85/94.21  0 [] -v1127(VarCurr)| -v928(VarCurr).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -v1118(VarNext)|v1119(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)|v1118(VarNext)| -v1119(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -v1119(VarNext)|v1120(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -v1119(VarNext)|v925(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)|v1119(VarNext)| -v1120(VarNext)| -v925(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)|v1120(VarNext)|v984(VarNext).
% 94.85/94.21  0 [] -nextState(VarCurr,VarNext)| -v1120(VarNext)| -v984(VarNext).
% 94.85/94.21  0 [] v1042(VarCurr)|v1044(VarCurr)|v1085(VarCurr)| -range_4_0(B)| -v1039(VarCurr,B)|v1037(VarCurr,B).
% 94.85/94.21  0 [] v1042(VarCurr)|v1044(VarCurr)|v1085(VarCurr)| -range_4_0(B)|v1039(VarCurr,B)| -v1037(VarCurr,B).
% 94.85/94.21  0 [] -v1085(VarCurr)| -range_4_0(B)| -v1039(VarCurr,B)|v1087(VarCurr,B).
% 94.85/94.21  0 [] -v1085(VarCurr)| -range_4_0(B)|v1039(VarCurr,B)| -v1087(VarCurr,B).
% 94.85/94.21  0 [] -v1044(VarCurr)| -range_4_0(B)| -v1039(VarCurr,B)|v1046(VarCurr,B).
% 94.85/94.21  0 [] -v1044(VarCurr)| -range_4_0(B)|v1039(VarCurr,B)| -v1046(VarCurr,B).
% 94.85/94.21  0 [] -v1042(VarCurr)| -range_4_0(B)| -v1039(VarCurr,B)|v1037(VarCurr,B).
% 94.85/94.21  0 [] -v1042(VarCurr)| -range_4_0(B)|v1039(VarCurr,B)| -v1037(VarCurr,B).
% 94.85/94.21  0 [] -v1114(VarCurr)| -v1115(VarCurr,bitIndex1)|$T.
% 94.85/94.21  0 [] -v1114(VarCurr)|v1115(VarCurr,bitIndex1)| -$T.
% 94.85/94.21  0 [] -v1114(VarCurr)| -v1115(VarCurr,bitIndex0)|$T.
% 94.85/94.21  0 [] -v1114(VarCurr)|v1115(VarCurr,bitIndex0)| -$T.
% 94.85/94.21  0 [] v1114(VarCurr)|v1115(VarCurr,bitIndex1)|$T|v1115(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1114(VarCurr)| -v1115(VarCurr,bitIndex1)| -$T| -v1115(VarCurr,bitIndex0).
% 94.85/94.21  0 [] -v1115(VarCurr,bitIndex0)|v1023(VarCurr).
% 94.85/94.21  0 [] v1115(VarCurr,bitIndex0)| -v1023(VarCurr).
% 94.85/94.21  0 [] -v1115(VarCurr,bitIndex1)|v945(VarCurr).
% 94.85/94.21  0 [] v1115(VarCurr,bitIndex1)| -v945(VarCurr).
% 94.85/94.21  0 [] v1088(VarCurr)| -range_4_0(B)| -v1087(VarCurr,B)|v1089(VarCurr,B).
% 94.85/94.21  0 [] v1088(VarCurr)| -range_4_0(B)|v1087(VarCurr,B)| -v1089(VarCurr,B).
% 94.85/94.21  0 [] -v1088(VarCurr)| -range_4_0(B)| -v1087(VarCurr,B)|b10000(B).
% 94.85/94.21  0 [] -v1088(VarCurr)| -range_4_0(B)|v1087(VarCurr,B)| -b10000(B).
% 94.85/94.21  0 [] -v1089(VarCurr,bitIndex0)|v1111(VarCurr).
% 94.85/94.21  0 [] v1089(VarCurr,bitIndex0)| -v1111(VarCurr).
% 94.85/94.21  0 [] -v1089(VarCurr,bitIndex1)|v1109(VarCurr).
% 94.85/94.21  0 [] v1089(VarCurr,bitIndex1)| -v1109(VarCurr).
% 94.85/94.21  0 [] -v1089(VarCurr,bitIndex2)|v1104(VarCurr).
% 94.85/94.21  0 [] v1089(VarCurr,bitIndex2)| -v1104(VarCurr).
% 94.85/94.21  0 [] -v1089(VarCurr,bitIndex3)|v1099(VarCurr).
% 94.85/94.21  0 [] v1089(VarCurr,bitIndex3)| -v1099(VarCurr).
% 94.85/94.21  0 [] -v1089(VarCurr,bitIndex4)|v1091(VarCurr).
% 94.85/94.21  0 [] v1089(VarCurr,bitIndex4)| -v1091(VarCurr).
% 94.85/94.21  0 [] -v1109(VarCurr)|v1110(VarCurr).
% 94.85/94.21  0 [] -v1109(VarCurr)|v1113(VarCurr).
% 94.85/94.21  0 [] v1109(VarCurr)| -v1110(VarCurr)| -v1113(VarCurr).
% 94.85/94.21  0 [] -v1113(VarCurr)|v1037(VarCurr,bitIndex0)|v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] v1113(VarCurr)| -v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1113(VarCurr)| -v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] -v1110(VarCurr)|v1111(VarCurr)|v1112(VarCurr).
% 94.85/94.21  0 [] v1110(VarCurr)| -v1111(VarCurr).
% 94.85/94.21  0 [] v1110(VarCurr)| -v1112(VarCurr).
% 94.85/94.21  0 [] v1112(VarCurr)|v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] -v1112(VarCurr)| -v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] v1111(VarCurr)|v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] -v1111(VarCurr)| -v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] -v1104(VarCurr)|v1105(VarCurr).
% 94.85/94.21  0 [] -v1104(VarCurr)|v1108(VarCurr).
% 94.85/94.21  0 [] v1104(VarCurr)| -v1105(VarCurr)| -v1108(VarCurr).
% 94.85/94.21  0 [] -v1108(VarCurr)|v1096(VarCurr)|v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] v1108(VarCurr)| -v1096(VarCurr).
% 94.85/94.21  0 [] v1108(VarCurr)| -v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] -v1105(VarCurr)|v1106(VarCurr)|v1107(VarCurr).
% 94.85/94.21  0 [] v1105(VarCurr)| -v1106(VarCurr).
% 94.85/94.21  0 [] v1105(VarCurr)| -v1107(VarCurr).
% 94.85/94.21  0 [] v1107(VarCurr)|v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] -v1107(VarCurr)| -v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] v1106(VarCurr)|v1096(VarCurr).
% 94.85/94.21  0 [] -v1106(VarCurr)| -v1096(VarCurr).
% 94.85/94.21  0 [] -v1099(VarCurr)|v1100(VarCurr).
% 94.85/94.21  0 [] -v1099(VarCurr)|v1103(VarCurr).
% 94.85/94.21  0 [] v1099(VarCurr)| -v1100(VarCurr)| -v1103(VarCurr).
% 94.85/94.21  0 [] -v1103(VarCurr)|v1095(VarCurr)|v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] v1103(VarCurr)| -v1095(VarCurr).
% 94.85/94.21  0 [] v1103(VarCurr)| -v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] -v1100(VarCurr)|v1101(VarCurr)|v1102(VarCurr).
% 94.85/94.21  0 [] v1100(VarCurr)| -v1101(VarCurr).
% 94.85/94.21  0 [] v1100(VarCurr)| -v1102(VarCurr).
% 94.85/94.21  0 [] v1102(VarCurr)|v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] -v1102(VarCurr)| -v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] v1101(VarCurr)|v1095(VarCurr).
% 94.85/94.21  0 [] -v1101(VarCurr)| -v1095(VarCurr).
% 94.85/94.21  0 [] -v1091(VarCurr)|v1092(VarCurr).
% 94.85/94.21  0 [] -v1091(VarCurr)|v1098(VarCurr).
% 94.85/94.21  0 [] v1091(VarCurr)| -v1092(VarCurr)| -v1098(VarCurr).
% 94.85/94.21  0 [] -v1098(VarCurr)|v1094(VarCurr)|v1037(VarCurr,bitIndex4).
% 94.85/94.21  0 [] v1098(VarCurr)| -v1094(VarCurr).
% 94.85/94.21  0 [] v1098(VarCurr)| -v1037(VarCurr,bitIndex4).
% 94.85/94.21  0 [] -v1092(VarCurr)|v1093(VarCurr)|v1097(VarCurr).
% 94.85/94.21  0 [] v1092(VarCurr)| -v1093(VarCurr).
% 94.85/94.21  0 [] v1092(VarCurr)| -v1097(VarCurr).
% 94.85/94.21  0 [] v1097(VarCurr)|v1037(VarCurr,bitIndex4).
% 94.85/94.21  0 [] -v1097(VarCurr)| -v1037(VarCurr,bitIndex4).
% 94.85/94.21  0 [] v1093(VarCurr)|v1094(VarCurr).
% 94.85/94.21  0 [] -v1093(VarCurr)| -v1094(VarCurr).
% 94.85/94.21  0 [] -v1094(VarCurr)|v1095(VarCurr).
% 94.85/94.21  0 [] -v1094(VarCurr)|v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] v1094(VarCurr)| -v1095(VarCurr)| -v1037(VarCurr,bitIndex3).
% 94.85/94.21  0 [] -v1095(VarCurr)|v1096(VarCurr).
% 94.85/94.21  0 [] -v1095(VarCurr)|v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] v1095(VarCurr)| -v1096(VarCurr)| -v1037(VarCurr,bitIndex2).
% 94.85/94.21  0 [] -v1096(VarCurr)|v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] -v1096(VarCurr)|v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] v1096(VarCurr)| -v1037(VarCurr,bitIndex0)| -v1037(VarCurr,bitIndex1).
% 94.85/94.21  0 [] -v1088(VarCurr)| -v1037(VarCurr,bitIndex4)|$T.
% 94.85/94.21  0 [] -v1088(VarCurr)|v1037(VarCurr,bitIndex4)| -$T.
% 94.85/94.21  0 [] -v1088(VarCurr)| -v1037(VarCurr,bitIndex3)|$F.
% 94.85/94.21  0 [] -v1088(VarCurr)|v1037(VarCurr,bitIndex3)| -$F.
% 94.85/94.21  0 [] -v1088(VarCurr)| -v1037(VarCurr,bitIndex2)|$F.
% 94.85/94.21  0 [] -v1088(VarCurr)|v1037(VarCurr,bitIndex2)| -$F.
% 94.85/94.21  0 [] -v1088(VarCurr)| -v1037(VarCurr,bitIndex1)|$F.
% 94.85/94.21  0 [] -v1088(VarCurr)|v1037(VarCurr,bitIndex1)| -$F.
% 94.85/94.21  0 [] -v1088(VarCurr)| -v1037(VarCurr,bitIndex0)|$F.
% 94.85/94.21  0 [] -v1088(VarCurr)|v1037(VarCurr,bitIndex0)| -$F.
% 94.85/94.21  0 [] v1088(VarCurr)|v1037(VarCurr,bitIndex4)|$T|v1037(VarCurr,bitIndex3)|$F|v1037(VarCurr,bitIndex2)|v1037(VarCurr,bitIndex1)|v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1088(VarCurr)|v1037(VarCurr,bitIndex4)|$T| -v1037(VarCurr,bitIndex3)| -$F| -v1037(VarCurr,bitIndex2)| -v1037(VarCurr,bitIndex1)| -v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1088(VarCurr)| -v1037(VarCurr,bitIndex4)| -$T|v1037(VarCurr,bitIndex3)|$F|v1037(VarCurr,bitIndex2)|v1037(VarCurr,bitIndex1)|v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1088(VarCurr)| -v1037(VarCurr,bitIndex4)| -$T| -v1037(VarCurr,bitIndex3)| -$F| -v1037(VarCurr,bitIndex2)| -v1037(VarCurr,bitIndex1)| -v1037(VarCurr,bitIndex0).
% 94.85/94.21  0 [] b10000(bitIndex4).
% 94.85/94.21  0 [] -b10000(bitIndex3).
% 94.85/94.21  0 [] -b10000(bitIndex2).
% 94.85/94.21  0 [] -b10000(bitIndex1).
% 94.85/94.21  0 [] -b10000(bitIndex0).
% 94.85/94.21  0 [] -v1085(VarCurr)| -v1086(VarCurr,bitIndex1)|$T.
% 94.85/94.21  0 [] -v1085(VarCurr)|v1086(VarCurr,bitIndex1)| -$T.
% 94.85/94.21  0 [] -v1085(VarCurr)| -v1086(VarCurr,bitIndex0)|$F.
% 94.85/94.21  0 [] -v1085(VarCurr)|v1086(VarCurr,bitIndex0)| -$F.
% 94.85/94.21  0 [] v1085(VarCurr)|v1086(VarCurr,bitIndex1)|$T|v1086(VarCurr,bitIndex0)|$F.
% 94.85/94.21  0 [] v1085(VarCurr)|v1086(VarCurr,bitIndex1)|$T| -v1086(VarCurr,bitIndex0)| -$F.
% 94.85/94.21  0 [] v1085(VarCurr)| -v1086(VarCurr,bitIndex1)| -$T|v1086(VarCurr,bitIndex0)|$F.
% 94.85/94.21  0 [] v1085(VarCurr)| -v1086(VarCurr,bitIndex1)| -$T| -v1086(VarCurr,bitIndex0)| -$F.
% 94.85/94.21  0 [] -v1086(VarCurr,bitIndex0)|v1023(VarCurr).
% 94.85/94.21  0 [] v1086(VarCurr,bitIndex0)| -v1023(VarCurr).
% 94.85/94.21  0 [] -v1086(VarCurr,bitIndex1)|v945(VarCurr).
% 94.85/94.21  0 [] v1086(VarCurr,bitIndex1)| -v945(VarCurr).
% 94.85/94.21  0 [] v1047(VarCurr)| -range_31_0(B)| -v1046(VarCurr,B)|v1048(VarCurr,B).
% 94.85/94.21  0 [] v1047(VarCurr)| -range_31_0(B)|v1046(VarCurr,B)| -v1048(VarCurr,B).
% 94.85/94.21  0 [] -v1047(VarCurr)| -range_31_0(B)| -v1046(VarCurr,B)|$F.
% 94.85/94.21  0 [] -v1047(VarCurr)| -range_31_0(B)|v1046(VarCurr,B)| -$F.
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex6)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex6)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex7)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex7)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex8)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex8)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex9)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex9)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex10)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex10)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex11)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex11)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex12)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex12)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex13)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex13)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex14)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex14)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex15)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex15)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex16)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex16)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex17)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex17)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex18)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex18)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex19)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex19)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex20)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex20)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex21)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex21)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex22)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex22)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex23)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex23)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex24)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex24)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex25)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex25)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex26)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex26)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex27)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex27)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex28)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex28)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex29)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex29)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex30)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex30)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -v1048(VarCurr,bitIndex31)|v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] v1048(VarCurr,bitIndex31)| -v1049(VarCurr,bitIndex5).
% 94.85/94.21  0 [] -range_5_0(B)| -v1048(VarCurr,B)|v1049(VarCurr,B).
% 94.85/94.21  0 [] -range_5_0(B)|v1048(VarCurr,B)| -v1049(VarCurr,B).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex0)|v1083(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex0)| -v1083(VarCurr).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex1)|v1081(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex1)| -v1081(VarCurr).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex2)|v1077(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex2)| -v1077(VarCurr).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex3)|v1073(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex3)| -v1073(VarCurr).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex4)|v1069(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex4)| -v1069(VarCurr).
% 94.85/94.21  0 [] -v1049(VarCurr,bitIndex5)|v1051(VarCurr).
% 94.85/94.21  0 [] v1049(VarCurr,bitIndex5)| -v1051(VarCurr).
% 94.85/94.21  0 [] -v1081(VarCurr)|v1082(VarCurr).
% 94.85/94.21  0 [] -v1081(VarCurr)|v1084(VarCurr).
% 94.85/94.21  0 [] v1081(VarCurr)| -v1082(VarCurr)| -v1084(VarCurr).
% 94.85/94.21  0 [] -v1084(VarCurr)|v1055(VarCurr,bitIndex0)|v1063(VarCurr).
% 94.85/94.21  0 [] v1084(VarCurr)| -v1055(VarCurr,bitIndex0).
% 94.85/94.21  0 [] v1084(VarCurr)| -v1063(VarCurr).
% 94.85/94.22  0 [] -v1082(VarCurr)|v1083(VarCurr)|v1055(VarCurr,bitIndex1).
% 94.85/94.22  0 [] v1082(VarCurr)| -v1083(VarCurr).
% 94.85/94.22  0 [] v1082(VarCurr)| -v1055(VarCurr,bitIndex1).
% 94.85/94.22  0 [] v1083(VarCurr)|v1055(VarCurr,bitIndex0).
% 94.85/94.22  0 [] -v1083(VarCurr)| -v1055(VarCurr,bitIndex0).
% 94.85/94.22  0 [] -v1077(VarCurr)|v1078(VarCurr).
% 94.85/94.22  0 [] -v1077(VarCurr)|v1080(VarCurr).
% 94.85/94.22  0 [] v1077(VarCurr)| -v1078(VarCurr)| -v1080(VarCurr).
% 94.85/94.22  0 [] -v1080(VarCurr)|v1061(VarCurr)|v1064(VarCurr).
% 94.85/94.22  0 [] v1080(VarCurr)| -v1061(VarCurr).
% 94.85/94.22  0 [] v1080(VarCurr)| -v1064(VarCurr).
% 94.85/94.22  0 [] -v1078(VarCurr)|v1079(VarCurr)|v1055(VarCurr,bitIndex2).
% 94.85/94.22  0 [] v1078(VarCurr)| -v1079(VarCurr).
% 94.85/94.22  0 [] v1078(VarCurr)| -v1055(VarCurr,bitIndex2).
% 94.85/94.22  0 [] v1079(VarCurr)|v1061(VarCurr).
% 94.85/94.22  0 [] -v1079(VarCurr)| -v1061(VarCurr).
% 94.85/94.22  0 [] -v1073(VarCurr)|v1074(VarCurr).
% 94.85/94.22  0 [] -v1073(VarCurr)|v1076(VarCurr).
% 94.85/94.22  0 [] v1073(VarCurr)| -v1074(VarCurr)| -v1076(VarCurr).
% 94.85/94.22  0 [] -v1076(VarCurr)|v1059(VarCurr)|v1065(VarCurr).
% 94.85/94.22  0 [] v1076(VarCurr)| -v1059(VarCurr).
% 94.85/94.22  0 [] v1076(VarCurr)| -v1065(VarCurr).
% 94.85/94.22  0 [] -v1074(VarCurr)|v1075(VarCurr)|v1055(VarCurr,bitIndex3).
% 94.85/94.22  0 [] v1074(VarCurr)| -v1075(VarCurr).
% 94.85/94.22  0 [] v1074(VarCurr)| -v1055(VarCurr,bitIndex3).
% 94.85/94.22  0 [] v1075(VarCurr)|v1059(VarCurr).
% 94.85/94.22  0 [] -v1075(VarCurr)| -v1059(VarCurr).
% 94.85/94.22  0 [] -v1069(VarCurr)|v1070(VarCurr).
% 94.85/94.22  0 [] -v1069(VarCurr)|v1072(VarCurr).
% 94.85/94.22  0 [] v1069(VarCurr)| -v1070(VarCurr)| -v1072(VarCurr).
% 94.85/94.22  0 [] -v1072(VarCurr)|v1057(VarCurr)|v1066(VarCurr).
% 94.85/94.22  0 [] v1072(VarCurr)| -v1057(VarCurr).
% 94.85/94.22  0 [] v1072(VarCurr)| -v1066(VarCurr).
% 94.85/94.22  0 [] -v1070(VarCurr)|v1071(VarCurr)|v1055(VarCurr,bitIndex4).
% 94.85/94.22  0 [] v1070(VarCurr)| -v1071(VarCurr).
% 94.85/94.22  0 [] v1070(VarCurr)| -v1055(VarCurr,bitIndex4).
% 94.85/94.22  0 [] v1071(VarCurr)|v1057(VarCurr).
% 94.85/94.22  0 [] -v1071(VarCurr)| -v1057(VarCurr).
% 94.85/94.22  0 [] -v1051(VarCurr)|v1052(VarCurr).
% 94.85/94.22  0 [] -v1051(VarCurr)|v1067(VarCurr).
% 94.85/94.22  0 [] v1051(VarCurr)| -v1052(VarCurr)| -v1067(VarCurr).
% 94.85/94.22  0 [] -v1067(VarCurr)|v1054(VarCurr)|v1068(VarCurr).
% 94.85/94.22  0 [] v1067(VarCurr)| -v1054(VarCurr).
% 94.85/94.22  0 [] v1067(VarCurr)| -v1068(VarCurr).
% 94.85/94.22  0 [] v1068(VarCurr)|v1055(VarCurr,bitIndex5).
% 94.85/94.22  0 [] -v1068(VarCurr)| -v1055(VarCurr,bitIndex5).
% 94.85/94.22  0 [] -v1052(VarCurr)|v1053(VarCurr)|v1055(VarCurr,bitIndex5).
% 94.85/94.22  0 [] v1052(VarCurr)| -v1053(VarCurr).
% 94.85/94.22  0 [] v1052(VarCurr)| -v1055(VarCurr,bitIndex5).
% 94.85/94.22  0 [] v1053(VarCurr)|v1054(VarCurr).
% 94.85/94.22  0 [] -v1053(VarCurr)| -v1054(VarCurr).
% 94.85/94.22  0 [] -v1054(VarCurr)|v1055(VarCurr,bitIndex4)|v1056(VarCurr).
% 94.85/94.22  0 [] v1054(VarCurr)| -v1055(VarCurr,bitIndex4).
% 94.85/94.22  0 [] v1054(VarCurr)| -v1056(VarCurr).
% 94.85/94.22  0 [] -v1056(VarCurr)|v1057(VarCurr).
% 94.85/94.22  0 [] -v1056(VarCurr)|v1066(VarCurr).
% 94.85/94.22  0 [] v1056(VarCurr)| -v1057(VarCurr)| -v1066(VarCurr).
% 94.85/94.22  0 [] v1066(VarCurr)|v1055(VarCurr,bitIndex4).
% 94.85/94.22  0 [] -v1066(VarCurr)| -v1055(VarCurr,bitIndex4).
% 94.85/94.22  0 [] -v1057(VarCurr)|v1055(VarCurr,bitIndex3)|v1058(VarCurr).
% 94.85/94.22  0 [] v1057(VarCurr)| -v1055(VarCurr,bitIndex3).
% 94.85/94.22  0 [] v1057(VarCurr)| -v1058(VarCurr).
% 94.85/94.22  0 [] -v1058(VarCurr)|v1059(VarCurr).
% 94.85/94.22  0 [] -v1058(VarCurr)|v1065(VarCurr).
% 94.85/94.22  0 [] v1058(VarCurr)| -v1059(VarCurr)| -v1065(VarCurr).
% 94.85/94.22  0 [] v1065(VarCurr)|v1055(VarCurr,bitIndex3).
% 94.85/94.22  0 [] -v1065(VarCurr)| -v1055(VarCurr,bitIndex3).
% 94.85/94.22  0 [] -v1059(VarCurr)|v1055(VarCurr,bitIndex2)|v1060(VarCurr).
% 94.85/94.22  0 [] v1059(VarCurr)| -v1055(VarCurr,bitIndex2).
% 94.85/94.22  0 [] v1059(VarCurr)| -v1060(VarCurr).
% 94.85/94.22  0 [] -v1060(VarCurr)|v1061(VarCurr).
% 94.85/94.22  0 [] -v1060(VarCurr)|v1064(VarCurr).
% 94.85/94.22  0 [] v1060(VarCurr)| -v1061(VarCurr)| -v1064(VarCurr).
% 94.85/94.22  0 [] v1064(VarCurr)|v1055(VarCurr,bitIndex2).
% 94.85/94.22  0 [] -v1064(VarCurr)| -v1055(VarCurr,bitIndex2).
% 94.85/94.22  0 [] -v1061(VarCurr)|v1055(VarCurr,bitIndex1)|v1062(VarCurr).
% 94.85/94.22  0 [] v1061(VarCurr)| -v1055(VarCurr,bitIndex1).
% 94.85/94.22  0 [] v1061(VarCurr)| -v1062(VarCurr).
% 94.85/94.22  0 [] -v1062(VarCurr)|v1055(VarCurr,bitIndex0).
% 94.85/94.22  0 [] -v1062(VarCurr)|v1063(VarCurr).
% 94.85/94.22  0 [] v1062(VarCurr)| -v1055(VarCurr,bitIndex0)| -v1063(VarCurr).
% 94.85/94.22  0 [] v1063(VarCurr)|v1055(VarCurr,bitIndex1).
% 94.85/94.22  0 [] -v1063(VarCurr)| -v1055(VarCurr,bitIndex1).
% 94.85/94.22  0 [] -v1055(VarCurr,bitIndex5).
% 94.85/94.22  0 [] -range_4_0(B)| -v1055(VarCurr,B)|v1037(VarCurr,B).
% 94.85/94.22  0 [] -range_4_0(B)|v1055(VarCurr,B)| -v1037(VarCurr,B).
% 94.85/94.22  0 [] -v1047(VarCurr)| -v1037(VarCurr,bitIndex4)|$F.
% 94.85/94.22  0 [] -v1047(VarCurr)|v1037(VarCurr,bitIndex4)| -$F.
% 94.85/94.22  0 [] -v1047(VarCurr)| -v1037(VarCurr,bitIndex3)|$F.
% 94.85/94.22  0 [] -v1047(VarCurr)|v1037(VarCurr,bitIndex3)| -$F.
% 94.85/94.22  0 [] -v1047(VarCurr)| -v1037(VarCurr,bitIndex2)|$F.
% 94.85/94.22  0 [] -v1047(VarCurr)|v1037(VarCurr,bitIndex2)| -$F.
% 94.85/94.22  0 [] -v1047(VarCurr)| -v1037(VarCurr,bitIndex1)|$F.
% 94.85/94.22  0 [] -v1047(VarCurr)|v1037(VarCurr,bitIndex1)| -$F.
% 94.85/94.22  0 [] -v1047(VarCurr)| -v1037(VarCurr,bitIndex0)|$F.
% 94.85/94.22  0 [] -v1047(VarCurr)|v1037(VarCurr,bitIndex0)| -$F.
% 94.85/94.22  0 [] v1047(VarCurr)|v1037(VarCurr,bitIndex4)|$F|v1037(VarCurr,bitIndex3)|v1037(VarCurr,bitIndex2)|v1037(VarCurr,bitIndex1)|v1037(VarCurr,bitIndex0).
% 94.85/94.22  0 [] v1047(VarCurr)| -v1037(VarCurr,bitIndex4)| -$F| -v1037(VarCurr,bitIndex3)| -v1037(VarCurr,bitIndex2)| -v1037(VarCurr,bitIndex1)| -v1037(VarCurr,bitIndex0).
% 94.85/94.22  0 [] -v1044(VarCurr)| -v1045(VarCurr,bitIndex1)|$F.
% 94.85/94.22  0 [] -v1044(VarCurr)|v1045(VarCurr,bitIndex1)| -$F.
% 94.85/94.22  0 [] -v1044(VarCurr)| -v1045(VarCurr,bitIndex0)|$T.
% 94.85/94.22  0 [] -v1044(VarCurr)|v1045(VarCurr,bitIndex0)| -$T.
% 94.85/94.22  0 [] v1044(VarCurr)|v1045(VarCurr,bitIndex1)|$F|v1045(VarCurr,bitIndex0)|$T.
% 94.85/94.22  0 [] v1044(VarCurr)|v1045(VarCurr,bitIndex1)|$F| -v1045(VarCurr,bitIndex0)| -$T.
% 94.85/94.22  0 [] v1044(VarCurr)| -v1045(VarCurr,bitIndex1)| -$F|v1045(VarCurr,bitIndex0)|$T.
% 94.85/94.22  0 [] v1044(VarCurr)| -v1045(VarCurr,bitIndex1)| -$F| -v1045(VarCurr,bitIndex0)| -$T.
% 94.85/94.22  0 [] -v1045(VarCurr,bitIndex0)|v1023(VarCurr).
% 94.85/94.22  0 [] v1045(VarCurr,bitIndex0)| -v1023(VarCurr).
% 94.85/94.22  0 [] -v1045(VarCurr,bitIndex1)|v945(VarCurr).
% 94.85/94.22  0 [] v1045(VarCurr,bitIndex1)| -v945(VarCurr).
% 94.85/94.22  0 [] -v1037(constB0,bitIndex4).
% 94.85/94.22  0 [] -v1037(constB0,bitIndex3).
% 94.85/94.22  0 [] -v1037(constB0,bitIndex2).
% 94.85/94.22  0 [] -v1037(constB0,bitIndex1).
% 94.85/94.22  0 [] v1037(constB0,bitIndex0).
% 94.85/94.22  0 [] -b00001(bitIndex4).
% 94.85/94.22  0 [] -b00001(bitIndex3).
% 94.85/94.22  0 [] -b00001(bitIndex2).
% 94.85/94.22  0 [] -b00001(bitIndex1).
% 94.85/94.22  0 [] b00001(bitIndex0).
% 94.85/94.22  0 [] -v1042(VarCurr)| -v1043(VarCurr,bitIndex1)|$F.
% 94.85/94.22  0 [] -v1042(VarCurr)|v1043(VarCurr,bitIndex1)| -$F.
% 94.85/94.22  0 [] -v1042(VarCurr)| -v1043(VarCurr,bitIndex0)|$F.
% 94.85/94.22  0 [] -v1042(VarCurr)|v1043(VarCurr,bitIndex0)| -$F.
% 94.85/94.22  0 [] v1042(VarCurr)|v1043(VarCurr,bitIndex1)|$F|v1043(VarCurr,bitIndex0).
% 94.85/94.22  0 [] v1042(VarCurr)| -v1043(VarCurr,bitIndex1)| -$F| -v1043(VarCurr,bitIndex0).
% 94.85/94.22  0 [] -v1043(VarCurr,bitIndex0)|v1023(VarCurr).
% 94.85/94.22  0 [] v1043(VarCurr,bitIndex0)| -v1023(VarCurr).
% 94.85/94.22  0 [] -v1043(VarCurr,bitIndex1)|v945(VarCurr).
% 94.85/94.22  0 [] v1043(VarCurr,bitIndex1)| -v945(VarCurr).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1019_range_3_to_0_address_association(VarNext,AssociatedAddressVar)| -address(A)|A!=AssociatedAddressVar| -range_17_0(B)| -v1017(VarNext,B)|v923_array(VarNext,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1019_range_3_to_0_address_association(VarNext,AssociatedAddressVar)| -address(A)|A!=AssociatedAddressVar| -range_17_0(B)|v1017(VarNext,B)| -v923_array(VarNext,A,B).
% 94.85/94.22  0 [] -range_3_0(B)| -v1019(constB0,B)|$F.
% 94.85/94.22  0 [] -range_3_0(B)|v1019(constB0,B)| -$F.
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1009(VarNext)| -range_17_0(B)| -v923_array(VarNext,A,B)|v923_1__array(VarNext,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1009(VarNext)| -range_17_0(B)|v923_array(VarNext,A,B)| -v923_1__array(VarNext,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1009(VarNext)| -range_17_0(B)| -v923_array(VarNext,A,B)|b000000000000000000(B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1009(VarNext)| -range_17_0(B)|v923_array(VarNext,A,B)| -b000000000000000000(B).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex17).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex16).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex15).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex14).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex13).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex12).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex11).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex10).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex9).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex8).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex7).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex6).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex5).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex4).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex3).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex2).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex1).
% 94.85/94.22  0 [] -b000000000000000000(bitIndex0).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1009(VarNext)|v1010(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1009(VarNext)|v1015(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1009(VarNext)| -v1010(VarNext)| -v1015(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1015(VarNext)|v1006(VarCurr).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1015(VarNext)| -v1006(VarCurr).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1010(VarNext)|v1012(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1010(VarNext)|v925(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1010(VarNext)| -v1012(VarNext)| -v925(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1012(VarNext)|v984(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1012(VarNext)| -v984(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A=AssociatedAddressVar| -range_17_0(B)| -v923_1__array(VarNext,A,B)|v923_array(VarCurr,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A=AssociatedAddressVar| -range_17_0(B)|v923_1__array(VarNext,A,B)| -v923_array(VarCurr,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|v997(VarNext)| -range_17_0(B)| -v923_1__array(VarNext,A,B)|v923_array(VarCurr,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|v997(VarNext)| -range_17_0(B)|v923_1__array(VarNext,A,B)| -v923_array(VarCurr,A,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A!=AssociatedAddressVar| -v997(VarNext)| -range_17_0(B)| -v923_1__array(VarNext,A,B)|v930(VarNext,B).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v953_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A!=AssociatedAddressVar| -v997(VarNext)| -range_17_0(B)|v923_1__array(VarNext,A,B)| -v930(VarNext,B).
% 94.85/94.22  0 [] -range_17_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex0!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex1!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex2!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex3!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex4!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex5!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex6!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex7!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex8!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex9!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex10!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex11!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex12!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex13!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex14!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex15!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex16!=B.
% 94.85/94.22  0 [] range_17_0(B)|bitIndex17!=B.
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v997(VarNext)|v998(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v997(VarNext)|v1004(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v997(VarNext)| -v998(VarNext)| -v1004(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v1004(VarNext)|v1002(VarCurr).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v1004(VarNext)| -v1002(VarCurr).
% 94.85/94.22  0 [] -v1002(VarCurr)|v1005(VarCurr).
% 94.85/94.22  0 [] -v1002(VarCurr)|v945(VarCurr).
% 94.85/94.22  0 [] v1002(VarCurr)| -v1005(VarCurr)| -v945(VarCurr).
% 94.85/94.22  0 [] v1005(VarCurr)|v1006(VarCurr).
% 94.85/94.22  0 [] -v1005(VarCurr)| -v1006(VarCurr).
% 94.85/94.22  0 [] v1006(VarCurr)|v928(VarCurr).
% 94.85/94.22  0 [] -v1006(VarCurr)| -v928(VarCurr).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v998(VarNext)|v999(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v998(VarNext)|v925(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v998(VarNext)| -v999(VarNext)| -v925(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)|v999(VarNext)|v984(VarNext).
% 94.85/94.22  0 [] -nextState(VarCurr,VarNext)| -v999(VarNext)| -v984(VarNext).
% 94.85/94.22  0 [] -v923_array(constB0,b1111_address_term,bitIndex0).
% 94.85/94.22  0 [] -v923_array(constB0,b1111_address_term,bitIndex15).
% 94.85/94.22  0 [] -v923_array(constB0,b1111_address_term,bitIndex17).
% 94.85/94.22  0 [] -v923_array(constB0,b1110_address_term,bitIndex0).
% 94.85/94.22  0 [] -v923_array(constB0,b1110_address_term,bitIndex15).
% 94.85/94.22  0 [] -v923_array(constB0,b1110_address_term,bitIndex17).
% 94.85/94.22  0 [] -v923_array(constB0,b1101_address_term,bitIndex0).
% 94.85/94.22  0 [] -v923_array(constB0,b1101_address_term,bitIndex15).
% 94.85/94.22  0 [] -v923_array(constB0,b1101_address_term,bitIndex17).
% 94.85/94.22  0 [] -v923_array(constB0,b1100_address_term,bitIndex0).
% 94.85/94.22  0 [] -v923_array(constB0,b1100_address_term,bitIndex15).
% 94.85/94.22  0 [] -v923_array(constB0,b1100_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b1011_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b1011_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b1011_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b1010_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b1010_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b1010_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b1001_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b1001_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b1001_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b1000_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b1000_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b1000_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0111_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0111_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0111_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0110_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0110_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0110_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0101_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0101_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0101_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0100_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0100_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0100_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0011_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0011_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0011_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0010_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0010_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0010_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0001_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0001_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0001_address_term,bitIndex17).
% 94.85/94.23  0 [] -v923_array(constB0,b0000_address_term,bitIndex0).
% 94.85/94.23  0 [] -v923_array(constB0,b0000_address_term,bitIndex15).
% 94.85/94.23  0 [] -v923_array(constB0,b0000_address_term,bitIndex17).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v980(VarNext)| -range_3_0(B)| -v953(VarNext,B)|v953(VarCurr,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v980(VarNext)| -range_3_0(B)|v953(VarNext,B)| -v953(VarCurr,B).
% 94.85/94.23  0 [] -v980(VarNext)| -range_3_0(B)| -v953(VarNext,B)|v990(VarNext,B).
% 94.85/94.23  0 [] -v980(VarNext)| -range_3_0(B)|v953(VarNext,B)| -v990(VarNext,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v990(VarNext,B)|v988(VarCurr,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v990(VarNext,B)| -v988(VarCurr,B).
% 94.85/94.23  0 [] v991(VarCurr)| -range_3_0(B)| -v988(VarCurr,B)|v955(VarCurr,B).
% 94.85/94.23  0 [] v991(VarCurr)| -range_3_0(B)|v988(VarCurr,B)| -v955(VarCurr,B).
% 94.85/94.23  0 [] -v991(VarCurr)| -range_3_0(B)| -v988(VarCurr,B)|$F.
% 94.85/94.23  0 [] -v991(VarCurr)| -range_3_0(B)|v988(VarCurr,B)| -$F.
% 94.85/94.23  0 [] v991(VarCurr)|v928(VarCurr).
% 94.85/94.23  0 [] -v991(VarCurr)| -v928(VarCurr).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v980(VarNext)|v981(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v980(VarNext)| -v981(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v981(VarNext)|v982(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v981(VarNext)|v925(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v981(VarNext)| -v982(VarNext)| -v925(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v982(VarNext)|v984(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v982(VarNext)| -v984(VarNext).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v984(VarNext)|v925(VarCurr).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v984(VarNext)| -v925(VarCurr).
% 94.85/94.23  0 [] v945(VarCurr)| -range_3_0(B)| -v955(VarCurr,B)|v953(VarCurr,B).
% 94.85/94.23  0 [] v945(VarCurr)| -range_3_0(B)|v955(VarCurr,B)| -v953(VarCurr,B).
% 94.85/94.23  0 [] -v945(VarCurr)| -range_3_0(B)| -v955(VarCurr,B)|v957(VarCurr,B).
% 94.85/94.23  0 [] -v945(VarCurr)| -range_3_0(B)|v955(VarCurr,B)| -v957(VarCurr,B).
% 94.85/94.23  0 [] v958(VarCurr)| -range_3_0(B)| -v957(VarCurr,B)|v959(VarCurr,B).
% 94.85/94.23  0 [] v958(VarCurr)| -range_3_0(B)|v957(VarCurr,B)| -v959(VarCurr,B).
% 94.85/94.23  0 [] -v958(VarCurr)| -range_3_0(B)| -v957(VarCurr,B)|$F.
% 94.85/94.23  0 [] -v958(VarCurr)| -range_3_0(B)|v957(VarCurr,B)| -$F.
% 94.85/94.23  0 [] -v959(VarCurr,bitIndex0)|v975(VarCurr).
% 94.85/94.23  0 [] v959(VarCurr,bitIndex0)| -v975(VarCurr).
% 94.85/94.23  0 [] -v959(VarCurr,bitIndex1)|v973(VarCurr).
% 94.85/94.23  0 [] v959(VarCurr,bitIndex1)| -v973(VarCurr).
% 94.85/94.23  0 [] -v959(VarCurr,bitIndex2)|v968(VarCurr).
% 94.85/94.23  0 [] v959(VarCurr,bitIndex2)| -v968(VarCurr).
% 94.85/94.23  0 [] -v959(VarCurr,bitIndex3)|v961(VarCurr).
% 94.85/94.23  0 [] v959(VarCurr,bitIndex3)| -v961(VarCurr).
% 94.85/94.23  0 [] -v973(VarCurr)|v974(VarCurr).
% 94.85/94.23  0 [] -v973(VarCurr)|v977(VarCurr).
% 94.85/94.23  0 [] v973(VarCurr)| -v974(VarCurr)| -v977(VarCurr).
% 94.85/94.23  0 [] -v977(VarCurr)|v953(VarCurr,bitIndex0)|v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v977(VarCurr)| -v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] v977(VarCurr)| -v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v974(VarCurr)|v975(VarCurr)|v976(VarCurr).
% 94.85/94.23  0 [] v974(VarCurr)| -v975(VarCurr).
% 94.85/94.23  0 [] v974(VarCurr)| -v976(VarCurr).
% 94.85/94.23  0 [] v976(VarCurr)|v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v976(VarCurr)| -v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v975(VarCurr)|v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -v975(VarCurr)| -v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -v968(VarCurr)|v969(VarCurr).
% 94.85/94.23  0 [] -v968(VarCurr)|v972(VarCurr).
% 94.85/94.23  0 [] v968(VarCurr)| -v969(VarCurr)| -v972(VarCurr).
% 94.85/94.23  0 [] -v972(VarCurr)|v965(VarCurr)|v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v972(VarCurr)| -v965(VarCurr).
% 94.85/94.23  0 [] v972(VarCurr)| -v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v969(VarCurr)|v970(VarCurr)|v971(VarCurr).
% 94.85/94.23  0 [] v969(VarCurr)| -v970(VarCurr).
% 94.85/94.23  0 [] v969(VarCurr)| -v971(VarCurr).
% 94.85/94.23  0 [] v971(VarCurr)|v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v971(VarCurr)| -v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v970(VarCurr)|v965(VarCurr).
% 94.85/94.23  0 [] -v970(VarCurr)| -v965(VarCurr).
% 94.85/94.23  0 [] -v961(VarCurr)|v962(VarCurr).
% 94.85/94.23  0 [] -v961(VarCurr)|v967(VarCurr).
% 94.85/94.23  0 [] v961(VarCurr)| -v962(VarCurr)| -v967(VarCurr).
% 94.85/94.23  0 [] -v967(VarCurr)|v964(VarCurr)|v953(VarCurr,bitIndex3).
% 94.85/94.23  0 [] v967(VarCurr)| -v964(VarCurr).
% 94.85/94.23  0 [] v967(VarCurr)| -v953(VarCurr,bitIndex3).
% 94.85/94.23  0 [] -v962(VarCurr)|v963(VarCurr)|v966(VarCurr).
% 94.85/94.23  0 [] v962(VarCurr)| -v963(VarCurr).
% 94.85/94.23  0 [] v962(VarCurr)| -v966(VarCurr).
% 94.85/94.23  0 [] v966(VarCurr)|v953(VarCurr,bitIndex3).
% 94.85/94.23  0 [] -v966(VarCurr)| -v953(VarCurr,bitIndex3).
% 94.85/94.23  0 [] v963(VarCurr)|v964(VarCurr).
% 94.85/94.23  0 [] -v963(VarCurr)| -v964(VarCurr).
% 94.85/94.23  0 [] -v964(VarCurr)|v965(VarCurr).
% 94.85/94.23  0 [] -v964(VarCurr)|v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v964(VarCurr)| -v965(VarCurr)| -v953(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v965(VarCurr)|v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -v965(VarCurr)|v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v965(VarCurr)| -v953(VarCurr,bitIndex0)| -v953(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v958(VarCurr)| -v953(VarCurr,bitIndex3)|$T.
% 94.85/94.23  0 [] -v958(VarCurr)|v953(VarCurr,bitIndex3)| -$T.
% 94.85/94.23  0 [] -v958(VarCurr)| -v953(VarCurr,bitIndex2)|$T.
% 94.85/94.23  0 [] -v958(VarCurr)|v953(VarCurr,bitIndex2)| -$T.
% 94.85/94.23  0 [] -v958(VarCurr)| -v953(VarCurr,bitIndex1)|$T.
% 94.85/94.23  0 [] -v958(VarCurr)|v953(VarCurr,bitIndex1)| -$T.
% 94.85/94.23  0 [] -v958(VarCurr)| -v953(VarCurr,bitIndex0)|$T.
% 94.85/94.23  0 [] -v958(VarCurr)|v953(VarCurr,bitIndex0)| -$T.
% 94.85/94.23  0 [] v958(VarCurr)|v953(VarCurr,bitIndex3)|$T|v953(VarCurr,bitIndex2)|v953(VarCurr,bitIndex1)|v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] v958(VarCurr)| -v953(VarCurr,bitIndex3)| -$T| -v953(VarCurr,bitIndex2)| -v953(VarCurr,bitIndex1)| -v953(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -v953(constB0,bitIndex3).
% 94.85/94.23  0 [] -v953(constB0,bitIndex2).
% 94.85/94.23  0 [] -v953(constB0,bitIndex1).
% 94.85/94.23  0 [] v953(constB0,bitIndex0).
% 94.85/94.23  0 [] -v945(VarCurr)|v947(VarCurr).
% 94.85/94.23  0 [] v945(VarCurr)| -v947(VarCurr).
% 94.85/94.23  0 [] -v947(VarCurr)|v949(VarCurr).
% 94.85/94.23  0 [] v947(VarCurr)| -v949(VarCurr).
% 94.85/94.23  0 [] -v949(VarCurr)|v951(VarCurr).
% 94.85/94.23  0 [] v949(VarCurr)| -v951(VarCurr).
% 94.85/94.23  0 [] -range_15_0(B)| -v930(VarCurr,B)|v938(VarCurr,B).
% 94.85/94.23  0 [] -range_15_0(B)|v930(VarCurr,B)| -v938(VarCurr,B).
% 94.85/94.23  0 [] -v930(VarCurr,bitIndex17)|v932(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v930(VarCurr,bitIndex17)| -v932(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v930(VarCurr,bitIndex16)|v932(VarCurr,bitIndex0).
% 94.85/94.23  0 [] v930(VarCurr,bitIndex16)| -v932(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -range_15_0(B)| -v938(VarCurr,B)|v940(VarCurr,B).
% 94.85/94.23  0 [] -range_15_0(B)|v938(VarCurr,B)| -v940(VarCurr,B).
% 94.85/94.23  0 [] -range_15_0(B)| -v940(VarCurr,B)|v942(VarCurr,B).
% 94.85/94.23  0 [] -range_15_0(B)|v940(VarCurr,B)| -v942(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)| -v932(VarCurr,B)|v934(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v932(VarCurr,B)| -v934(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)| -v934(VarCurr,B)|v936(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v934(VarCurr,B)| -v936(VarCurr,B).
% 94.85/94.23  0 [] -v928(VarCurr)|v12(VarCurr).
% 94.85/94.23  0 [] v928(VarCurr)| -v12(VarCurr).
% 94.85/94.23  0 [] -v925(VarCurr)|v288(VarCurr).
% 94.85/94.23  0 [] v925(VarCurr)| -v288(VarCurr).
% 94.85/94.23  0 [] -v903(VarCurr)|v905(VarCurr).
% 94.85/94.23  0 [] v903(VarCurr)| -v905(VarCurr).
% 94.85/94.23  0 [] -v905(VarCurr)|v91(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v905(VarCurr)| -v91(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v91(VarCurr,bitIndex2)|v898(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v91(VarCurr,bitIndex2)| -v898(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v892(VarCurr,bitIndex2)|v896(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v892(VarCurr,bitIndex2)| -v896(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v894(VarCurr,bitIndex2)|v895(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v894(VarCurr,bitIndex2)| -v895(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v885(VarCurr,bitIndex2)|v889(VarCurr,bitIndex2).
% 94.85/94.23  0 [] v885(VarCurr,bitIndex2)| -v889(VarCurr,bitIndex2).
% 94.85/94.23  0 [] -v887(VarCurr,bitIndex2)|v888(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v887(VarCurr,bitIndex2)| -v888(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v881(VarCurr)|v883(VarCurr).
% 94.85/94.23  0 [] v881(VarCurr)| -v883(VarCurr).
% 94.85/94.23  0 [] -v883(VarCurr)|v91(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v883(VarCurr)| -v91(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -v91(VarCurr,bitIndex1)|v898(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v91(VarCurr,bitIndex1)| -v898(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -range_2_0(B)| -v898(VarCurr,B)|v899(VarCurr,B)|v892(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v898(VarCurr,B)| -v899(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v898(VarCurr,B)| -v892(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v899(VarCurr,B)|v900(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v899(VarCurr,B)|v885(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v899(VarCurr,B)| -v900(VarCurr,B)| -v885(VarCurr,B).
% 94.85/94.23  0 [] -v900(VarCurr,bitIndex0)|v901(VarCurr).
% 94.85/94.23  0 [] v900(VarCurr,bitIndex0)| -v901(VarCurr).
% 94.85/94.23  0 [] -v900(VarCurr,bitIndex1)|v901(VarCurr).
% 94.85/94.23  0 [] v900(VarCurr,bitIndex1)| -v901(VarCurr).
% 94.85/94.23  0 [] -v900(VarCurr,bitIndex2)|v901(VarCurr).
% 94.85/94.23  0 [] v900(VarCurr,bitIndex2)| -v901(VarCurr).
% 94.85/94.23  0 [] -v901(VarCurr)|v93(VarCurr).
% 94.85/94.23  0 [] v901(VarCurr)| -v93(VarCurr).
% 94.85/94.23  0 [] -v892(VarCurr,bitIndex1)|v896(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v892(VarCurr,bitIndex1)| -v896(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -range_2_0(B)| -v896(VarCurr,B)|v95(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v896(VarCurr,B)|v897(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v896(VarCurr,B)| -v95(VarCurr,B)| -v897(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v897(VarCurr,B)| -v894(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v897(VarCurr,B)|v894(VarCurr,B).
% 94.85/94.23  0 [] -v894(VarCurr,bitIndex1)|v895(VarCurr,bitIndex0).
% 94.85/94.23  0 [] v894(VarCurr,bitIndex1)| -v895(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -range_1_0(B)| -v895(VarCurr,B)|v894(VarCurr,B)|v95(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v895(VarCurr,B)| -v894(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v895(VarCurr,B)| -v95(VarCurr,B).
% 94.85/94.23  0 [] -v894(VarCurr,bitIndex0)|$F.
% 94.85/94.23  0 [] v894(VarCurr,bitIndex0)| -$F.
% 94.85/94.23  0 [] -v885(VarCurr,bitIndex1)|v889(VarCurr,bitIndex1).
% 94.85/94.23  0 [] v885(VarCurr,bitIndex1)| -v889(VarCurr,bitIndex1).
% 94.85/94.23  0 [] -range_2_0(B)| -v889(VarCurr,B)|v97(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v889(VarCurr,B)|v890(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v889(VarCurr,B)| -v97(VarCurr,B)| -v890(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)| -v890(VarCurr,B)| -v887(VarCurr,B).
% 94.85/94.23  0 [] -range_2_0(B)|v890(VarCurr,B)|v887(VarCurr,B).
% 94.85/94.23  0 [] -v887(VarCurr,bitIndex1)|v888(VarCurr,bitIndex0).
% 94.85/94.23  0 [] v887(VarCurr,bitIndex1)| -v888(VarCurr,bitIndex0).
% 94.85/94.23  0 [] -range_1_0(B)| -v888(VarCurr,B)|v887(VarCurr,B)|v97(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v888(VarCurr,B)| -v887(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|v888(VarCurr,B)| -v97(VarCurr,B).
% 94.85/94.23  0 [] -range_1_0(B)|bitIndex0=B|bitIndex1=B.
% 94.85/94.23  0 [] range_1_0(B)|bitIndex0!=B.
% 94.85/94.23  0 [] range_1_0(B)|bitIndex1!=B.
% 94.85/94.23  0 [] -v887(VarCurr,bitIndex0)|$F.
% 94.85/94.23  0 [] v887(VarCurr,bitIndex0)| -$F.
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v869_range_3_to_0_address_association(VarNext,AssociatedAddressVar)| -address(A)|A!=AssociatedAddressVar| -range_66_0(B)| -v867(VarNext,B)|v749_array(VarNext,A,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v869_range_3_to_0_address_association(VarNext,AssociatedAddressVar)| -address(A)|A!=AssociatedAddressVar| -range_66_0(B)|v867(VarNext,B)| -v749_array(VarNext,A,B).
% 94.85/94.23  0 [] -range_3_0(B)| -v869(constB0,B)|$F.
% 94.85/94.23  0 [] -range_3_0(B)|v869(constB0,B)| -$F.
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v859(VarNext)| -range_66_0(B)| -v749_array(VarNext,A,B)|v749_1__array(VarNext,A,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)|v859(VarNext)| -range_66_0(B)|v749_array(VarNext,A,B)| -v749_1__array(VarNext,A,B).
% 94.85/94.23  0 [] -nextState(VarCurr,VarNext)| -v859(VarNext)| -range_66_0(B)| -v749_array(VarNext,A,B)|b0000000000000000000000000000000000000000000000000000000000000000000(B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v859(VarNext)| -range_66_0(B)|v749_array(VarNext,A,B)| -b0000000000000000000000000000000000000000000000000000000000000000000(B).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex66).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex65).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex64).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex63).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex62).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex61).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex60).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex59).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex58).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex57).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex56).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex55).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex54).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex53).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex52).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex51).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex50).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex49).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex48).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex47).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex46).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex45).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex44).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex43).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex42).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex41).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex40).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex39).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex38).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex37).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex36).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex35).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex34).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex33).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex32).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex31).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex30).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex29).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex28).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex27).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex26).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex25).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex24).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex23).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex22).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex21).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex20).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex19).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex18).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex17).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex16).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex15).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex14).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex13).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex12).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex11).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex10).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex9).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex8).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex7).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex6).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex5).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex4).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex3).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex2).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex1).
% 94.85/94.24  0 [] -b0000000000000000000000000000000000000000000000000000000000000000000(bitIndex0).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v859(VarNext)|v860(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v859(VarNext)|v865(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v859(VarNext)| -v860(VarNext)| -v865(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v865(VarNext)|v856(VarCurr).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v865(VarNext)| -v856(VarCurr).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v860(VarNext)|v862(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v860(VarNext)|v751(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v860(VarNext)| -v862(VarNext)| -v751(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v862(VarNext)|v823(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v862(VarNext)| -v823(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A=AssociatedAddressVar| -range_66_0(B)| -v749_1__array(VarNext,A,B)|v749_array(VarCurr,A,B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A=AssociatedAddressVar| -range_66_0(B)|v749_1__array(VarNext,A,B)| -v749_array(VarCurr,A,B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|v847(VarNext)| -range_66_0(B)| -v749_1__array(VarNext,A,B)|v749_array(VarCurr,A,B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|v847(VarNext)| -range_66_0(B)|v749_1__array(VarNext,A,B)| -v749_array(VarCurr,A,B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A!=AssociatedAddressVar| -v847(VarNext)| -range_66_0(B)| -v749_1__array(VarNext,A,B)|v756(VarNext,B).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v791_range_3_to_0_address_association(VarNext,AssociatedAddressVar)|A!=AssociatedAddressVar| -v847(VarNext)| -range_66_0(B)|v749_1__array(VarNext,A,B)| -v756(VarNext,B).
% 94.85/94.24  0 [] -range_66_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B|bitIndex36=B|bitIndex37=B|bitIndex38=B|bitIndex39=B|bitIndex40=B|bitIndex41=B|bitIndex42=B|bitIndex43=B|bitIndex44=B|bitIndex45=B|bitIndex46=B|bitIndex47=B|bitIndex48=B|bitIndex49=B|bitIndex50=B|bitIndex51=B|bitIndex52=B|bitIndex53=B|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B|bitIndex61=B|bitIndex62=B|bitIndex63=B|bitIndex64=B|bitIndex65=B|bitIndex66=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex0!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex1!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex2!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex3!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex4!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex5!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex6!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex7!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex8!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex9!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex10!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex11!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex12!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex13!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex14!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex15!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex16!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex17!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex18!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex19!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex20!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex21!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex22!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex23!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex24!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex25!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex26!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex27!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex28!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex29!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex30!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex31!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex32!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex33!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex34!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex35!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex36!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex37!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex38!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex39!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex40!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex41!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex42!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex43!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex44!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex45!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex46!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex47!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex48!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex49!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex50!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex51!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex52!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex53!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex54!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex55!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex56!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex57!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex58!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex59!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex60!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex61!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex62!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex63!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex64!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex65!=B.
% 94.85/94.24  0 [] range_66_0(B)|bitIndex66!=B.
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v847(VarNext)|v848(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v847(VarNext)|v854(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v847(VarNext)| -v848(VarNext)| -v854(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v854(VarNext)|v852(VarCurr).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v854(VarNext)| -v852(VarCurr).
% 94.85/94.24  0 [] -v852(VarCurr)|v855(VarCurr).
% 94.85/94.24  0 [] -v852(VarCurr)|v783(VarCurr).
% 94.85/94.24  0 [] v852(VarCurr)| -v855(VarCurr)| -v783(VarCurr).
% 94.85/94.24  0 [] v855(VarCurr)|v856(VarCurr).
% 94.85/94.24  0 [] -v855(VarCurr)| -v856(VarCurr).
% 94.85/94.24  0 [] v856(VarCurr)|v754(VarCurr).
% 94.85/94.24  0 [] -v856(VarCurr)| -v754(VarCurr).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v848(VarNext)|v849(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v848(VarNext)|v751(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v848(VarNext)| -v849(VarNext)| -v751(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)|v849(VarNext)|v823(VarNext).
% 94.85/94.24  0 [] -nextState(VarCurr,VarNext)| -v849(VarNext)| -v823(VarNext).
% 94.85/94.24  0 [] -v749_array(constB0,b1111_address_term,bitIndex63).
% 94.85/94.24  0 [] -v749_array(constB0,b1111_address_term,bitIndex64).
% 94.85/94.24  0 [] -v749_array(constB0,b1111_address_term,bitIndex65).
% 94.85/94.24  0 [] -v749_array(constB0,b1111_address_term,bitIndex66).
% 94.91/94.25  0 [] -v749_array(constB0,b1110_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1110_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1110_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1110_address_term,bitIndex66).
% 94.91/94.25  0 [] b1110(bitIndex3).
% 94.91/94.25  0 [] b1110(bitIndex2).
% 94.91/94.25  0 [] b1110(bitIndex1).
% 94.91/94.25  0 [] -b1110(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1101_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1101_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1101_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1101_address_term,bitIndex66).
% 94.91/94.25  0 [] b1101(bitIndex3).
% 94.91/94.25  0 [] b1101(bitIndex2).
% 94.91/94.25  0 [] -b1101(bitIndex1).
% 94.91/94.25  0 [] b1101(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1100_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1100_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1100_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1100_address_term,bitIndex66).
% 94.91/94.25  0 [] b1100(bitIndex3).
% 94.91/94.25  0 [] b1100(bitIndex2).
% 94.91/94.25  0 [] -b1100(bitIndex1).
% 94.91/94.25  0 [] -b1100(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1011_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1011_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1011_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1011_address_term,bitIndex66).
% 94.91/94.25  0 [] b1011(bitIndex3).
% 94.91/94.25  0 [] -b1011(bitIndex2).
% 94.91/94.25  0 [] b1011(bitIndex1).
% 94.91/94.25  0 [] b1011(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1010_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1010_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1010_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1010_address_term,bitIndex66).
% 94.91/94.25  0 [] b1010(bitIndex3).
% 94.91/94.25  0 [] -b1010(bitIndex2).
% 94.91/94.25  0 [] b1010(bitIndex1).
% 94.91/94.25  0 [] -b1010(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1001_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1001_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1001_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1001_address_term,bitIndex66).
% 94.91/94.25  0 [] b1001(bitIndex3).
% 94.91/94.25  0 [] -b1001(bitIndex2).
% 94.91/94.25  0 [] -b1001(bitIndex1).
% 94.91/94.25  0 [] b1001(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b1000_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b1000_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b1000_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b1000_address_term,bitIndex66).
% 94.91/94.25  0 [] b1000(bitIndex3).
% 94.91/94.25  0 [] -b1000(bitIndex2).
% 94.91/94.25  0 [] -b1000(bitIndex1).
% 94.91/94.25  0 [] -b1000(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0111_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0111_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0111_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0111_address_term,bitIndex66).
% 94.91/94.25  0 [] -b0111(bitIndex3).
% 94.91/94.25  0 [] b0111(bitIndex2).
% 94.91/94.25  0 [] b0111(bitIndex1).
% 94.91/94.25  0 [] b0111(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0110_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0110_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0110_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0110_address_term,bitIndex66).
% 94.91/94.25  0 [] -v749_array(constB0,b0101_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0101_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0101_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0101_address_term,bitIndex66).
% 94.91/94.25  0 [] -b0101(bitIndex3).
% 94.91/94.25  0 [] b0101(bitIndex2).
% 94.91/94.25  0 [] -b0101(bitIndex1).
% 94.91/94.25  0 [] b0101(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0100_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0100_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0100_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0100_address_term,bitIndex66).
% 94.91/94.25  0 [] -b0100(bitIndex3).
% 94.91/94.25  0 [] b0100(bitIndex2).
% 94.91/94.25  0 [] -b0100(bitIndex1).
% 94.91/94.25  0 [] -b0100(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0011_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0011_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0011_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0011_address_term,bitIndex66).
% 94.91/94.25  0 [] -b0011(bitIndex3).
% 94.91/94.25  0 [] -b0011(bitIndex2).
% 94.91/94.25  0 [] b0011(bitIndex1).
% 94.91/94.25  0 [] b0011(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0010_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0010_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0010_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0010_address_term,bitIndex66).
% 94.91/94.25  0 [] -b0010(bitIndex3).
% 94.91/94.25  0 [] -b0010(bitIndex2).
% 94.91/94.25  0 [] b0010(bitIndex1).
% 94.91/94.25  0 [] -b0010(bitIndex0).
% 94.91/94.25  0 [] -v749_array(constB0,b0001_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0001_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0001_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0001_address_term,bitIndex66).
% 94.91/94.25  0 [] -v749_array(constB0,b0000_address_term,bitIndex63).
% 94.91/94.25  0 [] -v749_array(constB0,b0000_address_term,bitIndex64).
% 94.91/94.25  0 [] -v749_array(constB0,b0000_address_term,bitIndex65).
% 94.91/94.25  0 [] -v749_array(constB0,b0000_address_term,bitIndex66).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v819(VarNext)| -range_3_0(B)| -v791(VarNext,B)|v791(VarCurr,B).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v819(VarNext)| -range_3_0(B)|v791(VarNext,B)| -v791(VarCurr,B).
% 94.91/94.25  0 [] -v819(VarNext)| -range_3_0(B)| -v791(VarNext,B)|v829(VarNext,B).
% 94.91/94.25  0 [] -v819(VarNext)| -range_3_0(B)|v791(VarNext,B)| -v829(VarNext,B).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v829(VarNext,B)|v827(VarCurr,B).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v829(VarNext,B)| -v827(VarCurr,B).
% 94.91/94.25  0 [] v830(VarCurr)| -range_3_0(B)| -v827(VarCurr,B)|v793(VarCurr,B).
% 94.91/94.25  0 [] v830(VarCurr)| -range_3_0(B)|v827(VarCurr,B)| -v793(VarCurr,B).
% 94.91/94.25  0 [] -v830(VarCurr)| -range_3_0(B)| -v827(VarCurr,B)|$F.
% 94.91/94.25  0 [] -v830(VarCurr)| -range_3_0(B)|v827(VarCurr,B)| -$F.
% 94.91/94.25  0 [] v830(VarCurr)|v754(VarCurr).
% 94.91/94.25  0 [] -v830(VarCurr)| -v754(VarCurr).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -v819(VarNext)|v820(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v819(VarNext)| -v820(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -v820(VarNext)|v821(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -v820(VarNext)|v751(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v820(VarNext)| -v821(VarNext)| -v751(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v821(VarNext)|v823(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -v821(VarNext)| -v823(VarNext).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)| -v823(VarNext)|v751(VarCurr).
% 94.91/94.25  0 [] -nextState(VarCurr,VarNext)|v823(VarNext)| -v751(VarCurr).
% 94.91/94.25  0 [] v783(VarCurr)| -range_3_0(B)| -v793(VarCurr,B)|v791(VarCurr,B).
% 94.91/94.25  0 [] v783(VarCurr)| -range_3_0(B)|v793(VarCurr,B)| -v791(VarCurr,B).
% 94.91/94.25  0 [] -v783(VarCurr)| -range_3_0(B)| -v793(VarCurr,B)|v796(VarCurr,B).
% 94.91/94.25  0 [] -v783(VarCurr)| -range_3_0(B)|v793(VarCurr,B)| -v796(VarCurr,B).
% 94.91/94.25  0 [] v797(VarCurr)| -range_3_0(B)| -v796(VarCurr,B)|v798(VarCurr,B).
% 94.91/94.25  0 [] v797(VarCurr)| -range_3_0(B)|v796(VarCurr,B)| -v798(VarCurr,B).
% 94.91/94.25  0 [] -v797(VarCurr)| -range_3_0(B)| -v796(VarCurr,B)|$F.
% 94.91/94.25  0 [] -v797(VarCurr)| -range_3_0(B)|v796(VarCurr,B)| -$F.
% 94.91/94.25  0 [] -v798(VarCurr,bitIndex0)|v814(VarCurr).
% 94.91/94.25  0 [] v798(VarCurr,bitIndex0)| -v814(VarCurr).
% 94.91/94.25  0 [] -v798(VarCurr,bitIndex1)|v812(VarCurr).
% 94.91/94.25  0 [] v798(VarCurr,bitIndex1)| -v812(VarCurr).
% 94.91/94.25  0 [] -v798(VarCurr,bitIndex2)|v807(VarCurr).
% 94.91/94.25  0 [] v798(VarCurr,bitIndex2)| -v807(VarCurr).
% 94.91/94.25  0 [] -v798(VarCurr,bitIndex3)|v800(VarCurr).
% 94.91/94.25  0 [] v798(VarCurr,bitIndex3)| -v800(VarCurr).
% 94.91/94.25  0 [] -v812(VarCurr)|v813(VarCurr).
% 94.91/94.25  0 [] -v812(VarCurr)|v816(VarCurr).
% 94.91/94.25  0 [] v812(VarCurr)| -v813(VarCurr)| -v816(VarCurr).
% 94.91/94.25  0 [] -v816(VarCurr)|v791(VarCurr,bitIndex0)|v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] v816(VarCurr)| -v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] v816(VarCurr)| -v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] -v813(VarCurr)|v814(VarCurr)|v815(VarCurr).
% 94.91/94.25  0 [] v813(VarCurr)| -v814(VarCurr).
% 94.91/94.25  0 [] v813(VarCurr)| -v815(VarCurr).
% 94.91/94.25  0 [] v815(VarCurr)|v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] -v815(VarCurr)| -v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] v814(VarCurr)|v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] -v814(VarCurr)| -v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] -v807(VarCurr)|v808(VarCurr).
% 94.91/94.25  0 [] -v807(VarCurr)|v811(VarCurr).
% 94.91/94.25  0 [] v807(VarCurr)| -v808(VarCurr)| -v811(VarCurr).
% 94.91/94.25  0 [] -v811(VarCurr)|v804(VarCurr)|v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] v811(VarCurr)| -v804(VarCurr).
% 94.91/94.25  0 [] v811(VarCurr)| -v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] -v808(VarCurr)|v809(VarCurr)|v810(VarCurr).
% 94.91/94.25  0 [] v808(VarCurr)| -v809(VarCurr).
% 94.91/94.25  0 [] v808(VarCurr)| -v810(VarCurr).
% 94.91/94.25  0 [] v810(VarCurr)|v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] -v810(VarCurr)| -v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] v809(VarCurr)|v804(VarCurr).
% 94.91/94.25  0 [] -v809(VarCurr)| -v804(VarCurr).
% 94.91/94.25  0 [] -v800(VarCurr)|v801(VarCurr).
% 94.91/94.25  0 [] -v800(VarCurr)|v806(VarCurr).
% 94.91/94.25  0 [] v800(VarCurr)| -v801(VarCurr)| -v806(VarCurr).
% 94.91/94.25  0 [] -v806(VarCurr)|v803(VarCurr)|v791(VarCurr,bitIndex3).
% 94.91/94.25  0 [] v806(VarCurr)| -v803(VarCurr).
% 94.91/94.25  0 [] v806(VarCurr)| -v791(VarCurr,bitIndex3).
% 94.91/94.25  0 [] -v801(VarCurr)|v802(VarCurr)|v805(VarCurr).
% 94.91/94.25  0 [] v801(VarCurr)| -v802(VarCurr).
% 94.91/94.25  0 [] v801(VarCurr)| -v805(VarCurr).
% 94.91/94.25  0 [] v805(VarCurr)|v791(VarCurr,bitIndex3).
% 94.91/94.25  0 [] -v805(VarCurr)| -v791(VarCurr,bitIndex3).
% 94.91/94.25  0 [] v802(VarCurr)|v803(VarCurr).
% 94.91/94.25  0 [] -v802(VarCurr)| -v803(VarCurr).
% 94.91/94.25  0 [] -v803(VarCurr)|v804(VarCurr).
% 94.91/94.25  0 [] -v803(VarCurr)|v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] v803(VarCurr)| -v804(VarCurr)| -v791(VarCurr,bitIndex2).
% 94.91/94.25  0 [] -v804(VarCurr)|v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] -v804(VarCurr)|v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] v804(VarCurr)| -v791(VarCurr,bitIndex0)| -v791(VarCurr,bitIndex1).
% 94.91/94.25  0 [] -v797(VarCurr)| -v791(VarCurr,bitIndex3)|$T.
% 94.91/94.25  0 [] -v797(VarCurr)|v791(VarCurr,bitIndex3)| -$T.
% 94.91/94.25  0 [] -v797(VarCurr)| -v791(VarCurr,bitIndex2)|$T.
% 94.91/94.25  0 [] -v797(VarCurr)|v791(VarCurr,bitIndex2)| -$T.
% 94.91/94.25  0 [] -v797(VarCurr)| -v791(VarCurr,bitIndex1)|$T.
% 94.91/94.25  0 [] -v797(VarCurr)|v791(VarCurr,bitIndex1)| -$T.
% 94.91/94.25  0 [] -v797(VarCurr)| -v791(VarCurr,bitIndex0)|$T.
% 94.91/94.25  0 [] -v797(VarCurr)|v791(VarCurr,bitIndex0)| -$T.
% 94.91/94.25  0 [] v797(VarCurr)|v791(VarCurr,bitIndex3)|$T|v791(VarCurr,bitIndex2)|v791(VarCurr,bitIndex1)|v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] v797(VarCurr)| -v791(VarCurr,bitIndex3)| -$T| -v791(VarCurr,bitIndex2)| -v791(VarCurr,bitIndex1)| -v791(VarCurr,bitIndex0).
% 94.91/94.25  0 [] b1111(bitIndex3).
% 94.91/94.25  0 [] b1111(bitIndex2).
% 94.91/94.25  0 [] b1111(bitIndex1).
% 94.91/94.25  0 [] b1111(bitIndex0).
% 94.91/94.25  0 [] -v791(constB0,bitIndex3).
% 94.91/94.25  0 [] -v791(constB0,bitIndex2).
% 94.91/94.25  0 [] -v791(constB0,bitIndex1).
% 94.91/94.25  0 [] v791(constB0,bitIndex0).
% 94.91/94.25  0 [] -b0001(bitIndex3).
% 94.91/94.25  0 [] -b0001(bitIndex2).
% 94.91/94.25  0 [] -b0001(bitIndex1).
% 94.91/94.25  0 [] b0001(bitIndex0).
% 94.91/94.25  0 [] -v783(VarCurr)|v785(VarCurr).
% 94.91/94.25  0 [] v783(VarCurr)| -v785(VarCurr).
% 94.91/94.25  0 [] -v785(VarCurr)|v787(VarCurr).
% 94.91/94.25  0 [] v785(VarCurr)| -v787(VarCurr).
% 94.91/94.25  0 [] -v787(VarCurr)|v789(VarCurr).
% 94.91/94.25  0 [] v787(VarCurr)| -v789(VarCurr).
% 94.91/94.25  0 [] -range_10_0(B)| -v756(VarCurr,B)|v776(VarCurr,B).
% 94.91/94.25  0 [] -range_10_0(B)|v756(VarCurr,B)| -v776(VarCurr,B).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex26)|v770(VarCurr,bitIndex15).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex26)| -v770(VarCurr,bitIndex15).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex25)|v770(VarCurr,bitIndex14).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex25)| -v770(VarCurr,bitIndex14).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex24)|v770(VarCurr,bitIndex13).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex24)| -v770(VarCurr,bitIndex13).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex23)|v770(VarCurr,bitIndex12).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex23)| -v770(VarCurr,bitIndex12).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex22)|v770(VarCurr,bitIndex11).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex22)| -v770(VarCurr,bitIndex11).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex21)|v770(VarCurr,bitIndex10).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex21)| -v770(VarCurr,bitIndex10).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex20)|v770(VarCurr,bitIndex9).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex20)| -v770(VarCurr,bitIndex9).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex19)|v770(VarCurr,bitIndex8).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex19)| -v770(VarCurr,bitIndex8).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex18)|v770(VarCurr,bitIndex7).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex18)| -v770(VarCurr,bitIndex7).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex17)|v770(VarCurr,bitIndex6).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex17)| -v770(VarCurr,bitIndex6).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex16)|v770(VarCurr,bitIndex5).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex16)| -v770(VarCurr,bitIndex5).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex15)|v770(VarCurr,bitIndex4).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex15)| -v770(VarCurr,bitIndex4).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex14)|v770(VarCurr,bitIndex3).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex14)| -v770(VarCurr,bitIndex3).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex13)|v770(VarCurr,bitIndex2).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex13)| -v770(VarCurr,bitIndex2).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex12)|v770(VarCurr,bitIndex1).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex12)| -v770(VarCurr,bitIndex1).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex11)|v770(VarCurr,bitIndex0).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex11)| -v770(VarCurr,bitIndex0).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex62)|v764(VarCurr,bitIndex35).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex62)| -v764(VarCurr,bitIndex35).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex61)|v764(VarCurr,bitIndex34).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex61)| -v764(VarCurr,bitIndex34).
% 94.91/94.25  0 [] -v756(VarCurr,bitIndex60)|v764(VarCurr,bitIndex33).
% 94.91/94.25  0 [] v756(VarCurr,bitIndex60)| -v764(VarCurr,bitIndex33).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex59)|v764(VarCurr,bitIndex32).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex59)| -v764(VarCurr,bitIndex32).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex58)|v764(VarCurr,bitIndex31).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex58)| -v764(VarCurr,bitIndex31).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex57)|v764(VarCurr,bitIndex30).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex57)| -v764(VarCurr,bitIndex30).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex56)|v764(VarCurr,bitIndex29).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex56)| -v764(VarCurr,bitIndex29).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex55)|v764(VarCurr,bitIndex28).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex55)| -v764(VarCurr,bitIndex28).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex54)|v764(VarCurr,bitIndex27).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex54)| -v764(VarCurr,bitIndex27).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex53)|v764(VarCurr,bitIndex26).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex53)| -v764(VarCurr,bitIndex26).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex52)|v764(VarCurr,bitIndex25).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex52)| -v764(VarCurr,bitIndex25).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex51)|v764(VarCurr,bitIndex24).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex51)| -v764(VarCurr,bitIndex24).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex50)|v764(VarCurr,bitIndex23).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex50)| -v764(VarCurr,bitIndex23).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex49)|v764(VarCurr,bitIndex22).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex49)| -v764(VarCurr,bitIndex22).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex48)|v764(VarCurr,bitIndex21).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex48)| -v764(VarCurr,bitIndex21).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex47)|v764(VarCurr,bitIndex20).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex47)| -v764(VarCurr,bitIndex20).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex46)|v764(VarCurr,bitIndex19).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex46)| -v764(VarCurr,bitIndex19).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex45)|v764(VarCurr,bitIndex18).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex45)| -v764(VarCurr,bitIndex18).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex44)|v764(VarCurr,bitIndex17).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex44)| -v764(VarCurr,bitIndex17).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex43)|v764(VarCurr,bitIndex16).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex43)| -v764(VarCurr,bitIndex16).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex42)|v764(VarCurr,bitIndex15).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex42)| -v764(VarCurr,bitIndex15).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex41)|v764(VarCurr,bitIndex14).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex41)| -v764(VarCurr,bitIndex14).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex40)|v764(VarCurr,bitIndex13).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex40)| -v764(VarCurr,bitIndex13).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex39)|v764(VarCurr,bitIndex12).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex39)| -v764(VarCurr,bitIndex12).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex38)|v764(VarCurr,bitIndex11).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex38)| -v764(VarCurr,bitIndex11).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex37)|v764(VarCurr,bitIndex10).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex37)| -v764(VarCurr,bitIndex10).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex36)|v764(VarCurr,bitIndex9).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex36)| -v764(VarCurr,bitIndex9).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex35)|v764(VarCurr,bitIndex8).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex35)| -v764(VarCurr,bitIndex8).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex34)|v764(VarCurr,bitIndex7).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex34)| -v764(VarCurr,bitIndex7).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex33)|v764(VarCurr,bitIndex6).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex33)| -v764(VarCurr,bitIndex6).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex32)|v764(VarCurr,bitIndex5).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex32)| -v764(VarCurr,bitIndex5).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex31)|v764(VarCurr,bitIndex4).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex31)| -v764(VarCurr,bitIndex4).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex30)|v764(VarCurr,bitIndex3).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex30)| -v764(VarCurr,bitIndex3).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex29)|v764(VarCurr,bitIndex2).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex29)| -v764(VarCurr,bitIndex2).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex28)|v764(VarCurr,bitIndex1).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex28)| -v764(VarCurr,bitIndex1).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex27)|v764(VarCurr,bitIndex0).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex27)| -v764(VarCurr,bitIndex0).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex66)|v758(VarCurr,bitIndex3).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex66)| -v758(VarCurr,bitIndex3).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex65)|v758(VarCurr,bitIndex2).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex65)| -v758(VarCurr,bitIndex2).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex64)|v758(VarCurr,bitIndex1).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex64)| -v758(VarCurr,bitIndex1).
% 94.91/94.26  0 [] -v756(VarCurr,bitIndex63)|v758(VarCurr,bitIndex0).
% 94.91/94.26  0 [] v756(VarCurr,bitIndex63)| -v758(VarCurr,bitIndex0).
% 94.91/94.26  0 [] -range_10_0(B)| -v776(VarCurr,B)|v778(VarCurr,B).
% 94.91/94.26  0 [] -range_10_0(B)|v776(VarCurr,B)| -v778(VarCurr,B).
% 94.91/94.26  0 [] -range_10_0(B)| -v778(VarCurr,B)|v780(VarCurr,B).
% 94.91/94.26  0 [] -range_10_0(B)|v778(VarCurr,B)| -v780(VarCurr,B).
% 94.91/94.26  0 [] -range_10_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex0!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex1!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex2!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex3!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex4!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex5!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex6!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex7!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex8!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex9!=B.
% 94.91/94.26  0 [] range_10_0(B)|bitIndex10!=B.
% 94.91/94.26  0 [] -range_15_0(B)| -v770(VarCurr,B)|v772(VarCurr,B).
% 94.91/94.26  0 [] -range_15_0(B)|v770(VarCurr,B)| -v772(VarCurr,B).
% 94.91/94.26  0 [] -range_15_0(B)| -v772(VarCurr,B)|v774(VarCurr,B).
% 94.91/94.26  0 [] -range_15_0(B)|v772(VarCurr,B)| -v774(VarCurr,B).
% 94.91/94.26  0 [] -range_15_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex0!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex1!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex2!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex3!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex4!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex5!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex6!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex7!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex8!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex9!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex10!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex11!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex12!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex13!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex14!=B.
% 94.91/94.26  0 [] range_15_0(B)|bitIndex15!=B.
% 94.91/94.26  0 [] -range_35_0(B)| -v764(VarCurr,B)|v766(VarCurr,B).
% 94.91/94.26  0 [] -range_35_0(B)|v764(VarCurr,B)| -v766(VarCurr,B).
% 94.91/94.26  0 [] -range_35_0(B)| -v766(VarCurr,B)|v768(VarCurr,B).
% 94.91/94.26  0 [] -range_35_0(B)|v766(VarCurr,B)| -v768(VarCurr,B).
% 94.91/94.26  0 [] -range_35_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B|bitIndex32=B|bitIndex33=B|bitIndex34=B|bitIndex35=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex0!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex1!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex2!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex3!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex4!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex5!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex6!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex7!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex8!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex9!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex10!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex11!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex12!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex13!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex14!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex15!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex16!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex17!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex18!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex19!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex20!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex21!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex22!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex23!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex24!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex25!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex26!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex27!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex28!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex29!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex30!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex31!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex32!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex33!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex34!=B.
% 94.91/94.26  0 [] range_35_0(B)|bitIndex35!=B.
% 94.91/94.26  0 [] -range_3_0(B)| -v758(VarCurr,B)|v760(VarCurr,B).
% 94.91/94.26  0 [] -range_3_0(B)|v758(VarCurr,B)| -v760(VarCurr,B).
% 94.91/94.26  0 [] -range_3_0(B)| -v760(VarCurr,B)|v762(VarCurr,B).
% 94.91/94.26  0 [] -range_3_0(B)|v760(VarCurr,B)| -v762(VarCurr,B).
% 94.91/94.26  0 [] -v754(VarCurr)|v12(VarCurr).
% 94.91/94.26  0 [] v754(VarCurr)| -v12(VarCurr).
% 94.91/94.26  0 [] -v751(VarCurr)|v288(VarCurr).
% 94.91/94.27  0 [] v751(VarCurr)| -v288(VarCurr).
% 94.91/94.27  0 [] -v664(VarCurr)|v666(VarCurr).
% 94.91/94.27  0 [] v664(VarCurr)| -v666(VarCurr).
% 94.91/94.27  0 [] -v666(VarCurr)|v668(VarCurr).
% 94.91/94.27  0 [] v666(VarCurr)| -v668(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v724(VarNext)| -v668(VarNext)|v668(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v724(VarNext)|v668(VarNext)| -v668(VarCurr).
% 94.91/94.27  0 [] -v724(VarNext)| -v668(VarNext)|v734(VarNext).
% 94.91/94.27  0 [] -v724(VarNext)|v668(VarNext)| -v734(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v734(VarNext)|v732(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v734(VarNext)| -v732(VarCurr).
% 94.91/94.27  0 [] v735(VarCurr)| -v732(VarCurr)|x697(VarCurr).
% 94.91/94.27  0 [] v735(VarCurr)|v732(VarCurr)| -x697(VarCurr).
% 94.91/94.27  0 [] -v735(VarCurr)| -v732(VarCurr)|v678(VarCurr).
% 94.91/94.27  0 [] -v735(VarCurr)|v732(VarCurr)| -v678(VarCurr).
% 94.91/94.27  0 [] -v735(VarCurr)|v736(VarCurr).
% 94.91/94.27  0 [] -v735(VarCurr)|v737(VarCurr).
% 94.91/94.27  0 [] v735(VarCurr)| -v736(VarCurr)| -v737(VarCurr).
% 94.91/94.27  0 [] v737(VarCurr)|v674(VarCurr).
% 94.91/94.27  0 [] -v737(VarCurr)| -v674(VarCurr).
% 94.91/94.27  0 [] v736(VarCurr)|v670(VarCurr).
% 94.91/94.27  0 [] -v736(VarCurr)| -v670(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v724(VarNext)|v725(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v724(VarNext)| -v725(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v725(VarNext)|v726(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v725(VarNext)|v721(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v725(VarNext)| -v726(VarNext)| -v721(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v726(VarNext)|v728(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v726(VarNext)| -v728(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v728(VarNext)|v721(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v728(VarNext)| -v721(VarCurr).
% 94.91/94.27  0 [] -v721(VarCurr)|v701(VarCurr).
% 94.91/94.27  0 [] v721(VarCurr)| -v701(VarCurr).
% 94.91/94.27  0 [] -v678(VarCurr)|v680(VarCurr).
% 94.91/94.27  0 [] v678(VarCurr)| -v680(VarCurr).
% 94.91/94.27  0 [] -v680(VarCurr)|v682(VarCurr).
% 94.91/94.27  0 [] v680(VarCurr)| -v682(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v704(VarNext)| -v682(VarNext)|v682(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v704(VarNext)|v682(VarNext)| -v682(VarCurr).
% 94.91/94.27  0 [] -v704(VarNext)| -v682(VarNext)|v714(VarNext).
% 94.91/94.27  0 [] -v704(VarNext)|v682(VarNext)| -v714(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v714(VarNext)|v712(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v714(VarNext)| -v712(VarCurr).
% 94.91/94.27  0 [] v715(VarCurr)| -v712(VarCurr)|x697(VarCurr).
% 94.91/94.27  0 [] v715(VarCurr)|v712(VarCurr)| -x697(VarCurr).
% 94.91/94.27  0 [] -v715(VarCurr)| -v712(VarCurr)|v688(VarCurr).
% 94.91/94.27  0 [] -v715(VarCurr)|v712(VarCurr)| -v688(VarCurr).
% 94.91/94.27  0 [] -v715(VarCurr)|v716(VarCurr).
% 94.91/94.27  0 [] -v715(VarCurr)|v717(VarCurr).
% 94.91/94.27  0 [] v715(VarCurr)| -v716(VarCurr)| -v717(VarCurr).
% 94.91/94.27  0 [] v717(VarCurr)|v686(VarCurr).
% 94.91/94.27  0 [] -v717(VarCurr)| -v686(VarCurr).
% 94.91/94.27  0 [] v716(VarCurr)|v684(VarCurr).
% 94.91/94.27  0 [] -v716(VarCurr)| -v684(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v704(VarNext)|v705(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v704(VarNext)| -v705(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v705(VarNext)|v706(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v705(VarNext)|v699(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v705(VarNext)| -v706(VarNext)| -v699(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v706(VarNext)|v708(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v706(VarNext)| -v708(VarNext).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)| -v708(VarNext)|v699(VarCurr).
% 94.91/94.27  0 [] -nextState(VarCurr,VarNext)|v708(VarNext)| -v699(VarCurr).
% 94.91/94.27  0 [] -v682(constB0)|$F.
% 94.91/94.27  0 [] v682(constB0)| -$F.
% 94.91/94.27  0 [] -v699(VarCurr)|v701(VarCurr).
% 94.91/94.27  0 [] v699(VarCurr)| -v701(VarCurr).
% 94.91/94.27  0 [] -v701(VarCurr)|v288(VarCurr).
% 94.91/94.27  0 [] v701(VarCurr)| -v288(VarCurr).
% 94.91/94.27  0 [] -v688(VarCurr)|v690(VarCurr).
% 94.91/94.27  0 [] v688(VarCurr)| -v690(VarCurr).
% 94.91/94.27  0 [] -v690(VarCurr)|v692(VarCurr).
% 94.91/94.27  0 [] v690(VarCurr)| -v692(VarCurr).
% 94.91/94.27  0 [] -v692(VarCurr)|v694(VarCurr).
% 94.91/94.27  0 [] v692(VarCurr)| -v694(VarCurr).
% 94.91/94.27  0 [] -v694(VarCurr)|v696(VarCurr).
% 94.91/94.27  0 [] v694(VarCurr)| -v696(VarCurr).
% 94.91/94.27  0 [] -v686(VarCurr)|v676(VarCurr).
% 94.91/94.27  0 [] v686(VarCurr)| -v676(VarCurr).
% 94.91/94.27  0 [] -v684(VarCurr)|v672(VarCurr).
% 94.91/94.27  0 [] v684(VarCurr)| -v672(VarCurr).
% 94.91/94.27  0 [] -v674(VarCurr)|v676(VarCurr).
% 94.91/94.27  0 [] v674(VarCurr)| -v676(VarCurr).
% 94.91/94.27  0 [] -v676(VarCurr)|$F.
% 94.91/94.27  0 [] v676(VarCurr)| -$F.
% 94.91/94.27  0 [] -v670(VarCurr)|v672(VarCurr).
% 94.91/94.27  0 [] v670(VarCurr)| -v672(VarCurr).
% 94.93/94.27  0 [] -v672(VarCurr)|$F.
% 94.93/94.27  0 [] v672(VarCurr)| -$F.
% 94.93/94.27  0 [] -v320(VarCurr)|v322(VarCurr).
% 94.93/94.27  0 [] v320(VarCurr)| -v322(VarCurr).
% 94.93/94.27  0 [] -v322(VarCurr)|v324(VarCurr).
% 94.93/94.27  0 [] v322(VarCurr)| -v324(VarCurr).
% 94.93/94.27  0 [] -v324(VarCurr)|v326(VarCurr).
% 94.93/94.27  0 [] v324(VarCurr)| -v326(VarCurr).
% 94.93/94.27  0 [] -v326(VarCurr)|v328(VarCurr).
% 94.93/94.27  0 [] v326(VarCurr)| -v328(VarCurr).
% 94.93/94.27  0 [] -v328(VarCurr)|v330(VarCurr).
% 94.93/94.27  0 [] v328(VarCurr)| -v330(VarCurr).
% 94.93/94.27  0 [] -v330(VarCurr)|v332(VarCurr).
% 94.93/94.27  0 [] v330(VarCurr)| -v332(VarCurr).
% 94.93/94.27  0 [] v657(VarCurr)| -v332(VarCurr)|$F.
% 94.93/94.27  0 [] v657(VarCurr)|v332(VarCurr)| -$F.
% 94.93/94.27  0 [] -v657(VarCurr)| -v332(VarCurr)|v658(VarCurr).
% 94.93/94.27  0 [] -v657(VarCurr)|v332(VarCurr)| -v658(VarCurr).
% 94.93/94.27  0 [] v510(VarCurr)| -v658(VarCurr)|v661(VarCurr).
% 94.93/94.27  0 [] v510(VarCurr)|v658(VarCurr)| -v661(VarCurr).
% 94.93/94.27  0 [] -v510(VarCurr)| -v658(VarCurr)|v659(VarCurr).
% 94.93/94.27  0 [] -v510(VarCurr)|v658(VarCurr)| -v659(VarCurr).
% 94.93/94.27  0 [] v513(VarCurr)| -v661(VarCurr)|v662(VarCurr).
% 94.93/94.27  0 [] v513(VarCurr)|v661(VarCurr)| -v662(VarCurr).
% 94.93/94.27  0 [] -v513(VarCurr)| -v661(VarCurr)|$T.
% 94.93/94.27  0 [] -v513(VarCurr)|v661(VarCurr)| -$T.
% 94.93/94.27  0 [] v517(VarCurr)| -v662(VarCurr)|$F.
% 94.93/94.27  0 [] v517(VarCurr)|v662(VarCurr)| -$F.
% 94.93/94.27  0 [] -v517(VarCurr)| -v662(VarCurr)|$F.
% 94.93/94.27  0 [] -v517(VarCurr)|v662(VarCurr)| -$F.
% 94.93/94.27  0 [] v509(VarCurr)| -v659(VarCurr)|v660(VarCurr).
% 94.93/94.27  0 [] v509(VarCurr)|v659(VarCurr)| -v660(VarCurr).
% 94.93/94.27  0 [] -v509(VarCurr)| -v659(VarCurr)|$F.
% 94.93/94.27  0 [] -v509(VarCurr)|v659(VarCurr)| -$F.
% 94.93/94.27  0 [] v539(VarCurr)| -v660(VarCurr)|$F.
% 94.93/94.27  0 [] v539(VarCurr)|v660(VarCurr)| -$F.
% 94.93/94.27  0 [] -v539(VarCurr)| -v660(VarCurr)|$T.
% 94.93/94.27  0 [] -v539(VarCurr)|v660(VarCurr)| -$T.
% 94.93/94.27  0 [] -v657(VarCurr)|v510(VarCurr)|v514(VarCurr).
% 94.93/94.27  0 [] v657(VarCurr)| -v510(VarCurr).
% 94.93/94.27  0 [] v657(VarCurr)| -v514(VarCurr).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v645(VarNext)| -v334(VarNext,bitIndex0)|v334(VarCurr,bitIndex0).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v645(VarNext)|v334(VarNext,bitIndex0)| -v334(VarCurr,bitIndex0).
% 94.93/94.27  0 [] -v645(VarNext)| -v334(VarNext,bitIndex0)|v653(VarNext).
% 94.93/94.27  0 [] -v645(VarNext)|v334(VarNext,bitIndex0)| -v653(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v653(VarNext)|v651(VarCurr).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v653(VarNext)| -v651(VarCurr).
% 94.93/94.27  0 [] v531(VarCurr)| -v651(VarCurr)|v342(VarCurr,bitIndex0).
% 94.93/94.27  0 [] v531(VarCurr)|v651(VarCurr)| -v342(VarCurr,bitIndex0).
% 94.93/94.27  0 [] -v531(VarCurr)| -v651(VarCurr)|$T.
% 94.93/94.27  0 [] -v531(VarCurr)|v651(VarCurr)| -$T.
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v645(VarNext)|v646(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v645(VarNext)| -v646(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v646(VarNext)|v648(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v646(VarNext)|v484(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v646(VarNext)| -v648(VarNext)| -v484(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v648(VarNext)|v524(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v648(VarNext)| -v524(VarNext).
% 94.93/94.27  0 [] v637(VarCurr)| -v342(VarCurr,bitIndex0)|$F.
% 94.93/94.27  0 [] v637(VarCurr)|v342(VarCurr,bitIndex0)| -$F.
% 94.93/94.27  0 [] -v637(VarCurr)| -v342(VarCurr,bitIndex0)|v641(VarCurr).
% 94.93/94.27  0 [] -v637(VarCurr)|v342(VarCurr,bitIndex0)| -v641(VarCurr).
% 94.93/94.27  0 [] v638(VarCurr)| -v641(VarCurr)|$T.
% 94.93/94.27  0 [] v638(VarCurr)|v641(VarCurr)| -$T.
% 94.93/94.27  0 [] -v638(VarCurr)| -v641(VarCurr)|v642(VarCurr).
% 94.93/94.27  0 [] -v638(VarCurr)|v641(VarCurr)| -v642(VarCurr).
% 94.93/94.27  0 [] v539(VarCurr)| -v642(VarCurr)|$T.
% 94.93/94.27  0 [] v539(VarCurr)|v642(VarCurr)| -$T.
% 94.93/94.27  0 [] -v539(VarCurr)| -v642(VarCurr)|$T.
% 94.93/94.27  0 [] -v539(VarCurr)|v642(VarCurr)| -$T.
% 94.93/94.27  0 [] -v637(VarCurr)|v638(VarCurr)|v640(VarCurr).
% 94.93/94.27  0 [] v637(VarCurr)| -v638(VarCurr).
% 94.93/94.27  0 [] v637(VarCurr)| -v640(VarCurr).
% 94.93/94.27  0 [] -v640(VarCurr)|v513(VarCurr).
% 94.93/94.27  0 [] -v640(VarCurr)|v514(VarCurr).
% 94.93/94.27  0 [] v640(VarCurr)| -v513(VarCurr)| -v514(VarCurr).
% 94.93/94.27  0 [] -v638(VarCurr)|v639(VarCurr).
% 94.93/94.27  0 [] -v638(VarCurr)|v510(VarCurr).
% 94.93/94.27  0 [] v638(VarCurr)| -v639(VarCurr)| -v510(VarCurr).
% 94.93/94.27  0 [] v639(VarCurr)|v509(VarCurr).
% 94.93/94.27  0 [] -v639(VarCurr)| -v509(VarCurr).
% 94.93/94.27  0 [] -v344(VarCurr)|v346(VarCurr).
% 94.93/94.27  0 [] v344(VarCurr)| -v346(VarCurr).
% 94.93/94.27  0 [] -v346(VarCurr)| -v348(VarCurr,bitIndex4)|$F.
% 94.93/94.27  0 [] -v346(VarCurr)|v348(VarCurr,bitIndex4)| -$F.
% 94.93/94.27  0 [] -v346(VarCurr)| -v348(VarCurr,bitIndex3)|$F.
% 94.93/94.27  0 [] -v346(VarCurr)|v348(VarCurr,bitIndex3)| -$F.
% 94.93/94.27  0 [] -v346(VarCurr)| -v348(VarCurr,bitIndex2)|$F.
% 94.93/94.27  0 [] -v346(VarCurr)|v348(VarCurr,bitIndex2)| -$F.
% 94.93/94.27  0 [] -v346(VarCurr)| -v348(VarCurr,bitIndex1)|$F.
% 94.93/94.27  0 [] -v346(VarCurr)|v348(VarCurr,bitIndex1)| -$F.
% 94.93/94.27  0 [] -v346(VarCurr)| -v348(VarCurr,bitIndex0)|$F.
% 94.93/94.27  0 [] -v346(VarCurr)|v348(VarCurr,bitIndex0)| -$F.
% 94.93/94.27  0 [] v346(VarCurr)|v348(VarCurr,bitIndex4)|$F|v348(VarCurr,bitIndex3)|v348(VarCurr,bitIndex2)|v348(VarCurr,bitIndex1)|v348(VarCurr,bitIndex0).
% 94.93/94.27  0 [] v346(VarCurr)| -v348(VarCurr,bitIndex4)| -$F| -v348(VarCurr,bitIndex3)| -v348(VarCurr,bitIndex2)| -v348(VarCurr,bitIndex1)| -v348(VarCurr,bitIndex0).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v621(VarNext)| -range_4_0(B)| -v348(VarNext,B)|v348(VarCurr,B).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v621(VarNext)| -range_4_0(B)|v348(VarNext,B)| -v348(VarCurr,B).
% 94.93/94.27  0 [] -v621(VarNext)| -range_4_0(B)| -v348(VarNext,B)|v631(VarNext,B).
% 94.93/94.27  0 [] -v621(VarNext)| -range_4_0(B)|v348(VarNext,B)| -v631(VarNext,B).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v631(VarNext,B)|v629(VarCurr,B).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v631(VarNext,B)| -v629(VarCurr,B).
% 94.93/94.27  0 [] v632(VarCurr)| -range_4_0(B)| -v629(VarCurr,B)|v352(VarCurr,B).
% 94.93/94.27  0 [] v632(VarCurr)| -range_4_0(B)|v629(VarCurr,B)| -v352(VarCurr,B).
% 94.93/94.27  0 [] -v632(VarCurr)| -range_4_0(B)| -v629(VarCurr,B)|$F.
% 94.93/94.27  0 [] -v632(VarCurr)| -range_4_0(B)|v629(VarCurr,B)| -$F.
% 94.93/94.27  0 [] v632(VarCurr)|v350(VarCurr).
% 94.93/94.27  0 [] -v632(VarCurr)| -v350(VarCurr).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v621(VarNext)|v622(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v621(VarNext)| -v622(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v622(VarNext)|v623(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v622(VarNext)|v618(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v622(VarNext)| -v623(VarNext)| -v618(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v623(VarNext)|v625(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v623(VarNext)| -v625(VarNext).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)| -v625(VarNext)|v618(VarCurr).
% 94.93/94.27  0 [] -nextState(VarCurr,VarNext)|v625(VarNext)| -v618(VarCurr).
% 94.93/94.27  0 [] -v618(VarCurr)|v484(VarCurr).
% 94.93/94.27  0 [] v618(VarCurr)| -v484(VarCurr).
% 94.93/94.27  0 [] v543(VarCurr)|v545(VarCurr)|v586(VarCurr)| -range_4_0(B)| -v352(VarCurr,B)|v348(VarCurr,B).
% 94.93/94.27  0 [] v543(VarCurr)|v545(VarCurr)|v586(VarCurr)| -range_4_0(B)|v352(VarCurr,B)| -v348(VarCurr,B).
% 94.93/94.27  0 [] -v586(VarCurr)| -range_4_0(B)| -v352(VarCurr,B)|v588(VarCurr,B).
% 94.93/94.27  0 [] -v586(VarCurr)| -range_4_0(B)|v352(VarCurr,B)| -v588(VarCurr,B).
% 94.93/94.27  0 [] -v545(VarCurr)| -range_4_0(B)| -v352(VarCurr,B)|v547(VarCurr,B).
% 94.93/94.27  0 [] -v545(VarCurr)| -range_4_0(B)|v352(VarCurr,B)| -v547(VarCurr,B).
% 94.93/94.27  0 [] -v543(VarCurr)| -range_4_0(B)| -v352(VarCurr,B)|v348(VarCurr,B).
% 94.93/94.27  0 [] -v543(VarCurr)| -range_4_0(B)|v352(VarCurr,B)| -v348(VarCurr,B).
% 94.93/94.27  0 [] -v615(VarCurr)| -v616(VarCurr,bitIndex1)|$T.
% 94.93/94.27  0 [] -v615(VarCurr)|v616(VarCurr,bitIndex1)| -$T.
% 94.93/94.27  0 [] -v615(VarCurr)| -v616(VarCurr,bitIndex0)|$T.
% 94.93/94.27  0 [] -v615(VarCurr)|v616(VarCurr,bitIndex0)| -$T.
% 94.93/94.27  0 [] v615(VarCurr)|v616(VarCurr,bitIndex1)|$T|v616(VarCurr,bitIndex0).
% 94.93/94.27  0 [] v615(VarCurr)| -v616(VarCurr,bitIndex1)| -$T| -v616(VarCurr,bitIndex0).
% 94.93/94.27  0 [] -v616(VarCurr,bitIndex0)|v377(VarCurr).
% 94.93/94.27  0 [] v616(VarCurr,bitIndex0)| -v377(VarCurr).
% 94.93/94.27  0 [] -v616(VarCurr,bitIndex1)|v354(VarCurr).
% 94.93/94.27  0 [] v616(VarCurr,bitIndex1)| -v354(VarCurr).
% 94.93/94.27  0 [] v589(VarCurr)| -range_4_0(B)| -v588(VarCurr,B)|v590(VarCurr,B).
% 94.93/94.27  0 [] v589(VarCurr)| -range_4_0(B)|v588(VarCurr,B)| -v590(VarCurr,B).
% 94.93/94.27  0 [] -v589(VarCurr)| -range_4_0(B)| -v588(VarCurr,B)|b01111(B).
% 94.93/94.27  0 [] -v589(VarCurr)| -range_4_0(B)|v588(VarCurr,B)| -b01111(B).
% 94.93/94.27  0 [] -v590(VarCurr,bitIndex0)|v612(VarCurr).
% 94.93/94.27  0 [] v590(VarCurr,bitIndex0)| -v612(VarCurr).
% 94.93/94.27  0 [] -v590(VarCurr,bitIndex1)|v610(VarCurr).
% 94.93/94.27  0 [] v590(VarCurr,bitIndex1)| -v610(VarCurr).
% 94.93/94.27  0 [] -v590(VarCurr,bitIndex2)|v605(VarCurr).
% 94.93/94.27  0 [] v590(VarCurr,bitIndex2)| -v605(VarCurr).
% 94.93/94.27  0 [] -v590(VarCurr,bitIndex3)|v600(VarCurr).
% 94.93/94.27  0 [] v590(VarCurr,bitIndex3)| -v600(VarCurr).
% 94.93/94.27  0 [] -v590(VarCurr,bitIndex4)|v592(VarCurr).
% 94.93/94.27  0 [] v590(VarCurr,bitIndex4)| -v592(VarCurr).
% 94.93/94.27  0 [] -v610(VarCurr)|v611(VarCurr).
% 94.93/94.27  0 [] -v610(VarCurr)|v614(VarCurr).
% 94.93/94.27  0 [] v610(VarCurr)| -v611(VarCurr)| -v614(VarCurr).
% 94.93/94.27  0 [] -v614(VarCurr)|v348(VarCurr,bitIndex0)|v348(VarCurr,bitIndex1).
% 94.93/94.27  0 [] v614(VarCurr)| -v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] v614(VarCurr)| -v348(VarCurr,bitIndex1).
% 94.93/94.28  0 [] -v611(VarCurr)|v612(VarCurr)|v613(VarCurr).
% 94.93/94.28  0 [] v611(VarCurr)| -v612(VarCurr).
% 94.93/94.28  0 [] v611(VarCurr)| -v613(VarCurr).
% 94.93/94.28  0 [] v613(VarCurr)|v348(VarCurr,bitIndex1).
% 94.93/94.28  0 [] -v613(VarCurr)| -v348(VarCurr,bitIndex1).
% 94.93/94.28  0 [] v612(VarCurr)|v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] -v612(VarCurr)| -v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] -v605(VarCurr)|v606(VarCurr).
% 94.93/94.28  0 [] -v605(VarCurr)|v609(VarCurr).
% 94.93/94.28  0 [] v605(VarCurr)| -v606(VarCurr)| -v609(VarCurr).
% 94.93/94.28  0 [] -v609(VarCurr)|v597(VarCurr)|v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] v609(VarCurr)| -v597(VarCurr).
% 94.93/94.28  0 [] v609(VarCurr)| -v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] -v606(VarCurr)|v607(VarCurr)|v608(VarCurr).
% 94.93/94.28  0 [] v606(VarCurr)| -v607(VarCurr).
% 94.93/94.28  0 [] v606(VarCurr)| -v608(VarCurr).
% 94.93/94.28  0 [] v608(VarCurr)|v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] -v608(VarCurr)| -v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] v607(VarCurr)|v597(VarCurr).
% 94.93/94.28  0 [] -v607(VarCurr)| -v597(VarCurr).
% 94.93/94.28  0 [] -v600(VarCurr)|v601(VarCurr).
% 94.93/94.28  0 [] -v600(VarCurr)|v604(VarCurr).
% 94.93/94.28  0 [] v600(VarCurr)| -v601(VarCurr)| -v604(VarCurr).
% 94.93/94.28  0 [] -v604(VarCurr)|v596(VarCurr)|v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] v604(VarCurr)| -v596(VarCurr).
% 94.93/94.28  0 [] v604(VarCurr)| -v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] -v601(VarCurr)|v602(VarCurr)|v603(VarCurr).
% 94.93/94.28  0 [] v601(VarCurr)| -v602(VarCurr).
% 94.93/94.28  0 [] v601(VarCurr)| -v603(VarCurr).
% 94.93/94.28  0 [] v603(VarCurr)|v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] -v603(VarCurr)| -v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] v602(VarCurr)|v596(VarCurr).
% 94.93/94.28  0 [] -v602(VarCurr)| -v596(VarCurr).
% 94.93/94.28  0 [] -v592(VarCurr)|v593(VarCurr).
% 94.93/94.28  0 [] -v592(VarCurr)|v599(VarCurr).
% 94.93/94.28  0 [] v592(VarCurr)| -v593(VarCurr)| -v599(VarCurr).
% 94.93/94.28  0 [] -v599(VarCurr)|v595(VarCurr)|v348(VarCurr,bitIndex4).
% 94.93/94.28  0 [] v599(VarCurr)| -v595(VarCurr).
% 94.93/94.28  0 [] v599(VarCurr)| -v348(VarCurr,bitIndex4).
% 94.93/94.28  0 [] -v593(VarCurr)|v594(VarCurr)|v598(VarCurr).
% 94.93/94.28  0 [] v593(VarCurr)| -v594(VarCurr).
% 94.93/94.28  0 [] v593(VarCurr)| -v598(VarCurr).
% 94.93/94.28  0 [] v598(VarCurr)|v348(VarCurr,bitIndex4).
% 94.93/94.28  0 [] -v598(VarCurr)| -v348(VarCurr,bitIndex4).
% 94.93/94.28  0 [] v594(VarCurr)|v595(VarCurr).
% 94.93/94.28  0 [] -v594(VarCurr)| -v595(VarCurr).
% 94.93/94.28  0 [] -v595(VarCurr)|v596(VarCurr).
% 94.93/94.28  0 [] -v595(VarCurr)|v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] v595(VarCurr)| -v596(VarCurr)| -v348(VarCurr,bitIndex3).
% 94.93/94.28  0 [] -v596(VarCurr)|v597(VarCurr).
% 94.93/94.28  0 [] -v596(VarCurr)|v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] v596(VarCurr)| -v597(VarCurr)| -v348(VarCurr,bitIndex2).
% 94.93/94.28  0 [] -v597(VarCurr)|v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] -v597(VarCurr)|v348(VarCurr,bitIndex1).
% 94.93/94.28  0 [] v597(VarCurr)| -v348(VarCurr,bitIndex0)| -v348(VarCurr,bitIndex1).
% 94.93/94.28  0 [] -v589(VarCurr)| -v348(VarCurr,bitIndex4)|$F.
% 94.93/94.28  0 [] -v589(VarCurr)|v348(VarCurr,bitIndex4)| -$F.
% 94.93/94.28  0 [] -v589(VarCurr)| -v348(VarCurr,bitIndex3)|$T.
% 94.93/94.28  0 [] -v589(VarCurr)|v348(VarCurr,bitIndex3)| -$T.
% 94.93/94.28  0 [] -v589(VarCurr)| -v348(VarCurr,bitIndex2)|$T.
% 94.93/94.28  0 [] -v589(VarCurr)|v348(VarCurr,bitIndex2)| -$T.
% 94.93/94.28  0 [] -v589(VarCurr)| -v348(VarCurr,bitIndex1)|$T.
% 94.93/94.28  0 [] -v589(VarCurr)|v348(VarCurr,bitIndex1)| -$T.
% 94.93/94.28  0 [] -v589(VarCurr)| -v348(VarCurr,bitIndex0)|$T.
% 94.93/94.28  0 [] -v589(VarCurr)|v348(VarCurr,bitIndex0)| -$T.
% 94.93/94.28  0 [] v589(VarCurr)|v348(VarCurr,bitIndex4)|$F|v348(VarCurr,bitIndex3)|$T|v348(VarCurr,bitIndex2)|v348(VarCurr,bitIndex1)|v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] v589(VarCurr)|v348(VarCurr,bitIndex4)|$F| -v348(VarCurr,bitIndex3)| -$T| -v348(VarCurr,bitIndex2)| -v348(VarCurr,bitIndex1)| -v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] v589(VarCurr)| -v348(VarCurr,bitIndex4)| -$F|v348(VarCurr,bitIndex3)|$T|v348(VarCurr,bitIndex2)|v348(VarCurr,bitIndex1)|v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] v589(VarCurr)| -v348(VarCurr,bitIndex4)| -$F| -v348(VarCurr,bitIndex3)| -$T| -v348(VarCurr,bitIndex2)| -v348(VarCurr,bitIndex1)| -v348(VarCurr,bitIndex0).
% 94.93/94.28  0 [] -v586(VarCurr)| -v587(VarCurr,bitIndex1)|$T.
% 94.93/94.28  0 [] -v586(VarCurr)|v587(VarCurr,bitIndex1)| -$T.
% 94.93/94.28  0 [] -v586(VarCurr)| -v587(VarCurr,bitIndex0)|$F.
% 94.93/94.28  0 [] -v586(VarCurr)|v587(VarCurr,bitIndex0)| -$F.
% 94.93/94.28  0 [] v586(VarCurr)|v587(VarCurr,bitIndex1)|$T|v587(VarCurr,bitIndex0)|$F.
% 94.93/94.28  0 [] v586(VarCurr)|v587(VarCurr,bitIndex1)|$T| -v587(VarCurr,bitIndex0)| -$F.
% 94.93/94.28  0 [] v586(VarCurr)| -v587(VarCurr,bitIndex1)| -$T|v587(VarCurr,bitIndex0)|$F.
% 94.93/94.28  0 [] v586(VarCurr)| -v587(VarCurr,bitIndex1)| -$T| -v587(VarCurr,bitIndex0)| -$F.
% 94.93/94.28  0 [] -v587(VarCurr,bitIndex0)|v377(VarCurr).
% 94.93/94.28  0 [] v587(VarCurr,bitIndex0)| -v377(VarCurr).
% 94.93/94.28  0 [] -v587(VarCurr,bitIndex1)|v354(VarCurr).
% 94.93/94.28  0 [] v587(VarCurr,bitIndex1)| -v354(VarCurr).
% 94.93/94.28  0 [] v548(VarCurr)| -range_31_0(B)| -v547(VarCurr,B)|v549(VarCurr,B).
% 94.93/94.28  0 [] v548(VarCurr)| -range_31_0(B)|v547(VarCurr,B)| -v549(VarCurr,B).
% 94.93/94.28  0 [] -v548(VarCurr)| -range_31_0(B)| -v547(VarCurr,B)|$F.
% 94.93/94.28  0 [] -v548(VarCurr)| -range_31_0(B)|v547(VarCurr,B)| -$F.
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex6)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex6)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex7)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex7)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex8)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex8)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex9)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex9)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex10)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex10)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex11)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex11)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex12)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex12)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex13)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex13)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex14)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex14)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex15)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex15)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex16)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex16)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex17)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex17)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex18)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex18)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex19)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex19)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex20)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex20)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex21)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex21)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex22)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex22)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex23)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex23)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex24)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex24)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex25)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex25)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex26)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex26)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex27)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex27)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex28)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex28)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex29)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex29)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex30)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex30)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -v549(VarCurr,bitIndex31)|v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] v549(VarCurr,bitIndex31)| -v550(VarCurr,bitIndex5).
% 94.93/94.28  0 [] -range_5_0(B)| -v549(VarCurr,B)|v550(VarCurr,B).
% 94.93/94.28  0 [] -range_5_0(B)|v549(VarCurr,B)| -v550(VarCurr,B).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex0)|v584(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex0)| -v584(VarCurr).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex1)|v582(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex1)| -v582(VarCurr).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex2)|v578(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex2)| -v578(VarCurr).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex3)|v574(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex3)| -v574(VarCurr).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex4)|v570(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex4)| -v570(VarCurr).
% 94.93/94.28  0 [] -v550(VarCurr,bitIndex5)|v552(VarCurr).
% 94.93/94.28  0 [] v550(VarCurr,bitIndex5)| -v552(VarCurr).
% 94.93/94.28  0 [] -v582(VarCurr)|v583(VarCurr).
% 94.93/94.28  0 [] -v582(VarCurr)|v585(VarCurr).
% 94.93/94.28  0 [] v582(VarCurr)| -v583(VarCurr)| -v585(VarCurr).
% 94.93/94.29  0 [] -v585(VarCurr)|v556(VarCurr,bitIndex0)|v564(VarCurr).
% 94.93/94.29  0 [] v585(VarCurr)| -v556(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v585(VarCurr)| -v564(VarCurr).
% 94.93/94.29  0 [] -v583(VarCurr)|v584(VarCurr)|v556(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v583(VarCurr)| -v584(VarCurr).
% 94.93/94.29  0 [] v583(VarCurr)| -v556(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v584(VarCurr)|v556(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v584(VarCurr)| -v556(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v578(VarCurr)|v579(VarCurr).
% 94.93/94.29  0 [] -v578(VarCurr)|v581(VarCurr).
% 94.93/94.29  0 [] v578(VarCurr)| -v579(VarCurr)| -v581(VarCurr).
% 94.93/94.29  0 [] -v581(VarCurr)|v562(VarCurr)|v565(VarCurr).
% 94.93/94.29  0 [] v581(VarCurr)| -v562(VarCurr).
% 94.93/94.29  0 [] v581(VarCurr)| -v565(VarCurr).
% 94.93/94.29  0 [] -v579(VarCurr)|v580(VarCurr)|v556(VarCurr,bitIndex2).
% 94.93/94.29  0 [] v579(VarCurr)| -v580(VarCurr).
% 94.93/94.29  0 [] v579(VarCurr)| -v556(VarCurr,bitIndex2).
% 94.93/94.29  0 [] v580(VarCurr)|v562(VarCurr).
% 94.93/94.29  0 [] -v580(VarCurr)| -v562(VarCurr).
% 94.93/94.29  0 [] -v574(VarCurr)|v575(VarCurr).
% 94.93/94.29  0 [] -v574(VarCurr)|v577(VarCurr).
% 94.93/94.29  0 [] v574(VarCurr)| -v575(VarCurr)| -v577(VarCurr).
% 94.93/94.29  0 [] -v577(VarCurr)|v560(VarCurr)|v566(VarCurr).
% 94.93/94.29  0 [] v577(VarCurr)| -v560(VarCurr).
% 94.93/94.29  0 [] v577(VarCurr)| -v566(VarCurr).
% 94.93/94.29  0 [] -v575(VarCurr)|v576(VarCurr)|v556(VarCurr,bitIndex3).
% 94.93/94.29  0 [] v575(VarCurr)| -v576(VarCurr).
% 94.93/94.29  0 [] v575(VarCurr)| -v556(VarCurr,bitIndex3).
% 94.93/94.29  0 [] v576(VarCurr)|v560(VarCurr).
% 94.93/94.29  0 [] -v576(VarCurr)| -v560(VarCurr).
% 94.93/94.29  0 [] -v570(VarCurr)|v571(VarCurr).
% 94.93/94.29  0 [] -v570(VarCurr)|v573(VarCurr).
% 94.93/94.29  0 [] v570(VarCurr)| -v571(VarCurr)| -v573(VarCurr).
% 94.93/94.29  0 [] -v573(VarCurr)|v558(VarCurr)|v567(VarCurr).
% 94.93/94.29  0 [] v573(VarCurr)| -v558(VarCurr).
% 94.93/94.29  0 [] v573(VarCurr)| -v567(VarCurr).
% 94.93/94.29  0 [] -v571(VarCurr)|v572(VarCurr)|v556(VarCurr,bitIndex4).
% 94.93/94.29  0 [] v571(VarCurr)| -v572(VarCurr).
% 94.93/94.29  0 [] v571(VarCurr)| -v556(VarCurr,bitIndex4).
% 94.93/94.29  0 [] v572(VarCurr)|v558(VarCurr).
% 94.93/94.29  0 [] -v572(VarCurr)| -v558(VarCurr).
% 94.93/94.29  0 [] -v552(VarCurr)|v553(VarCurr).
% 94.93/94.29  0 [] -v552(VarCurr)|v568(VarCurr).
% 94.93/94.29  0 [] v552(VarCurr)| -v553(VarCurr)| -v568(VarCurr).
% 94.93/94.29  0 [] -v568(VarCurr)|v555(VarCurr)|v569(VarCurr).
% 94.93/94.29  0 [] v568(VarCurr)| -v555(VarCurr).
% 94.93/94.29  0 [] v568(VarCurr)| -v569(VarCurr).
% 94.93/94.29  0 [] v569(VarCurr)|v556(VarCurr,bitIndex5).
% 94.93/94.29  0 [] -v569(VarCurr)| -v556(VarCurr,bitIndex5).
% 94.93/94.29  0 [] -v553(VarCurr)|v554(VarCurr)|v556(VarCurr,bitIndex5).
% 94.93/94.29  0 [] v553(VarCurr)| -v554(VarCurr).
% 94.93/94.29  0 [] v553(VarCurr)| -v556(VarCurr,bitIndex5).
% 94.93/94.29  0 [] v554(VarCurr)|v555(VarCurr).
% 94.93/94.29  0 [] -v554(VarCurr)| -v555(VarCurr).
% 94.93/94.29  0 [] -v555(VarCurr)|v556(VarCurr,bitIndex4)|v557(VarCurr).
% 94.93/94.29  0 [] v555(VarCurr)| -v556(VarCurr,bitIndex4).
% 94.93/94.29  0 [] v555(VarCurr)| -v557(VarCurr).
% 94.93/94.29  0 [] -v557(VarCurr)|v558(VarCurr).
% 94.93/94.29  0 [] -v557(VarCurr)|v567(VarCurr).
% 94.93/94.29  0 [] v557(VarCurr)| -v558(VarCurr)| -v567(VarCurr).
% 94.93/94.29  0 [] v567(VarCurr)|v556(VarCurr,bitIndex4).
% 94.93/94.29  0 [] -v567(VarCurr)| -v556(VarCurr,bitIndex4).
% 94.93/94.29  0 [] -v558(VarCurr)|v556(VarCurr,bitIndex3)|v559(VarCurr).
% 94.93/94.29  0 [] v558(VarCurr)| -v556(VarCurr,bitIndex3).
% 94.93/94.29  0 [] v558(VarCurr)| -v559(VarCurr).
% 94.93/94.29  0 [] -v559(VarCurr)|v560(VarCurr).
% 94.93/94.29  0 [] -v559(VarCurr)|v566(VarCurr).
% 94.93/94.29  0 [] v559(VarCurr)| -v560(VarCurr)| -v566(VarCurr).
% 94.93/94.29  0 [] v566(VarCurr)|v556(VarCurr,bitIndex3).
% 94.93/94.29  0 [] -v566(VarCurr)| -v556(VarCurr,bitIndex3).
% 94.93/94.29  0 [] -v560(VarCurr)|v556(VarCurr,bitIndex2)|v561(VarCurr).
% 94.93/94.29  0 [] v560(VarCurr)| -v556(VarCurr,bitIndex2).
% 94.93/94.29  0 [] v560(VarCurr)| -v561(VarCurr).
% 94.93/94.29  0 [] -v561(VarCurr)|v562(VarCurr).
% 94.93/94.29  0 [] -v561(VarCurr)|v565(VarCurr).
% 94.93/94.29  0 [] v561(VarCurr)| -v562(VarCurr)| -v565(VarCurr).
% 94.93/94.29  0 [] v565(VarCurr)|v556(VarCurr,bitIndex2).
% 94.93/94.29  0 [] -v565(VarCurr)| -v556(VarCurr,bitIndex2).
% 94.93/94.29  0 [] -v562(VarCurr)|v556(VarCurr,bitIndex1)|v563(VarCurr).
% 94.93/94.29  0 [] v562(VarCurr)| -v556(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v562(VarCurr)| -v563(VarCurr).
% 94.93/94.29  0 [] -v563(VarCurr)|v556(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v563(VarCurr)|v564(VarCurr).
% 94.93/94.29  0 [] v563(VarCurr)| -v556(VarCurr,bitIndex0)| -v564(VarCurr).
% 94.93/94.29  0 [] v564(VarCurr)|v556(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -v564(VarCurr)| -v556(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -v556(VarCurr,bitIndex5).
% 94.93/94.29  0 [] -range_4_0(B)| -v556(VarCurr,B)|v348(VarCurr,B).
% 94.93/94.29  0 [] -range_4_0(B)|v556(VarCurr,B)| -v348(VarCurr,B).
% 94.93/94.29  0 [] -v548(VarCurr)| -v348(VarCurr,bitIndex4)|$F.
% 94.93/94.29  0 [] -v548(VarCurr)|v348(VarCurr,bitIndex4)| -$F.
% 94.93/94.29  0 [] -v548(VarCurr)| -v348(VarCurr,bitIndex3)|$F.
% 94.93/94.29  0 [] -v548(VarCurr)|v348(VarCurr,bitIndex3)| -$F.
% 94.93/94.29  0 [] -v548(VarCurr)| -v348(VarCurr,bitIndex2)|$F.
% 94.93/94.29  0 [] -v548(VarCurr)|v348(VarCurr,bitIndex2)| -$F.
% 94.93/94.29  0 [] -v548(VarCurr)| -v348(VarCurr,bitIndex1)|$F.
% 94.93/94.29  0 [] -v548(VarCurr)|v348(VarCurr,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v548(VarCurr)| -v348(VarCurr,bitIndex0)|$F.
% 94.93/94.29  0 [] -v548(VarCurr)|v348(VarCurr,bitIndex0)| -$F.
% 94.93/94.29  0 [] v548(VarCurr)|v348(VarCurr,bitIndex4)|$F|v348(VarCurr,bitIndex3)|v348(VarCurr,bitIndex2)|v348(VarCurr,bitIndex1)|v348(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v548(VarCurr)| -v348(VarCurr,bitIndex4)| -$F| -v348(VarCurr,bitIndex3)| -v348(VarCurr,bitIndex2)| -v348(VarCurr,bitIndex1)| -v348(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v545(VarCurr)| -v546(VarCurr,bitIndex1)|$F.
% 94.93/94.29  0 [] -v545(VarCurr)|v546(VarCurr,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v545(VarCurr)| -v546(VarCurr,bitIndex0)|$T.
% 94.93/94.29  0 [] -v545(VarCurr)|v546(VarCurr,bitIndex0)| -$T.
% 94.93/94.29  0 [] v545(VarCurr)|v546(VarCurr,bitIndex1)|$F|v546(VarCurr,bitIndex0)|$T.
% 94.93/94.29  0 [] v545(VarCurr)|v546(VarCurr,bitIndex1)|$F| -v546(VarCurr,bitIndex0)| -$T.
% 94.93/94.29  0 [] v545(VarCurr)| -v546(VarCurr,bitIndex1)| -$F|v546(VarCurr,bitIndex0)|$T.
% 94.93/94.29  0 [] v545(VarCurr)| -v546(VarCurr,bitIndex1)| -$F| -v546(VarCurr,bitIndex0)| -$T.
% 94.93/94.29  0 [] -v546(VarCurr,bitIndex0)|v377(VarCurr).
% 94.93/94.29  0 [] v546(VarCurr,bitIndex0)| -v377(VarCurr).
% 94.93/94.29  0 [] -v546(VarCurr,bitIndex1)|v354(VarCurr).
% 94.93/94.29  0 [] v546(VarCurr,bitIndex1)| -v354(VarCurr).
% 94.93/94.29  0 [] -range_4_0(B)| -v348(constB0,B)|$F.
% 94.93/94.29  0 [] -range_4_0(B)|v348(constB0,B)| -$F.
% 94.93/94.29  0 [] -v543(VarCurr)| -v544(VarCurr,bitIndex1)|$F.
% 94.93/94.29  0 [] -v543(VarCurr)|v544(VarCurr,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v543(VarCurr)| -v544(VarCurr,bitIndex0)|$F.
% 94.93/94.29  0 [] -v543(VarCurr)|v544(VarCurr,bitIndex0)| -$F.
% 94.93/94.29  0 [] v543(VarCurr)|v544(VarCurr,bitIndex1)|$F|v544(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v543(VarCurr)| -v544(VarCurr,bitIndex1)| -$F| -v544(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v544(VarCurr,bitIndex0)|v377(VarCurr).
% 94.93/94.29  0 [] v544(VarCurr,bitIndex0)| -v377(VarCurr).
% 94.93/94.29  0 [] -v544(VarCurr,bitIndex1)|v354(VarCurr).
% 94.93/94.29  0 [] v544(VarCurr,bitIndex1)| -v354(VarCurr).
% 94.93/94.29  0 [] -v377(VarCurr)|v379(VarCurr).
% 94.93/94.29  0 [] v377(VarCurr)| -v379(VarCurr).
% 94.93/94.29  0 [] v535(VarCurr)| -v379(VarCurr)|$F.
% 94.93/94.29  0 [] v535(VarCurr)|v379(VarCurr)| -$F.
% 94.93/94.29  0 [] -v535(VarCurr)| -v379(VarCurr)|v536(VarCurr).
% 94.93/94.29  0 [] -v535(VarCurr)|v379(VarCurr)| -v536(VarCurr).
% 94.93/94.29  0 [] v510(VarCurr)| -v536(VarCurr)|v540(VarCurr).
% 94.93/94.29  0 [] v510(VarCurr)|v536(VarCurr)| -v540(VarCurr).
% 94.93/94.29  0 [] -v510(VarCurr)| -v536(VarCurr)|v537(VarCurr).
% 94.93/94.29  0 [] -v510(VarCurr)|v536(VarCurr)| -v537(VarCurr).
% 94.93/94.29  0 [] v513(VarCurr)| -v540(VarCurr)|v541(VarCurr).
% 94.93/94.29  0 [] v513(VarCurr)|v540(VarCurr)| -v541(VarCurr).
% 94.93/94.29  0 [] -v513(VarCurr)| -v540(VarCurr)|$F.
% 94.93/94.29  0 [] -v513(VarCurr)|v540(VarCurr)| -$F.
% 94.93/94.29  0 [] v517(VarCurr)| -v541(VarCurr)|$F.
% 94.93/94.29  0 [] v517(VarCurr)|v541(VarCurr)| -$F.
% 94.93/94.29  0 [] -v517(VarCurr)| -v541(VarCurr)|$T.
% 94.93/94.29  0 [] -v517(VarCurr)|v541(VarCurr)| -$T.
% 94.93/94.29  0 [] v509(VarCurr)| -v537(VarCurr)|v538(VarCurr).
% 94.93/94.29  0 [] v509(VarCurr)|v537(VarCurr)| -v538(VarCurr).
% 94.93/94.29  0 [] -v509(VarCurr)| -v537(VarCurr)|$T.
% 94.93/94.29  0 [] -v509(VarCurr)|v537(VarCurr)| -$T.
% 94.93/94.29  0 [] v539(VarCurr)| -v538(VarCurr)|$F.
% 94.93/94.29  0 [] v539(VarCurr)|v538(VarCurr)| -$F.
% 94.93/94.29  0 [] -v539(VarCurr)| -v538(VarCurr)|$F.
% 94.93/94.29  0 [] -v539(VarCurr)|v538(VarCurr)| -$F.
% 94.93/94.29  0 [] v539(VarCurr)|v381(VarCurr).
% 94.93/94.29  0 [] -v539(VarCurr)| -v381(VarCurr).
% 94.93/94.29  0 [] -v535(VarCurr)|v510(VarCurr)|v514(VarCurr).
% 94.93/94.29  0 [] v535(VarCurr)| -v510(VarCurr).
% 94.93/94.29  0 [] v535(VarCurr)| -v514(VarCurr).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v520(VarNext)| -v334(VarNext,bitIndex1)|v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v520(VarNext)|v334(VarNext,bitIndex1)| -v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -v520(VarNext)| -v334(VarNext,bitIndex1)|v530(VarNext).
% 94.93/94.29  0 [] -v520(VarNext)|v334(VarNext,bitIndex1)| -v530(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v530(VarNext)|v528(VarCurr).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v530(VarNext)| -v528(VarCurr).
% 94.93/94.29  0 [] v531(VarCurr)| -v528(VarCurr)|v342(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v531(VarCurr)|v528(VarCurr)| -v342(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -v531(VarCurr)| -v528(VarCurr)|$F.
% 94.93/94.29  0 [] -v531(VarCurr)|v528(VarCurr)| -$F.
% 94.93/94.29  0 [] v531(VarCurr)|v336(VarCurr).
% 94.93/94.29  0 [] -v531(VarCurr)| -v336(VarCurr).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v520(VarNext)|v521(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v520(VarNext)| -v521(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v521(VarNext)|v522(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v521(VarNext)|v484(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v521(VarNext)| -v522(VarNext)| -v484(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v522(VarNext)|v524(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v522(VarNext)| -v524(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v524(VarNext)|v484(VarCurr).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v524(VarNext)| -v484(VarCurr).
% 94.93/94.29  0 [] v507(VarCurr)| -v342(VarCurr,bitIndex1)|$F.
% 94.93/94.29  0 [] v507(VarCurr)|v342(VarCurr,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v507(VarCurr)| -v342(VarCurr,bitIndex1)|v515(VarCurr).
% 94.93/94.29  0 [] -v507(VarCurr)|v342(VarCurr,bitIndex1)| -v515(VarCurr).
% 94.93/94.29  0 [] v508(VarCurr)| -v515(VarCurr)|v516(VarCurr).
% 94.93/94.29  0 [] v508(VarCurr)|v515(VarCurr)| -v516(VarCurr).
% 94.93/94.29  0 [] -v508(VarCurr)| -v515(VarCurr)|$T.
% 94.93/94.29  0 [] -v508(VarCurr)|v515(VarCurr)| -$T.
% 94.93/94.29  0 [] v517(VarCurr)| -v516(VarCurr)|$T.
% 94.93/94.29  0 [] v517(VarCurr)|v516(VarCurr)| -$T.
% 94.93/94.29  0 [] -v517(VarCurr)| -v516(VarCurr)|$T.
% 94.93/94.29  0 [] -v517(VarCurr)|v516(VarCurr)| -$T.
% 94.93/94.29  0 [] v517(VarCurr)|v344(VarCurr).
% 94.93/94.29  0 [] -v517(VarCurr)| -v344(VarCurr).
% 94.93/94.29  0 [] -v507(VarCurr)|v508(VarCurr)|v511(VarCurr).
% 94.93/94.29  0 [] v507(VarCurr)| -v508(VarCurr).
% 94.93/94.29  0 [] v507(VarCurr)| -v511(VarCurr).
% 94.93/94.29  0 [] -v511(VarCurr)|v512(VarCurr).
% 94.93/94.29  0 [] -v511(VarCurr)|v514(VarCurr).
% 94.93/94.29  0 [] v511(VarCurr)| -v512(VarCurr)| -v514(VarCurr).
% 94.93/94.29  0 [] -v514(VarCurr)| -$T|v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] -v514(VarCurr)|$T| -v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v514(VarCurr)|$T|v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v514(VarCurr)| -$T| -v334(VarCurr,bitIndex1).
% 94.93/94.29  0 [] v512(VarCurr)|v513(VarCurr).
% 94.93/94.29  0 [] -v512(VarCurr)| -v513(VarCurr).
% 94.93/94.29  0 [] v513(VarCurr)|v381(VarCurr).
% 94.93/94.29  0 [] -v513(VarCurr)| -v381(VarCurr).
% 94.93/94.29  0 [] -v508(VarCurr)|v509(VarCurr).
% 94.93/94.29  0 [] -v508(VarCurr)|v510(VarCurr).
% 94.93/94.29  0 [] v508(VarCurr)| -v509(VarCurr)| -v510(VarCurr).
% 94.93/94.29  0 [] -v510(VarCurr)| -$T|v334(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v510(VarCurr)|$T| -v334(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v510(VarCurr)|$T|v334(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v510(VarCurr)| -$T| -v334(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -v334(constB0,bitIndex1)|$F.
% 94.93/94.29  0 [] v334(constB0,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v334(constB0,bitIndex0)|$T.
% 94.93/94.29  0 [] v334(constB0,bitIndex0)| -$T.
% 94.93/94.29  0 [] v509(VarCurr)|v344(VarCurr).
% 94.93/94.29  0 [] -v509(VarCurr)| -v344(VarCurr).
% 94.93/94.29  0 [] -v381(VarCurr)|v383(VarCurr).
% 94.93/94.29  0 [] v381(VarCurr)| -v383(VarCurr).
% 94.93/94.29  0 [] -v383(VarCurr)| -v385(VarCurr,bitIndex4)|$F.
% 94.93/94.29  0 [] -v383(VarCurr)|v385(VarCurr,bitIndex4)| -$F.
% 94.93/94.29  0 [] -v383(VarCurr)| -v385(VarCurr,bitIndex3)|$F.
% 94.93/94.29  0 [] -v383(VarCurr)|v385(VarCurr,bitIndex3)| -$F.
% 94.93/94.29  0 [] -v383(VarCurr)| -v385(VarCurr,bitIndex2)|$F.
% 94.93/94.29  0 [] -v383(VarCurr)|v385(VarCurr,bitIndex2)| -$F.
% 94.93/94.29  0 [] -v383(VarCurr)| -v385(VarCurr,bitIndex1)|$F.
% 94.93/94.29  0 [] -v383(VarCurr)|v385(VarCurr,bitIndex1)| -$F.
% 94.93/94.29  0 [] -v383(VarCurr)| -v385(VarCurr,bitIndex0)|$F.
% 94.93/94.29  0 [] -v383(VarCurr)|v385(VarCurr,bitIndex0)| -$F.
% 94.93/94.29  0 [] v383(VarCurr)|v385(VarCurr,bitIndex4)|$F|v385(VarCurr,bitIndex3)|v385(VarCurr,bitIndex2)|v385(VarCurr,bitIndex1)|v385(VarCurr,bitIndex0).
% 94.93/94.29  0 [] v383(VarCurr)| -v385(VarCurr,bitIndex4)| -$F| -v385(VarCurr,bitIndex3)| -v385(VarCurr,bitIndex2)| -v385(VarCurr,bitIndex1)| -v385(VarCurr,bitIndex0).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v491(VarNext)| -range_4_0(B)| -v385(VarNext,B)|v385(VarCurr,B).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v491(VarNext)| -range_4_0(B)|v385(VarNext,B)| -v385(VarCurr,B).
% 94.93/94.29  0 [] -v491(VarNext)| -range_4_0(B)| -v385(VarNext,B)|v501(VarNext,B).
% 94.93/94.29  0 [] -v491(VarNext)| -range_4_0(B)|v385(VarNext,B)| -v501(VarNext,B).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)| -v501(VarNext,B)|v499(VarCurr,B).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -range_4_0(B)|v501(VarNext,B)| -v499(VarCurr,B).
% 94.93/94.29  0 [] v502(VarCurr)| -range_4_0(B)| -v499(VarCurr,B)|v389(VarCurr,B).
% 94.93/94.29  0 [] v502(VarCurr)| -range_4_0(B)|v499(VarCurr,B)| -v389(VarCurr,B).
% 94.93/94.29  0 [] -v502(VarCurr)| -range_4_0(B)| -v499(VarCurr,B)|$F.
% 94.93/94.29  0 [] -v502(VarCurr)| -range_4_0(B)|v499(VarCurr,B)| -$F.
% 94.93/94.29  0 [] v502(VarCurr)|v387(VarCurr).
% 94.93/94.29  0 [] -v502(VarCurr)| -v387(VarCurr).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v491(VarNext)|v492(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)|v491(VarNext)| -v492(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v492(VarNext)|v493(VarNext).
% 94.93/94.29  0 [] -nextState(VarCurr,VarNext)| -v492(VarNext)|v482(VarNext).
% 94.93/94.30  0 [] -nextState(VarCurr,VarNext)|v492(VarNext)| -v493(VarNext)| -v482(VarNext).
% 94.93/94.30  0 [] -nextState(VarCurr,VarNext)|v493(VarNext)|v495(VarNext).
% 94.93/94.30  0 [] -nextState(VarCurr,VarNext)| -v493(VarNext)| -v495(VarNext).
% 94.93/94.30  0 [] -nextState(VarCurr,VarNext)| -v495(VarNext)|v482(VarCurr).
% 94.93/94.30  0 [] -nextState(VarCurr,VarNext)|v495(VarNext)| -v482(VarCurr).
% 94.93/94.30  0 [] -v482(VarCurr)|v484(VarCurr).
% 94.93/94.30  0 [] v482(VarCurr)| -v484(VarCurr).
% 94.93/94.30  0 [] -v484(VarCurr)|v486(VarCurr).
% 94.93/94.30  0 [] v484(VarCurr)| -v486(VarCurr).
% 94.93/94.30  0 [] -v486(VarCurr)|v488(VarCurr).
% 94.93/94.30  0 [] v486(VarCurr)| -v488(VarCurr).
% 94.93/94.30  0 [] -v488(VarCurr)|v1(VarCurr).
% 94.93/94.30  0 [] v488(VarCurr)| -v1(VarCurr).
% 94.93/94.30  0 [] v407(VarCurr)|v409(VarCurr)|v450(VarCurr)| -range_4_0(B)| -v389(VarCurr,B)|v385(VarCurr,B).
% 94.93/94.30  0 [] v407(VarCurr)|v409(VarCurr)|v450(VarCurr)| -range_4_0(B)|v389(VarCurr,B)| -v385(VarCurr,B).
% 94.93/94.30  0 [] -v450(VarCurr)| -range_4_0(B)| -v389(VarCurr,B)|v452(VarCurr,B).
% 94.93/94.30  0 [] -v450(VarCurr)| -range_4_0(B)|v389(VarCurr,B)| -v452(VarCurr,B).
% 94.93/94.30  0 [] -v409(VarCurr)| -range_4_0(B)| -v389(VarCurr,B)|v411(VarCurr,B).
% 94.93/94.30  0 [] -v409(VarCurr)| -range_4_0(B)|v389(VarCurr,B)| -v411(VarCurr,B).
% 94.93/94.30  0 [] -v407(VarCurr)| -range_4_0(B)| -v389(VarCurr,B)|v385(VarCurr,B).
% 94.93/94.30  0 [] -v407(VarCurr)| -range_4_0(B)|v389(VarCurr,B)| -v385(VarCurr,B).
% 94.93/94.30  0 [] -v479(VarCurr)| -v480(VarCurr,bitIndex1)|$T.
% 94.93/94.30  0 [] -v479(VarCurr)|v480(VarCurr,bitIndex1)| -$T.
% 94.93/94.30  0 [] -v479(VarCurr)| -v480(VarCurr,bitIndex0)|$T.
% 94.93/94.30  0 [] -v479(VarCurr)|v480(VarCurr,bitIndex0)| -$T.
% 94.93/94.30  0 [] v479(VarCurr)|v480(VarCurr,bitIndex1)|$T|v480(VarCurr,bitIndex0).
% 94.93/94.30  0 [] v479(VarCurr)| -v480(VarCurr,bitIndex1)| -$T| -v480(VarCurr,bitIndex0).
% 94.93/94.30  0 [] -v480(VarCurr,bitIndex0)|v403(VarCurr).
% 94.93/94.30  0 [] v480(VarCurr,bitIndex0)| -v403(VarCurr).
% 94.93/94.30  0 [] -v480(VarCurr,bitIndex1)|v391(VarCurr).
% 94.93/94.30  0 [] v480(VarCurr,bitIndex1)| -v391(VarCurr).
% 94.93/94.30  0 [] v453(VarCurr)| -range_4_0(B)| -v452(VarCurr,B)|v454(VarCurr,B).
% 94.93/94.30  0 [] v453(VarCurr)| -range_4_0(B)|v452(VarCurr,B)| -v454(VarCurr,B).
% 94.93/94.30  0 [] -v453(VarCurr)| -range_4_0(B)| -v452(VarCurr,B)|b01111(B).
% 94.93/94.30  0 [] -v453(VarCurr)| -range_4_0(B)|v452(VarCurr,B)| -b01111(B).
% 94.93/94.30  0 [] -v454(VarCurr,bitIndex0)|v476(VarCurr).
% 94.93/94.30  0 [] v454(VarCurr,bitIndex0)| -v476(VarCurr).
% 94.93/94.30  0 [] -v454(VarCurr,bitIndex1)|v474(VarCurr).
% 94.93/94.30  0 [] v454(VarCurr,bitIndex1)| -v474(VarCurr).
% 94.93/94.30  0 [] -v454(VarCurr,bitIndex2)|v469(VarCurr).
% 94.93/94.30  0 [] v454(VarCurr,bitIndex2)| -v469(VarCurr).
% 94.93/94.30  0 [] -v454(VarCurr,bitIndex3)|v464(VarCurr).
% 94.93/94.30  0 [] v454(VarCurr,bitIndex3)| -v464(VarCurr).
% 94.93/94.30  0 [] -v454(VarCurr,bitIndex4)|v456(VarCurr).
% 94.93/94.30  0 [] v454(VarCurr,bitIndex4)| -v456(VarCurr).
% 94.93/94.30  0 [] -v474(VarCurr)|v475(VarCurr).
% 94.93/94.30  0 [] -v474(VarCurr)|v478(VarCurr).
% 94.93/94.30  0 [] v474(VarCurr)| -v475(VarCurr)| -v478(VarCurr).
% 94.93/94.30  0 [] -v478(VarCurr)|v385(VarCurr,bitIndex0)|v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] v478(VarCurr)| -v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] v478(VarCurr)| -v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] -v475(VarCurr)|v476(VarCurr)|v477(VarCurr).
% 94.93/94.30  0 [] v475(VarCurr)| -v476(VarCurr).
% 94.93/94.30  0 [] v475(VarCurr)| -v477(VarCurr).
% 94.93/94.30  0 [] v477(VarCurr)|v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] -v477(VarCurr)| -v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] v476(VarCurr)|v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] -v476(VarCurr)| -v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] -v469(VarCurr)|v470(VarCurr).
% 94.93/94.30  0 [] -v469(VarCurr)|v473(VarCurr).
% 94.93/94.30  0 [] v469(VarCurr)| -v470(VarCurr)| -v473(VarCurr).
% 94.93/94.30  0 [] -v473(VarCurr)|v461(VarCurr)|v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] v473(VarCurr)| -v461(VarCurr).
% 94.93/94.30  0 [] v473(VarCurr)| -v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] -v470(VarCurr)|v471(VarCurr)|v472(VarCurr).
% 94.93/94.30  0 [] v470(VarCurr)| -v471(VarCurr).
% 94.93/94.30  0 [] v470(VarCurr)| -v472(VarCurr).
% 94.93/94.30  0 [] v472(VarCurr)|v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] -v472(VarCurr)| -v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] v471(VarCurr)|v461(VarCurr).
% 94.93/94.30  0 [] -v471(VarCurr)| -v461(VarCurr).
% 94.93/94.30  0 [] -v464(VarCurr)|v465(VarCurr).
% 94.93/94.30  0 [] -v464(VarCurr)|v468(VarCurr).
% 94.93/94.30  0 [] v464(VarCurr)| -v465(VarCurr)| -v468(VarCurr).
% 94.93/94.30  0 [] -v468(VarCurr)|v460(VarCurr)|v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] v468(VarCurr)| -v460(VarCurr).
% 94.93/94.30  0 [] v468(VarCurr)| -v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] -v465(VarCurr)|v466(VarCurr)|v467(VarCurr).
% 94.93/94.30  0 [] v465(VarCurr)| -v466(VarCurr).
% 94.93/94.30  0 [] v465(VarCurr)| -v467(VarCurr).
% 94.93/94.30  0 [] v467(VarCurr)|v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] -v467(VarCurr)| -v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] v466(VarCurr)|v460(VarCurr).
% 94.93/94.30  0 [] -v466(VarCurr)| -v460(VarCurr).
% 94.93/94.30  0 [] -v456(VarCurr)|v457(VarCurr).
% 94.93/94.30  0 [] -v456(VarCurr)|v463(VarCurr).
% 94.93/94.30  0 [] v456(VarCurr)| -v457(VarCurr)| -v463(VarCurr).
% 94.93/94.30  0 [] -v463(VarCurr)|v459(VarCurr)|v385(VarCurr,bitIndex4).
% 94.93/94.30  0 [] v463(VarCurr)| -v459(VarCurr).
% 94.93/94.30  0 [] v463(VarCurr)| -v385(VarCurr,bitIndex4).
% 94.93/94.30  0 [] -v457(VarCurr)|v458(VarCurr)|v462(VarCurr).
% 94.93/94.30  0 [] v457(VarCurr)| -v458(VarCurr).
% 94.93/94.30  0 [] v457(VarCurr)| -v462(VarCurr).
% 94.93/94.30  0 [] v462(VarCurr)|v385(VarCurr,bitIndex4).
% 94.93/94.30  0 [] -v462(VarCurr)| -v385(VarCurr,bitIndex4).
% 94.93/94.30  0 [] v458(VarCurr)|v459(VarCurr).
% 94.93/94.30  0 [] -v458(VarCurr)| -v459(VarCurr).
% 94.93/94.30  0 [] -v459(VarCurr)|v460(VarCurr).
% 94.93/94.30  0 [] -v459(VarCurr)|v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] v459(VarCurr)| -v460(VarCurr)| -v385(VarCurr,bitIndex3).
% 94.93/94.30  0 [] -v460(VarCurr)|v461(VarCurr).
% 94.93/94.30  0 [] -v460(VarCurr)|v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] v460(VarCurr)| -v461(VarCurr)| -v385(VarCurr,bitIndex2).
% 94.93/94.30  0 [] -v461(VarCurr)|v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] -v461(VarCurr)|v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] v461(VarCurr)| -v385(VarCurr,bitIndex0)| -v385(VarCurr,bitIndex1).
% 94.93/94.30  0 [] -v453(VarCurr)| -v385(VarCurr,bitIndex4)|$F.
% 94.93/94.30  0 [] -v453(VarCurr)|v385(VarCurr,bitIndex4)| -$F.
% 94.93/94.30  0 [] -v453(VarCurr)| -v385(VarCurr,bitIndex3)|$T.
% 94.93/94.30  0 [] -v453(VarCurr)|v385(VarCurr,bitIndex3)| -$T.
% 94.93/94.30  0 [] -v453(VarCurr)| -v385(VarCurr,bitIndex2)|$T.
% 94.93/94.30  0 [] -v453(VarCurr)|v385(VarCurr,bitIndex2)| -$T.
% 94.93/94.30  0 [] -v453(VarCurr)| -v385(VarCurr,bitIndex1)|$T.
% 94.93/94.30  0 [] -v453(VarCurr)|v385(VarCurr,bitIndex1)| -$T.
% 94.93/94.30  0 [] -v453(VarCurr)| -v385(VarCurr,bitIndex0)|$T.
% 94.93/94.30  0 [] -v453(VarCurr)|v385(VarCurr,bitIndex0)| -$T.
% 94.93/94.30  0 [] v453(VarCurr)|v385(VarCurr,bitIndex4)|$F|v385(VarCurr,bitIndex3)|$T|v385(VarCurr,bitIndex2)|v385(VarCurr,bitIndex1)|v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] v453(VarCurr)|v385(VarCurr,bitIndex4)|$F| -v385(VarCurr,bitIndex3)| -$T| -v385(VarCurr,bitIndex2)| -v385(VarCurr,bitIndex1)| -v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] v453(VarCurr)| -v385(VarCurr,bitIndex4)| -$F|v385(VarCurr,bitIndex3)|$T|v385(VarCurr,bitIndex2)|v385(VarCurr,bitIndex1)|v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] v453(VarCurr)| -v385(VarCurr,bitIndex4)| -$F| -v385(VarCurr,bitIndex3)| -$T| -v385(VarCurr,bitIndex2)| -v385(VarCurr,bitIndex1)| -v385(VarCurr,bitIndex0).
% 94.93/94.30  0 [] -b01111(bitIndex4).
% 94.93/94.30  0 [] b01111(bitIndex3).
% 94.93/94.30  0 [] b01111(bitIndex2).
% 94.93/94.30  0 [] b01111(bitIndex1).
% 94.93/94.30  0 [] b01111(bitIndex0).
% 94.93/94.30  0 [] -v450(VarCurr)| -v451(VarCurr,bitIndex1)|$T.
% 94.93/94.30  0 [] -v450(VarCurr)|v451(VarCurr,bitIndex1)| -$T.
% 94.93/94.30  0 [] -v450(VarCurr)| -v451(VarCurr,bitIndex0)|$F.
% 94.93/94.30  0 [] -v450(VarCurr)|v451(VarCurr,bitIndex0)| -$F.
% 94.93/94.30  0 [] v450(VarCurr)|v451(VarCurr,bitIndex1)|$T|v451(VarCurr,bitIndex0)|$F.
% 94.93/94.30  0 [] v450(VarCurr)|v451(VarCurr,bitIndex1)|$T| -v451(VarCurr,bitIndex0)| -$F.
% 94.93/94.30  0 [] v450(VarCurr)| -v451(VarCurr,bitIndex1)| -$T|v451(VarCurr,bitIndex0)|$F.
% 94.93/94.30  0 [] v450(VarCurr)| -v451(VarCurr,bitIndex1)| -$T| -v451(VarCurr,bitIndex0)| -$F.
% 94.93/94.30  0 [] -v451(VarCurr,bitIndex0)|v403(VarCurr).
% 94.93/94.30  0 [] v451(VarCurr,bitIndex0)| -v403(VarCurr).
% 94.93/94.30  0 [] -v451(VarCurr,bitIndex1)|v391(VarCurr).
% 94.93/94.30  0 [] v451(VarCurr,bitIndex1)| -v391(VarCurr).
% 94.93/94.30  0 [] v412(VarCurr)| -range_31_0(B)| -v411(VarCurr,B)|v413(VarCurr,B).
% 94.93/94.30  0 [] v412(VarCurr)| -range_31_0(B)|v411(VarCurr,B)| -v413(VarCurr,B).
% 94.93/94.30  0 [] -v412(VarCurr)| -range_31_0(B)| -v411(VarCurr,B)|$F.
% 94.93/94.30  0 [] -v412(VarCurr)| -range_31_0(B)|v411(VarCurr,B)| -$F.
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex6)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex6)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex7)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex7)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex8)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex8)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex9)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex9)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex10)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex10)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex11)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex11)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex12)|v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] v413(VarCurr,bitIndex12)| -v414(VarCurr,bitIndex5).
% 94.93/94.30  0 [] -v413(VarCurr,bitIndex13)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex13)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex14)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex14)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex15)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex15)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex16)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex16)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex17)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex17)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex18)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex18)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex19)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex19)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex20)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex20)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex21)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex21)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex22)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex22)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex23)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex23)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex24)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex24)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex25)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex25)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex26)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex26)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex27)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex27)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex28)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex28)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex29)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex29)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex30)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex30)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v413(VarCurr,bitIndex31)|v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v413(VarCurr,bitIndex31)| -v414(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -range_5_0(B)| -v413(VarCurr,B)|v414(VarCurr,B).
% 94.93/94.31  0 [] -range_5_0(B)|v413(VarCurr,B)| -v414(VarCurr,B).
% 94.93/94.31  0 [] -range_5_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex0!=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex1!=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex2!=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex3!=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex4!=B.
% 94.93/94.31  0 [] range_5_0(B)|bitIndex5!=B.
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex0)|v448(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex0)| -v448(VarCurr).
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex1)|v446(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex1)| -v446(VarCurr).
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex2)|v442(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex2)| -v442(VarCurr).
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex3)|v438(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex3)| -v438(VarCurr).
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex4)|v434(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex4)| -v434(VarCurr).
% 94.93/94.31  0 [] -v414(VarCurr,bitIndex5)|v416(VarCurr).
% 94.93/94.31  0 [] v414(VarCurr,bitIndex5)| -v416(VarCurr).
% 94.93/94.31  0 [] -v446(VarCurr)|v447(VarCurr).
% 94.93/94.31  0 [] -v446(VarCurr)|v449(VarCurr).
% 94.93/94.31  0 [] v446(VarCurr)| -v447(VarCurr)| -v449(VarCurr).
% 94.93/94.31  0 [] -v449(VarCurr)|v420(VarCurr,bitIndex0)|v428(VarCurr).
% 94.93/94.31  0 [] v449(VarCurr)| -v420(VarCurr,bitIndex0).
% 94.93/94.31  0 [] v449(VarCurr)| -v428(VarCurr).
% 94.93/94.31  0 [] -v447(VarCurr)|v448(VarCurr)|v420(VarCurr,bitIndex1).
% 94.93/94.31  0 [] v447(VarCurr)| -v448(VarCurr).
% 94.93/94.31  0 [] v447(VarCurr)| -v420(VarCurr,bitIndex1).
% 94.93/94.31  0 [] v448(VarCurr)|v420(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -v448(VarCurr)| -v420(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -v442(VarCurr)|v443(VarCurr).
% 94.93/94.31  0 [] -v442(VarCurr)|v445(VarCurr).
% 94.93/94.31  0 [] v442(VarCurr)| -v443(VarCurr)| -v445(VarCurr).
% 94.93/94.31  0 [] -v445(VarCurr)|v426(VarCurr)|v429(VarCurr).
% 94.93/94.31  0 [] v445(VarCurr)| -v426(VarCurr).
% 94.93/94.31  0 [] v445(VarCurr)| -v429(VarCurr).
% 94.93/94.31  0 [] -v443(VarCurr)|v444(VarCurr)|v420(VarCurr,bitIndex2).
% 94.93/94.31  0 [] v443(VarCurr)| -v444(VarCurr).
% 94.93/94.31  0 [] v443(VarCurr)| -v420(VarCurr,bitIndex2).
% 94.93/94.31  0 [] v444(VarCurr)|v426(VarCurr).
% 94.93/94.31  0 [] -v444(VarCurr)| -v426(VarCurr).
% 94.93/94.31  0 [] -v438(VarCurr)|v439(VarCurr).
% 94.93/94.31  0 [] -v438(VarCurr)|v441(VarCurr).
% 94.93/94.31  0 [] v438(VarCurr)| -v439(VarCurr)| -v441(VarCurr).
% 94.93/94.31  0 [] -v441(VarCurr)|v424(VarCurr)|v430(VarCurr).
% 94.93/94.31  0 [] v441(VarCurr)| -v424(VarCurr).
% 94.93/94.31  0 [] v441(VarCurr)| -v430(VarCurr).
% 94.93/94.31  0 [] -v439(VarCurr)|v440(VarCurr)|v420(VarCurr,bitIndex3).
% 94.93/94.31  0 [] v439(VarCurr)| -v440(VarCurr).
% 94.93/94.31  0 [] v439(VarCurr)| -v420(VarCurr,bitIndex3).
% 94.93/94.31  0 [] v440(VarCurr)|v424(VarCurr).
% 94.93/94.31  0 [] -v440(VarCurr)| -v424(VarCurr).
% 94.93/94.31  0 [] -v434(VarCurr)|v435(VarCurr).
% 94.93/94.31  0 [] -v434(VarCurr)|v437(VarCurr).
% 94.93/94.31  0 [] v434(VarCurr)| -v435(VarCurr)| -v437(VarCurr).
% 94.93/94.31  0 [] -v437(VarCurr)|v422(VarCurr)|v431(VarCurr).
% 94.93/94.31  0 [] v437(VarCurr)| -v422(VarCurr).
% 94.93/94.31  0 [] v437(VarCurr)| -v431(VarCurr).
% 94.93/94.31  0 [] -v435(VarCurr)|v436(VarCurr)|v420(VarCurr,bitIndex4).
% 94.93/94.31  0 [] v435(VarCurr)| -v436(VarCurr).
% 94.93/94.31  0 [] v435(VarCurr)| -v420(VarCurr,bitIndex4).
% 94.93/94.31  0 [] v436(VarCurr)|v422(VarCurr).
% 94.93/94.31  0 [] -v436(VarCurr)| -v422(VarCurr).
% 94.93/94.31  0 [] -v416(VarCurr)|v417(VarCurr).
% 94.93/94.31  0 [] -v416(VarCurr)|v432(VarCurr).
% 94.93/94.31  0 [] v416(VarCurr)| -v417(VarCurr)| -v432(VarCurr).
% 94.93/94.31  0 [] -v432(VarCurr)|v419(VarCurr)|v433(VarCurr).
% 94.93/94.31  0 [] v432(VarCurr)| -v419(VarCurr).
% 94.93/94.31  0 [] v432(VarCurr)| -v433(VarCurr).
% 94.93/94.31  0 [] v433(VarCurr)|v420(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v433(VarCurr)| -v420(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -v417(VarCurr)|v418(VarCurr)|v420(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v417(VarCurr)| -v418(VarCurr).
% 94.93/94.31  0 [] v417(VarCurr)| -v420(VarCurr,bitIndex5).
% 94.93/94.31  0 [] v418(VarCurr)|v419(VarCurr).
% 94.93/94.31  0 [] -v418(VarCurr)| -v419(VarCurr).
% 94.93/94.31  0 [] -v419(VarCurr)|v420(VarCurr,bitIndex4)|v421(VarCurr).
% 94.93/94.31  0 [] v419(VarCurr)| -v420(VarCurr,bitIndex4).
% 94.93/94.31  0 [] v419(VarCurr)| -v421(VarCurr).
% 94.93/94.31  0 [] -v421(VarCurr)|v422(VarCurr).
% 94.93/94.31  0 [] -v421(VarCurr)|v431(VarCurr).
% 94.93/94.31  0 [] v421(VarCurr)| -v422(VarCurr)| -v431(VarCurr).
% 94.93/94.31  0 [] v431(VarCurr)|v420(VarCurr,bitIndex4).
% 94.93/94.31  0 [] -v431(VarCurr)| -v420(VarCurr,bitIndex4).
% 94.93/94.31  0 [] -v422(VarCurr)|v420(VarCurr,bitIndex3)|v423(VarCurr).
% 94.93/94.31  0 [] v422(VarCurr)| -v420(VarCurr,bitIndex3).
% 94.93/94.31  0 [] v422(VarCurr)| -v423(VarCurr).
% 94.93/94.31  0 [] -v423(VarCurr)|v424(VarCurr).
% 94.93/94.31  0 [] -v423(VarCurr)|v430(VarCurr).
% 94.93/94.31  0 [] v423(VarCurr)| -v424(VarCurr)| -v430(VarCurr).
% 94.93/94.31  0 [] v430(VarCurr)|v420(VarCurr,bitIndex3).
% 94.93/94.31  0 [] -v430(VarCurr)| -v420(VarCurr,bitIndex3).
% 94.93/94.31  0 [] -v424(VarCurr)|v420(VarCurr,bitIndex2)|v425(VarCurr).
% 94.93/94.31  0 [] v424(VarCurr)| -v420(VarCurr,bitIndex2).
% 94.93/94.31  0 [] v424(VarCurr)| -v425(VarCurr).
% 94.93/94.31  0 [] -v425(VarCurr)|v426(VarCurr).
% 94.93/94.31  0 [] -v425(VarCurr)|v429(VarCurr).
% 94.93/94.31  0 [] v425(VarCurr)| -v426(VarCurr)| -v429(VarCurr).
% 94.93/94.31  0 [] v429(VarCurr)|v420(VarCurr,bitIndex2).
% 94.93/94.31  0 [] -v429(VarCurr)| -v420(VarCurr,bitIndex2).
% 94.93/94.31  0 [] -v426(VarCurr)|v420(VarCurr,bitIndex1)|v427(VarCurr).
% 94.93/94.31  0 [] v426(VarCurr)| -v420(VarCurr,bitIndex1).
% 94.93/94.31  0 [] v426(VarCurr)| -v427(VarCurr).
% 94.93/94.31  0 [] -v427(VarCurr)|v420(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -v427(VarCurr)|v428(VarCurr).
% 94.93/94.31  0 [] v427(VarCurr)| -v420(VarCurr,bitIndex0)| -v428(VarCurr).
% 94.93/94.31  0 [] v428(VarCurr)|v420(VarCurr,bitIndex1).
% 94.93/94.31  0 [] -v428(VarCurr)| -v420(VarCurr,bitIndex1).
% 94.93/94.31  0 [] -v420(VarCurr,bitIndex5).
% 94.93/94.31  0 [] -range_4_0(B)| -v420(VarCurr,B)|v385(VarCurr,B).
% 94.93/94.31  0 [] -range_4_0(B)|v420(VarCurr,B)| -v385(VarCurr,B).
% 94.93/94.31  0 [] -v412(VarCurr)| -v385(VarCurr,bitIndex4)|$F.
% 94.93/94.31  0 [] -v412(VarCurr)|v385(VarCurr,bitIndex4)| -$F.
% 94.93/94.31  0 [] -v412(VarCurr)| -v385(VarCurr,bitIndex3)|$F.
% 94.93/94.31  0 [] -v412(VarCurr)|v385(VarCurr,bitIndex3)| -$F.
% 94.93/94.31  0 [] -v412(VarCurr)| -v385(VarCurr,bitIndex2)|$F.
% 94.93/94.31  0 [] -v412(VarCurr)|v385(VarCurr,bitIndex2)| -$F.
% 94.93/94.31  0 [] -v412(VarCurr)| -v385(VarCurr,bitIndex1)|$F.
% 94.93/94.31  0 [] -v412(VarCurr)|v385(VarCurr,bitIndex1)| -$F.
% 94.93/94.31  0 [] -v412(VarCurr)| -v385(VarCurr,bitIndex0)|$F.
% 94.93/94.31  0 [] -v412(VarCurr)|v385(VarCurr,bitIndex0)| -$F.
% 94.93/94.31  0 [] v412(VarCurr)|v385(VarCurr,bitIndex4)|$F|v385(VarCurr,bitIndex3)|v385(VarCurr,bitIndex2)|v385(VarCurr,bitIndex1)|v385(VarCurr,bitIndex0).
% 94.93/94.31  0 [] v412(VarCurr)| -v385(VarCurr,bitIndex4)| -$F| -v385(VarCurr,bitIndex3)| -v385(VarCurr,bitIndex2)| -v385(VarCurr,bitIndex1)| -v385(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -v409(VarCurr)| -v410(VarCurr,bitIndex1)|$F.
% 94.93/94.31  0 [] -v409(VarCurr)|v410(VarCurr,bitIndex1)| -$F.
% 94.93/94.31  0 [] -v409(VarCurr)| -v410(VarCurr,bitIndex0)|$T.
% 94.93/94.31  0 [] -v409(VarCurr)|v410(VarCurr,bitIndex0)| -$T.
% 94.93/94.31  0 [] v409(VarCurr)|v410(VarCurr,bitIndex1)|$F|v410(VarCurr,bitIndex0)|$T.
% 94.93/94.31  0 [] v409(VarCurr)|v410(VarCurr,bitIndex1)|$F| -v410(VarCurr,bitIndex0)| -$T.
% 94.93/94.31  0 [] v409(VarCurr)| -v410(VarCurr,bitIndex1)| -$F|v410(VarCurr,bitIndex0)|$T.
% 94.93/94.31  0 [] v409(VarCurr)| -v410(VarCurr,bitIndex1)| -$F| -v410(VarCurr,bitIndex0)| -$T.
% 94.93/94.31  0 [] -v410(VarCurr,bitIndex0)|v403(VarCurr).
% 94.93/94.31  0 [] v410(VarCurr,bitIndex0)| -v403(VarCurr).
% 94.93/94.31  0 [] -v410(VarCurr,bitIndex1)|v391(VarCurr).
% 94.93/94.31  0 [] v410(VarCurr,bitIndex1)| -v391(VarCurr).
% 94.93/94.31  0 [] -range_4_0(B)| -v385(constB0,B)|$F.
% 94.93/94.31  0 [] -range_4_0(B)|v385(constB0,B)| -$F.
% 94.93/94.31  0 [] -v407(VarCurr)| -v408(VarCurr,bitIndex1)|$F.
% 94.93/94.31  0 [] -v407(VarCurr)|v408(VarCurr,bitIndex1)| -$F.
% 94.93/94.31  0 [] -v407(VarCurr)| -v408(VarCurr,bitIndex0)|$F.
% 94.93/94.31  0 [] -v407(VarCurr)|v408(VarCurr,bitIndex0)| -$F.
% 94.93/94.31  0 [] v407(VarCurr)|v408(VarCurr,bitIndex1)|$F|v408(VarCurr,bitIndex0).
% 94.93/94.31  0 [] v407(VarCurr)| -v408(VarCurr,bitIndex1)| -$F| -v408(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -v408(VarCurr,bitIndex0)|v403(VarCurr).
% 94.93/94.31  0 [] v408(VarCurr,bitIndex0)| -v403(VarCurr).
% 94.93/94.31  0 [] -v408(VarCurr,bitIndex1)|v391(VarCurr).
% 94.93/94.31  0 [] v408(VarCurr,bitIndex1)| -v391(VarCurr).
% 94.93/94.31  0 [] -v403(VarCurr)|v332(VarCurr).
% 94.93/94.31  0 [] v403(VarCurr)| -v332(VarCurr).
% 94.93/94.31  0 [] -v391(VarCurr)|v393(VarCurr).
% 94.93/94.31  0 [] v391(VarCurr)| -v393(VarCurr).
% 94.93/94.31  0 [] -v393(VarCurr)|v395(VarCurr).
% 94.93/94.31  0 [] v393(VarCurr)| -v395(VarCurr).
% 94.93/94.31  0 [] -v395(VarCurr)|v397(VarCurr).
% 94.93/94.31  0 [] v395(VarCurr)| -v397(VarCurr).
% 94.93/94.31  0 [] -v397(VarCurr)|v399(VarCurr).
% 94.93/94.31  0 [] v397(VarCurr)| -v399(VarCurr).
% 94.93/94.31  0 [] -v399(VarCurr)|v401(VarCurr).
% 94.93/94.31  0 [] v399(VarCurr)| -v401(VarCurr).
% 94.93/94.31  0 [] -v387(VarCurr)|v336(VarCurr).
% 94.93/94.31  0 [] v387(VarCurr)| -v336(VarCurr).
% 94.93/94.31  0 [] -v354(VarCurr)|v356(VarCurr).
% 94.93/94.31  0 [] v354(VarCurr)| -v356(VarCurr).
% 94.93/94.31  0 [] v374(VarCurr)| -v356(VarCurr)|$F.
% 94.93/94.31  0 [] v374(VarCurr)|v356(VarCurr)| -$F.
% 94.93/94.31  0 [] -v374(VarCurr)| -v356(VarCurr)|$T.
% 94.93/94.31  0 [] -v374(VarCurr)|v356(VarCurr)| -$T.
% 94.93/94.31  0 [] -v374(VarCurr)|v375(VarCurr).
% 94.93/94.31  0 [] -v374(VarCurr)|v366(VarCurr).
% 94.93/94.31  0 [] v374(VarCurr)| -v375(VarCurr)| -v366(VarCurr).
% 94.93/94.31  0 [] v375(VarCurr)|v358(VarCurr,bitIndex8).
% 94.93/94.31  0 [] -v375(VarCurr)| -v358(VarCurr,bitIndex8).
% 94.93/94.31  0 [] -v366(VarCurr)|v368(VarCurr).
% 94.93/94.31  0 [] v366(VarCurr)| -v368(VarCurr).
% 94.93/94.31  0 [] -v368(VarCurr)|v370(VarCurr).
% 94.93/94.31  0 [] v368(VarCurr)| -v370(VarCurr).
% 94.93/94.31  0 [] -v370(VarCurr)|v372(VarCurr).
% 94.93/94.31  0 [] v370(VarCurr)| -v372(VarCurr).
% 94.93/94.31  0 [] -v358(VarCurr,bitIndex8)|v360(VarCurr,bitIndex8).
% 94.93/94.31  0 [] v358(VarCurr,bitIndex8)| -v360(VarCurr,bitIndex8).
% 94.93/94.31  0 [] -v360(VarCurr,bitIndex8)|v362(VarCurr,bitIndex8).
% 94.93/94.31  0 [] v360(VarCurr,bitIndex8)| -v362(VarCurr,bitIndex8).
% 94.93/94.31  0 [] -v362(VarCurr,bitIndex8)|v364(VarCurr,bitIndex8).
% 94.93/94.31  0 [] v362(VarCurr,bitIndex8)| -v364(VarCurr,bitIndex8).
% 94.93/94.31  0 [] -v350(VarCurr)|v336(VarCurr).
% 94.93/94.31  0 [] v350(VarCurr)| -v336(VarCurr).
% 94.93/94.31  0 [] -v336(VarCurr)|v338(VarCurr).
% 94.93/94.31  0 [] v336(VarCurr)| -v338(VarCurr).
% 94.93/94.31  0 [] -v338(VarCurr)|v340(VarCurr).
% 94.93/94.31  0 [] v338(VarCurr)| -v340(VarCurr).
% 94.93/94.31  0 [] -v340(VarCurr)|v16(VarCurr).
% 94.93/94.31  0 [] v340(VarCurr)| -v16(VarCurr).
% 94.93/94.31  0 [] -v99(VarCurr)|v101(VarCurr).
% 94.93/94.31  0 [] v99(VarCurr)| -v101(VarCurr).
% 94.93/94.31  0 [] v101(VarCurr)|v103(VarCurr).
% 94.93/94.31  0 [] -v101(VarCurr)| -v103(VarCurr).
% 94.93/94.31  0 [] -v103(VarCurr)|v105(VarCurr).
% 94.93/94.31  0 [] v103(VarCurr)| -v105(VarCurr).
% 94.93/94.31  0 [] -v105(VarCurr)|v107(VarCurr).
% 94.93/94.31  0 [] v105(VarCurr)| -v107(VarCurr).
% 94.93/94.31  0 [] -v107(VarCurr)| -v109(VarCurr,bitIndex3)|$F.
% 94.93/94.31  0 [] -v107(VarCurr)|v109(VarCurr,bitIndex3)| -$F.
% 94.93/94.31  0 [] -v107(VarCurr)| -v109(VarCurr,bitIndex2)|$F.
% 94.93/94.31  0 [] -v107(VarCurr)|v109(VarCurr,bitIndex2)| -$F.
% 94.93/94.31  0 [] -v107(VarCurr)| -v109(VarCurr,bitIndex1)|$F.
% 94.93/94.31  0 [] -v107(VarCurr)|v109(VarCurr,bitIndex1)| -$F.
% 94.93/94.31  0 [] -v107(VarCurr)| -v109(VarCurr,bitIndex0)|$F.
% 94.93/94.31  0 [] -v107(VarCurr)|v109(VarCurr,bitIndex0)| -$F.
% 94.93/94.31  0 [] v107(VarCurr)|v109(VarCurr,bitIndex3)|$F|v109(VarCurr,bitIndex2)|v109(VarCurr,bitIndex1)|v109(VarCurr,bitIndex0).
% 94.93/94.31  0 [] v107(VarCurr)| -v109(VarCurr,bitIndex3)| -$F| -v109(VarCurr,bitIndex2)| -v109(VarCurr,bitIndex1)| -v109(VarCurr,bitIndex0).
% 94.93/94.31  0 [] -nextState(VarCurr,VarNext)|v291(VarNext)| -range_3_0(B)| -v109(VarNext,B)|v109(VarCurr,B).
% 94.93/94.31  0 [] -nextState(VarCurr,VarNext)|v291(VarNext)| -range_3_0(B)|v109(VarNext,B)| -v109(VarCurr,B).
% 94.93/94.31  0 [] -v291(VarNext)| -range_3_0(B)| -v109(VarNext,B)|v301(VarNext,B).
% 94.93/94.31  0 [] -v291(VarNext)| -range_3_0(B)|v109(VarNext,B)| -v301(VarNext,B).
% 94.93/94.31  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)| -v301(VarNext,B)|v299(VarCurr,B).
% 94.93/94.31  0 [] -nextState(VarCurr,VarNext)| -range_3_0(B)|v301(VarNext,B)| -v299(VarCurr,B).
% 94.93/94.32  0 [] v302(VarCurr)| -range_3_0(B)| -v299(VarCurr,B)|v111(VarCurr,B).
% 94.93/94.32  0 [] v302(VarCurr)| -range_3_0(B)|v299(VarCurr,B)| -v111(VarCurr,B).
% 94.93/94.32  0 [] -v302(VarCurr)| -range_3_0(B)| -v299(VarCurr,B)|$F.
% 94.93/94.32  0 [] -v302(VarCurr)| -range_3_0(B)|v299(VarCurr,B)| -$F.
% 94.93/94.32  0 [] v302(VarCurr)|v10(VarCurr).
% 94.93/94.32  0 [] -v302(VarCurr)| -v10(VarCurr).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)| -v291(VarNext)|v292(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)|v291(VarNext)| -v292(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)| -v292(VarNext)|v293(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)| -v292(VarNext)|v286(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)|v292(VarNext)| -v293(VarNext)| -v286(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)|v293(VarNext)|v295(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)| -v293(VarNext)| -v295(VarNext).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)| -v295(VarNext)|v286(VarCurr).
% 94.93/94.32  0 [] -nextState(VarCurr,VarNext)|v295(VarNext)| -v286(VarCurr).
% 94.93/94.32  0 [] -v286(VarCurr)|v288(VarCurr).
% 94.93/94.32  0 [] v286(VarCurr)| -v288(VarCurr).
% 94.93/94.32  0 [] -v288(VarCurr)|v197(VarCurr).
% 94.93/94.32  0 [] v288(VarCurr)| -v197(VarCurr).
% 94.93/94.32  0 [] v223(VarCurr)|v225(VarCurr)|v260(VarCurr)| -range_3_0(B)| -v111(VarCurr,B)|v109(VarCurr,B).
% 94.93/94.32  0 [] v223(VarCurr)|v225(VarCurr)|v260(VarCurr)| -range_3_0(B)|v111(VarCurr,B)| -v109(VarCurr,B).
% 94.93/94.32  0 [] -v260(VarCurr)| -range_3_0(B)| -v111(VarCurr,B)|v262(VarCurr,B).
% 94.93/94.32  0 [] -v260(VarCurr)| -range_3_0(B)|v111(VarCurr,B)| -v262(VarCurr,B).
% 94.93/94.32  0 [] -v225(VarCurr)| -range_3_0(B)| -v111(VarCurr,B)|v227(VarCurr,B).
% 94.93/94.32  0 [] -v225(VarCurr)| -range_3_0(B)|v111(VarCurr,B)| -v227(VarCurr,B).
% 94.93/94.32  0 [] -v223(VarCurr)| -range_3_0(B)| -v111(VarCurr,B)|v109(VarCurr,B).
% 94.93/94.32  0 [] -v223(VarCurr)| -range_3_0(B)|v111(VarCurr,B)| -v109(VarCurr,B).
% 94.93/94.32  0 [] -v283(VarCurr)| -v284(VarCurr,bitIndex1)|$T.
% 94.93/94.32  0 [] -v283(VarCurr)|v284(VarCurr,bitIndex1)| -$T.
% 94.93/94.32  0 [] -v283(VarCurr)| -v284(VarCurr,bitIndex0)|$T.
% 94.93/94.32  0 [] -v283(VarCurr)|v284(VarCurr,bitIndex0)| -$T.
% 94.93/94.32  0 [] v283(VarCurr)|v284(VarCurr,bitIndex1)|$T|v284(VarCurr,bitIndex0).
% 94.93/94.32  0 [] v283(VarCurr)| -v284(VarCurr,bitIndex1)| -$T| -v284(VarCurr,bitIndex0).
% 94.93/94.32  0 [] b11(bitIndex1).
% 94.93/94.32  0 [] b11(bitIndex0).
% 94.93/94.32  0 [] -v284(VarCurr,bitIndex0)|v23(VarCurr).
% 94.93/94.32  0 [] v284(VarCurr,bitIndex0)| -v23(VarCurr).
% 94.93/94.32  0 [] -v284(VarCurr,bitIndex1)|v113(VarCurr).
% 94.93/94.32  0 [] v284(VarCurr,bitIndex1)| -v113(VarCurr).
% 94.93/94.32  0 [] v263(VarCurr)| -range_3_0(B)| -v262(VarCurr,B)|v264(VarCurr,B).
% 94.93/94.32  0 [] v263(VarCurr)| -range_3_0(B)|v262(VarCurr,B)| -v264(VarCurr,B).
% 94.93/94.32  0 [] -v263(VarCurr)| -range_3_0(B)| -v262(VarCurr,B)|b0110(B).
% 94.93/94.32  0 [] -v263(VarCurr)| -range_3_0(B)|v262(VarCurr,B)| -b0110(B).
% 94.93/94.32  0 [] -v264(VarCurr,bitIndex0)|v280(VarCurr).
% 94.93/94.32  0 [] v264(VarCurr,bitIndex0)| -v280(VarCurr).
% 94.93/94.32  0 [] -v264(VarCurr,bitIndex1)|v278(VarCurr).
% 94.93/94.32  0 [] v264(VarCurr,bitIndex1)| -v278(VarCurr).
% 94.93/94.32  0 [] -v264(VarCurr,bitIndex2)|v273(VarCurr).
% 94.93/94.32  0 [] v264(VarCurr,bitIndex2)| -v273(VarCurr).
% 94.93/94.32  0 [] -v264(VarCurr,bitIndex3)|v266(VarCurr).
% 94.93/94.32  0 [] v264(VarCurr,bitIndex3)| -v266(VarCurr).
% 94.93/94.32  0 [] -v278(VarCurr)|v279(VarCurr).
% 94.93/94.32  0 [] -v278(VarCurr)|v282(VarCurr).
% 94.93/94.32  0 [] v278(VarCurr)| -v279(VarCurr)| -v282(VarCurr).
% 94.93/94.32  0 [] -v282(VarCurr)|v109(VarCurr,bitIndex0)|v109(VarCurr,bitIndex1).
% 94.93/94.32  0 [] v282(VarCurr)| -v109(VarCurr,bitIndex0).
% 94.93/94.32  0 [] v282(VarCurr)| -v109(VarCurr,bitIndex1).
% 94.93/94.32  0 [] -v279(VarCurr)|v280(VarCurr)|v281(VarCurr).
% 94.93/94.32  0 [] v279(VarCurr)| -v280(VarCurr).
% 94.93/94.32  0 [] v279(VarCurr)| -v281(VarCurr).
% 94.93/94.32  0 [] v281(VarCurr)|v109(VarCurr,bitIndex1).
% 94.93/94.32  0 [] -v281(VarCurr)| -v109(VarCurr,bitIndex1).
% 94.93/94.32  0 [] v280(VarCurr)|v109(VarCurr,bitIndex0).
% 94.93/94.32  0 [] -v280(VarCurr)| -v109(VarCurr,bitIndex0).
% 94.93/94.32  0 [] -v273(VarCurr)|v274(VarCurr).
% 94.93/94.32  0 [] -v273(VarCurr)|v277(VarCurr).
% 94.93/94.32  0 [] v273(VarCurr)| -v274(VarCurr)| -v277(VarCurr).
% 94.93/94.32  0 [] -v277(VarCurr)|v270(VarCurr)|v109(VarCurr,bitIndex2).
% 94.93/94.32  0 [] v277(VarCurr)| -v270(VarCurr).
% 94.93/94.32  0 [] v277(VarCurr)| -v109(VarCurr,bitIndex2).
% 94.93/94.32  0 [] -v274(VarCurr)|v275(VarCurr)|v276(VarCurr).
% 94.93/94.32  0 [] v274(VarCurr)| -v275(VarCurr).
% 94.93/94.32  0 [] v274(VarCurr)| -v276(VarCurr).
% 94.93/94.32  0 [] v276(VarCurr)|v109(VarCurr,bitIndex2).
% 94.93/94.32  0 [] -v276(VarCurr)| -v109(VarCurr,bitIndex2).
% 94.93/94.32  0 [] v275(VarCurr)|v270(VarCurr).
% 94.93/94.32  0 [] -v275(VarCurr)| -v270(VarCurr).
% 94.93/94.32  0 [] -v266(VarCurr)|v267(VarCurr).
% 94.93/94.32  0 [] -v266(VarCurr)|v272(VarCurr).
% 94.99/94.32  0 [] v266(VarCurr)| -v267(VarCurr)| -v272(VarCurr).
% 94.99/94.32  0 [] -v272(VarCurr)|v269(VarCurr)|v109(VarCurr,bitIndex3).
% 94.99/94.32  0 [] v272(VarCurr)| -v269(VarCurr).
% 94.99/94.32  0 [] v272(VarCurr)| -v109(VarCurr,bitIndex3).
% 94.99/94.32  0 [] -v267(VarCurr)|v268(VarCurr)|v271(VarCurr).
% 94.99/94.32  0 [] v267(VarCurr)| -v268(VarCurr).
% 94.99/94.32  0 [] v267(VarCurr)| -v271(VarCurr).
% 94.99/94.32  0 [] v271(VarCurr)|v109(VarCurr,bitIndex3).
% 94.99/94.32  0 [] -v271(VarCurr)| -v109(VarCurr,bitIndex3).
% 94.99/94.32  0 [] v268(VarCurr)|v269(VarCurr).
% 94.99/94.32  0 [] -v268(VarCurr)| -v269(VarCurr).
% 94.99/94.32  0 [] -v269(VarCurr)|v270(VarCurr).
% 94.99/94.32  0 [] -v269(VarCurr)|v109(VarCurr,bitIndex2).
% 94.99/94.32  0 [] v269(VarCurr)| -v270(VarCurr)| -v109(VarCurr,bitIndex2).
% 94.99/94.32  0 [] -v270(VarCurr)|v109(VarCurr,bitIndex0).
% 94.99/94.32  0 [] -v270(VarCurr)|v109(VarCurr,bitIndex1).
% 94.99/94.32  0 [] v270(VarCurr)| -v109(VarCurr,bitIndex0)| -v109(VarCurr,bitIndex1).
% 94.99/94.32  0 [] -v263(VarCurr)| -v109(VarCurr,bitIndex3)|$F.
% 94.99/94.32  0 [] -v263(VarCurr)|v109(VarCurr,bitIndex3)| -$F.
% 94.99/94.32  0 [] -v263(VarCurr)| -v109(VarCurr,bitIndex2)|$T.
% 94.99/94.32  0 [] -v263(VarCurr)|v109(VarCurr,bitIndex2)| -$T.
% 94.99/94.32  0 [] -v263(VarCurr)| -v109(VarCurr,bitIndex1)|$T.
% 94.99/94.32  0 [] -v263(VarCurr)|v109(VarCurr,bitIndex1)| -$T.
% 94.99/94.32  0 [] -v263(VarCurr)| -v109(VarCurr,bitIndex0)|$F.
% 94.99/94.32  0 [] -v263(VarCurr)|v109(VarCurr,bitIndex0)| -$F.
% 94.99/94.32  0 [] v263(VarCurr)|v109(VarCurr,bitIndex3)|$F|v109(VarCurr,bitIndex2)|$T|v109(VarCurr,bitIndex1)|v109(VarCurr,bitIndex0).
% 94.99/94.32  0 [] v263(VarCurr)|v109(VarCurr,bitIndex3)|$F| -v109(VarCurr,bitIndex2)| -$T| -v109(VarCurr,bitIndex1)|v109(VarCurr,bitIndex0).
% 94.99/94.32  0 [] v263(VarCurr)| -v109(VarCurr,bitIndex3)| -$F|v109(VarCurr,bitIndex2)|$T|v109(VarCurr,bitIndex1)| -v109(VarCurr,bitIndex0).
% 94.99/94.32  0 [] v263(VarCurr)| -v109(VarCurr,bitIndex3)| -$F| -v109(VarCurr,bitIndex2)| -$T| -v109(VarCurr,bitIndex1)| -v109(VarCurr,bitIndex0).
% 94.99/94.32  0 [] -b0110(bitIndex3).
% 94.99/94.32  0 [] b0110(bitIndex2).
% 94.99/94.32  0 [] b0110(bitIndex1).
% 94.99/94.32  0 [] -b0110(bitIndex0).
% 94.99/94.32  0 [] -v260(VarCurr)| -v261(VarCurr,bitIndex1)|$T.
% 94.99/94.32  0 [] -v260(VarCurr)|v261(VarCurr,bitIndex1)| -$T.
% 94.99/94.32  0 [] -v260(VarCurr)| -v261(VarCurr,bitIndex0)|$F.
% 94.99/94.32  0 [] -v260(VarCurr)|v261(VarCurr,bitIndex0)| -$F.
% 94.99/94.32  0 [] v260(VarCurr)|v261(VarCurr,bitIndex1)|$T|v261(VarCurr,bitIndex0)|$F.
% 94.99/94.32  0 [] v260(VarCurr)|v261(VarCurr,bitIndex1)|$T| -v261(VarCurr,bitIndex0)| -$F.
% 94.99/94.32  0 [] v260(VarCurr)| -v261(VarCurr,bitIndex1)| -$T|v261(VarCurr,bitIndex0)|$F.
% 94.99/94.32  0 [] v260(VarCurr)| -v261(VarCurr,bitIndex1)| -$T| -v261(VarCurr,bitIndex0)| -$F.
% 94.99/94.32  0 [] b10(bitIndex1).
% 94.99/94.32  0 [] -b10(bitIndex0).
% 94.99/94.32  0 [] -v261(VarCurr,bitIndex0)|v23(VarCurr).
% 94.99/94.32  0 [] v261(VarCurr,bitIndex0)| -v23(VarCurr).
% 94.99/94.32  0 [] -v261(VarCurr,bitIndex1)|v113(VarCurr).
% 94.99/94.32  0 [] v261(VarCurr,bitIndex1)| -v113(VarCurr).
% 94.99/94.32  0 [] v228(VarCurr)| -range_31_0(B)| -v227(VarCurr,B)|v229(VarCurr,B).
% 94.99/94.32  0 [] v228(VarCurr)| -range_31_0(B)|v227(VarCurr,B)| -v229(VarCurr,B).
% 94.99/94.32  0 [] -v228(VarCurr)| -range_31_0(B)| -v227(VarCurr,B)|$F.
% 94.99/94.32  0 [] -v228(VarCurr)| -range_31_0(B)|v227(VarCurr,B)| -$F.
% 94.99/94.32  0 [] -range_31_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B|bitIndex7=B|bitIndex8=B|bitIndex9=B|bitIndex10=B|bitIndex11=B|bitIndex12=B|bitIndex13=B|bitIndex14=B|bitIndex15=B|bitIndex16=B|bitIndex17=B|bitIndex18=B|bitIndex19=B|bitIndex20=B|bitIndex21=B|bitIndex22=B|bitIndex23=B|bitIndex24=B|bitIndex25=B|bitIndex26=B|bitIndex27=B|bitIndex28=B|bitIndex29=B|bitIndex30=B|bitIndex31=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex0!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex1!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex2!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex3!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex4!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex5!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex6!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex7!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex8!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex9!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex10!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex11!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex12!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex13!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex14!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex15!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex16!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex17!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex18!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex19!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex20!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex21!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex22!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex23!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex24!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex25!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex26!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex27!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex28!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex29!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex30!=B.
% 94.99/94.32  0 [] range_31_0(B)|bitIndex31!=B.
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex31).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex30).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex29).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex28).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex27).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex26).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex25).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex24).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex23).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex22).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex21).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex20).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex19).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex18).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex17).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex16).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex15).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex14).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex13).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex12).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex11).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex10).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex9).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex8).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex7).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex6).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex5).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex4).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex3).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex2).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex1).
% 94.99/94.32  0 [] -b00000000000000000000000000000000(bitIndex0).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex5)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex5)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex6)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex6)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex7)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex7)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex8)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex8)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex9)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex9)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex10)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex10)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex11)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex11)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex12)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex12)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex13)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex13)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex14)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex14)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex15)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex15)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex16)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex16)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex17)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex17)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex18)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex18)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex19)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex19)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex20)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex20)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex21)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex21)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex22)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex22)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex23)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex23)| -v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] -v229(VarCurr,bitIndex24)|v230(VarCurr,bitIndex4).
% 94.99/94.32  0 [] v229(VarCurr,bitIndex24)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex25)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex25)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex26)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex26)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex27)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex27)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex28)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex28)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex29)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex29)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex30)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex30)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v229(VarCurr,bitIndex31)|v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v229(VarCurr,bitIndex31)| -v230(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -range_4_0(B)| -v229(VarCurr,B)|v230(VarCurr,B).
% 94.99/94.33  0 [] -range_4_0(B)|v229(VarCurr,B)| -v230(VarCurr,B).
% 94.99/94.33  0 [] -v230(VarCurr,bitIndex0)|v258(VarCurr).
% 94.99/94.33  0 [] v230(VarCurr,bitIndex0)| -v258(VarCurr).
% 94.99/94.33  0 [] -v230(VarCurr,bitIndex1)|v256(VarCurr).
% 94.99/94.33  0 [] v230(VarCurr,bitIndex1)| -v256(VarCurr).
% 94.99/94.33  0 [] -v230(VarCurr,bitIndex2)|v252(VarCurr).
% 94.99/94.33  0 [] v230(VarCurr,bitIndex2)| -v252(VarCurr).
% 94.99/94.33  0 [] -v230(VarCurr,bitIndex3)|v248(VarCurr).
% 94.99/94.33  0 [] v230(VarCurr,bitIndex3)| -v248(VarCurr).
% 94.99/94.33  0 [] -v230(VarCurr,bitIndex4)|v232(VarCurr).
% 94.99/94.33  0 [] v230(VarCurr,bitIndex4)| -v232(VarCurr).
% 94.99/94.33  0 [] -v256(VarCurr)|v257(VarCurr).
% 94.99/94.33  0 [] -v256(VarCurr)|v259(VarCurr).
% 94.99/94.33  0 [] v256(VarCurr)| -v257(VarCurr)| -v259(VarCurr).
% 94.99/94.33  0 [] -v259(VarCurr)|v236(VarCurr,bitIndex0)|v243(VarCurr).
% 94.99/94.33  0 [] v259(VarCurr)| -v236(VarCurr,bitIndex0).
% 94.99/94.33  0 [] v259(VarCurr)| -v243(VarCurr).
% 94.99/94.33  0 [] -v257(VarCurr)|v258(VarCurr)|v236(VarCurr,bitIndex1).
% 94.99/94.33  0 [] v257(VarCurr)| -v258(VarCurr).
% 94.99/94.33  0 [] v257(VarCurr)| -v236(VarCurr,bitIndex1).
% 94.99/94.33  0 [] v258(VarCurr)|v236(VarCurr,bitIndex0).
% 94.99/94.33  0 [] -v258(VarCurr)| -v236(VarCurr,bitIndex0).
% 94.99/94.33  0 [] -v252(VarCurr)|v253(VarCurr).
% 94.99/94.33  0 [] -v252(VarCurr)|v255(VarCurr).
% 94.99/94.33  0 [] v252(VarCurr)| -v253(VarCurr)| -v255(VarCurr).
% 94.99/94.33  0 [] -v255(VarCurr)|v241(VarCurr)|v244(VarCurr).
% 94.99/94.33  0 [] v255(VarCurr)| -v241(VarCurr).
% 94.99/94.33  0 [] v255(VarCurr)| -v244(VarCurr).
% 94.99/94.33  0 [] -v253(VarCurr)|v254(VarCurr)|v236(VarCurr,bitIndex2).
% 94.99/94.33  0 [] v253(VarCurr)| -v254(VarCurr).
% 94.99/94.33  0 [] v253(VarCurr)| -v236(VarCurr,bitIndex2).
% 94.99/94.33  0 [] v254(VarCurr)|v241(VarCurr).
% 94.99/94.33  0 [] -v254(VarCurr)| -v241(VarCurr).
% 94.99/94.33  0 [] -v248(VarCurr)|v249(VarCurr).
% 94.99/94.33  0 [] -v248(VarCurr)|v251(VarCurr).
% 94.99/94.33  0 [] v248(VarCurr)| -v249(VarCurr)| -v251(VarCurr).
% 94.99/94.33  0 [] -v251(VarCurr)|v239(VarCurr)|v245(VarCurr).
% 94.99/94.33  0 [] v251(VarCurr)| -v239(VarCurr).
% 94.99/94.33  0 [] v251(VarCurr)| -v245(VarCurr).
% 94.99/94.33  0 [] -v249(VarCurr)|v250(VarCurr)|v236(VarCurr,bitIndex3).
% 94.99/94.33  0 [] v249(VarCurr)| -v250(VarCurr).
% 94.99/94.33  0 [] v249(VarCurr)| -v236(VarCurr,bitIndex3).
% 94.99/94.33  0 [] v250(VarCurr)|v239(VarCurr).
% 94.99/94.33  0 [] -v250(VarCurr)| -v239(VarCurr).
% 94.99/94.33  0 [] -v232(VarCurr)|v233(VarCurr).
% 94.99/94.33  0 [] -v232(VarCurr)|v246(VarCurr).
% 94.99/94.33  0 [] v232(VarCurr)| -v233(VarCurr)| -v246(VarCurr).
% 94.99/94.33  0 [] -v246(VarCurr)|v235(VarCurr)|v247(VarCurr).
% 94.99/94.33  0 [] v246(VarCurr)| -v235(VarCurr).
% 94.99/94.33  0 [] v246(VarCurr)| -v247(VarCurr).
% 94.99/94.33  0 [] v247(VarCurr)|v236(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v247(VarCurr)| -v236(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -v233(VarCurr)|v234(VarCurr)|v236(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v233(VarCurr)| -v234(VarCurr).
% 94.99/94.33  0 [] v233(VarCurr)| -v236(VarCurr,bitIndex4).
% 94.99/94.33  0 [] v234(VarCurr)|v235(VarCurr).
% 94.99/94.33  0 [] -v234(VarCurr)| -v235(VarCurr).
% 94.99/94.33  0 [] -v235(VarCurr)|v236(VarCurr,bitIndex3)|v238(VarCurr).
% 94.99/94.33  0 [] v235(VarCurr)| -v236(VarCurr,bitIndex3).
% 94.99/94.33  0 [] v235(VarCurr)| -v238(VarCurr).
% 94.99/94.33  0 [] -v238(VarCurr)|v239(VarCurr).
% 94.99/94.33  0 [] -v238(VarCurr)|v245(VarCurr).
% 94.99/94.33  0 [] v238(VarCurr)| -v239(VarCurr)| -v245(VarCurr).
% 94.99/94.33  0 [] v245(VarCurr)|v236(VarCurr,bitIndex3).
% 94.99/94.33  0 [] -v245(VarCurr)| -v236(VarCurr,bitIndex3).
% 94.99/94.33  0 [] -v239(VarCurr)|v236(VarCurr,bitIndex2)|v240(VarCurr).
% 94.99/94.33  0 [] v239(VarCurr)| -v236(VarCurr,bitIndex2).
% 94.99/94.33  0 [] v239(VarCurr)| -v240(VarCurr).
% 94.99/94.33  0 [] -v240(VarCurr)|v241(VarCurr).
% 94.99/94.33  0 [] -v240(VarCurr)|v244(VarCurr).
% 94.99/94.33  0 [] v240(VarCurr)| -v241(VarCurr)| -v244(VarCurr).
% 94.99/94.33  0 [] v244(VarCurr)|v236(VarCurr,bitIndex2).
% 94.99/94.33  0 [] -v244(VarCurr)| -v236(VarCurr,bitIndex2).
% 94.99/94.33  0 [] -v241(VarCurr)|v236(VarCurr,bitIndex1)|v242(VarCurr).
% 94.99/94.33  0 [] v241(VarCurr)| -v236(VarCurr,bitIndex1).
% 94.99/94.33  0 [] v241(VarCurr)| -v242(VarCurr).
% 94.99/94.33  0 [] -v242(VarCurr)|v236(VarCurr,bitIndex0).
% 94.99/94.33  0 [] -v242(VarCurr)|v243(VarCurr).
% 94.99/94.33  0 [] v242(VarCurr)| -v236(VarCurr,bitIndex0)| -v243(VarCurr).
% 94.99/94.33  0 [] v243(VarCurr)|v236(VarCurr,bitIndex1).
% 94.99/94.33  0 [] -v243(VarCurr)| -v236(VarCurr,bitIndex1).
% 94.99/94.33  0 [] -v236(VarCurr,bitIndex4).
% 94.99/94.33  0 [] -range_3_0(B)| -v236(VarCurr,B)|v109(VarCurr,B).
% 94.99/94.33  0 [] -range_3_0(B)|v236(VarCurr,B)| -v109(VarCurr,B).
% 94.99/94.33  0 [] -v228(VarCurr)| -v109(VarCurr,bitIndex3)|$F.
% 94.99/94.33  0 [] -v228(VarCurr)|v109(VarCurr,bitIndex3)| -$F.
% 94.99/94.33  0 [] -v228(VarCurr)| -v109(VarCurr,bitIndex2)|$F.
% 94.99/94.33  0 [] -v228(VarCurr)|v109(VarCurr,bitIndex2)| -$F.
% 94.99/94.33  0 [] -v228(VarCurr)| -v109(VarCurr,bitIndex1)|$F.
% 94.99/94.33  0 [] -v228(VarCurr)|v109(VarCurr,bitIndex1)| -$F.
% 94.99/94.33  0 [] -v228(VarCurr)| -v109(VarCurr,bitIndex0)|$F.
% 94.99/94.33  0 [] -v228(VarCurr)|v109(VarCurr,bitIndex0)| -$F.
% 94.99/94.33  0 [] v228(VarCurr)|v109(VarCurr,bitIndex3)|$F|v109(VarCurr,bitIndex2)|v109(VarCurr,bitIndex1)|v109(VarCurr,bitIndex0).
% 94.99/94.33  0 [] v228(VarCurr)| -v109(VarCurr,bitIndex3)| -$F| -v109(VarCurr,bitIndex2)| -v109(VarCurr,bitIndex1)| -v109(VarCurr,bitIndex0).
% 94.99/94.33  0 [] -v225(VarCurr)| -v226(VarCurr,bitIndex1)|$F.
% 94.99/94.33  0 [] -v225(VarCurr)|v226(VarCurr,bitIndex1)| -$F.
% 94.99/94.33  0 [] -v225(VarCurr)| -v226(VarCurr,bitIndex0)|$T.
% 94.99/94.33  0 [] -v225(VarCurr)|v226(VarCurr,bitIndex0)| -$T.
% 94.99/94.33  0 [] v225(VarCurr)|v226(VarCurr,bitIndex1)|$F|v226(VarCurr,bitIndex0)|$T.
% 94.99/94.33  0 [] v225(VarCurr)|v226(VarCurr,bitIndex1)|$F| -v226(VarCurr,bitIndex0)| -$T.
% 94.99/94.33  0 [] v225(VarCurr)| -v226(VarCurr,bitIndex1)| -$F|v226(VarCurr,bitIndex0)|$T.
% 94.99/94.33  0 [] v225(VarCurr)| -v226(VarCurr,bitIndex1)| -$F| -v226(VarCurr,bitIndex0)| -$T.
% 94.99/94.33  0 [] -b01(bitIndex1).
% 94.99/94.33  0 [] b01(bitIndex0).
% 94.99/94.33  0 [] -v226(VarCurr,bitIndex0)|v23(VarCurr).
% 94.99/94.33  0 [] v226(VarCurr,bitIndex0)| -v23(VarCurr).
% 94.99/94.33  0 [] -v226(VarCurr,bitIndex1)|v113(VarCurr).
% 94.99/94.33  0 [] v226(VarCurr,bitIndex1)| -v113(VarCurr).
% 94.99/94.33  0 [] -range_3_0(B)| -v109(constB0,B)|$F.
% 94.99/94.33  0 [] -range_3_0(B)|v109(constB0,B)| -$F.
% 94.99/94.33  0 [] -v223(VarCurr)| -v224(VarCurr,bitIndex1)|$F.
% 94.99/94.33  0 [] -v223(VarCurr)|v224(VarCurr,bitIndex1)| -$F.
% 94.99/94.33  0 [] -v223(VarCurr)| -v224(VarCurr,bitIndex0)|$F.
% 94.99/94.33  0 [] -v223(VarCurr)|v224(VarCurr,bitIndex0)| -$F.
% 94.99/94.33  0 [] v223(VarCurr)|v224(VarCurr,bitIndex1)|$F|v224(VarCurr,bitIndex0).
% 94.99/94.33  0 [] v223(VarCurr)| -v224(VarCurr,bitIndex1)| -$F| -v224(VarCurr,bitIndex0).
% 94.99/94.33  0 [] -b00(bitIndex1).
% 94.99/94.33  0 [] -b00(bitIndex0).
% 94.99/94.33  0 [] -v224(VarCurr,bitIndex0)|v23(VarCurr).
% 94.99/94.33  0 [] v224(VarCurr,bitIndex0)| -v23(VarCurr).
% 94.99/94.33  0 [] -v224(VarCurr,bitIndex1)|v113(VarCurr).
% 94.99/94.33  0 [] v224(VarCurr,bitIndex1)| -v113(VarCurr).
% 94.99/94.33  0 [] -v113(VarCurr)|v115(VarCurr).
% 94.99/94.33  0 [] v113(VarCurr)| -v115(VarCurr).
% 94.99/94.33  0 [] -v115(VarCurr)|v117(VarCurr).
% 94.99/94.33  0 [] v115(VarCurr)| -v117(VarCurr).
% 94.99/94.33  0 [] -v117(VarCurr)|v119(VarCurr).
% 94.99/94.33  0 [] v117(VarCurr)| -v119(VarCurr).
% 94.99/94.33  0 [] -v119(VarCurr)|v121(VarCurr).
% 94.99/94.33  0 [] v119(VarCurr)| -v121(VarCurr).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v200(VarNext)| -v121(VarNext)|v121(VarCurr).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v200(VarNext)|v121(VarNext)| -v121(VarCurr).
% 94.99/94.33  0 [] -v200(VarNext)| -v121(VarNext)|v210(VarNext).
% 94.99/94.33  0 [] -v200(VarNext)|v121(VarNext)| -v210(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v210(VarNext)|v208(VarCurr).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v210(VarNext)| -v208(VarCurr).
% 94.99/94.33  0 [] v211(VarCurr)| -v208(VarCurr)|v127(VarCurr).
% 94.99/94.33  0 [] v211(VarCurr)|v208(VarCurr)| -v127(VarCurr).
% 94.99/94.33  0 [] -v211(VarCurr)| -v208(VarCurr)|$F.
% 94.99/94.33  0 [] -v211(VarCurr)|v208(VarCurr)| -$F.
% 94.99/94.33  0 [] v211(VarCurr)|v123(VarCurr).
% 94.99/94.33  0 [] -v211(VarCurr)| -v123(VarCurr).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v200(VarNext)|v201(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v200(VarNext)| -v201(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v201(VarNext)|v202(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v201(VarNext)|v193(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v201(VarNext)| -v202(VarNext)| -v193(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v202(VarNext)|v204(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v202(VarNext)| -v204(VarNext).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)| -v204(VarNext)|v193(VarCurr).
% 94.99/94.33  0 [] -nextState(VarCurr,VarNext)|v204(VarNext)| -v193(VarCurr).
% 94.99/94.33  0 [] -v193(VarCurr)|v195(VarCurr).
% 94.99/94.33  0 [] v193(VarCurr)| -v195(VarCurr).
% 94.99/94.33  0 [] -v195(VarCurr)|v197(VarCurr).
% 94.99/94.34  0 [] v195(VarCurr)| -v197(VarCurr).
% 94.99/94.34  0 [] -v197(VarCurr)|v1(VarCurr).
% 94.99/94.34  0 [] v197(VarCurr)| -v1(VarCurr).
% 94.99/94.34  0 [] -v127(VarCurr)|v190(VarCurr).
% 94.99/94.34  0 [] -v127(VarCurr)|v178(VarCurr).
% 94.99/94.34  0 [] v127(VarCurr)| -v190(VarCurr)| -v178(VarCurr).
% 94.99/94.34  0 [] -v190(VarCurr)|v191(VarCurr).
% 94.99/94.34  0 [] -v190(VarCurr)|v139(VarCurr).
% 94.99/94.34  0 [] v190(VarCurr)| -v191(VarCurr)| -v139(VarCurr).
% 94.99/94.34  0 [] v191(VarCurr)|v129(VarCurr).
% 94.99/94.34  0 [] -v191(VarCurr)| -v129(VarCurr).
% 94.99/94.34  0 [] -v178(VarCurr)|v180(VarCurr).
% 94.99/94.34  0 [] v178(VarCurr)| -v180(VarCurr).
% 94.99/94.34  0 [] -v180(VarCurr)|v182(VarCurr).
% 94.99/94.34  0 [] v180(VarCurr)| -v182(VarCurr).
% 94.99/94.34  0 [] -v182(VarCurr)|v187(VarCurr)|v184(VarCurr,bitIndex2).
% 94.99/94.34  0 [] v182(VarCurr)| -v187(VarCurr).
% 94.99/94.34  0 [] v182(VarCurr)| -v184(VarCurr,bitIndex2).
% 94.99/94.34  0 [] -v187(VarCurr)|v184(VarCurr,bitIndex0)|v184(VarCurr,bitIndex1).
% 94.99/94.34  0 [] v187(VarCurr)| -v184(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v187(VarCurr)| -v184(VarCurr,bitIndex1).
% 94.99/94.34  0 [] -range_2_0(B)| -v184(constB0,B)|$T.
% 94.99/94.34  0 [] -range_2_0(B)|v184(constB0,B)| -$T.
% 94.99/94.34  0 [] -range_2_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B.
% 94.99/94.34  0 [] range_2_0(B)|bitIndex0!=B.
% 94.99/94.34  0 [] range_2_0(B)|bitIndex1!=B.
% 94.99/94.34  0 [] range_2_0(B)|bitIndex2!=B.
% 94.99/94.34  0 [] b111(bitIndex2).
% 94.99/94.34  0 [] b111(bitIndex1).
% 94.99/94.34  0 [] b111(bitIndex0).
% 94.99/94.34  0 [] -v139(VarCurr)|v141(VarCurr).
% 94.99/94.34  0 [] v139(VarCurr)| -v141(VarCurr).
% 94.99/94.34  0 [] -v141(VarCurr)|v143(VarCurr).
% 94.99/94.34  0 [] v141(VarCurr)| -v143(VarCurr).
% 94.99/94.34  0 [] v167(VarCurr)| -v143(VarCurr)|$F.
% 94.99/94.34  0 [] v167(VarCurr)|v143(VarCurr)| -$F.
% 94.99/94.34  0 [] -v167(VarCurr)| -v143(VarCurr)|$T.
% 94.99/94.34  0 [] -v167(VarCurr)|v143(VarCurr)| -$T.
% 94.99/94.34  0 [] -v167(VarCurr)|v168(VarCurr)|v176(VarCurr).
% 94.99/94.34  0 [] v167(VarCurr)| -v168(VarCurr).
% 94.99/94.34  0 [] v167(VarCurr)| -v176(VarCurr).
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex3)|$T.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex3)| -$T.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.99/94.34  0 [] -v176(VarCurr)| -v158(VarCurr,bitIndex0)|$T.
% 94.99/94.34  0 [] -v176(VarCurr)|v158(VarCurr,bitIndex0)| -$T.
% 94.99/94.34  0 [] v176(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$T|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v176(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$T|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v176(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|$T| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v176(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -$T| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] -b0001001(bitIndex6).
% 94.99/94.34  0 [] -b0001001(bitIndex5).
% 94.99/94.34  0 [] -b0001001(bitIndex4).
% 94.99/94.34  0 [] b0001001(bitIndex3).
% 94.99/94.34  0 [] -b0001001(bitIndex2).
% 94.99/94.34  0 [] -b0001001(bitIndex1).
% 94.99/94.34  0 [] b0001001(bitIndex0).
% 94.99/94.34  0 [] -v168(VarCurr)|v169(VarCurr)|v173(VarCurr).
% 94.99/94.34  0 [] v168(VarCurr)| -v169(VarCurr).
% 94.99/94.34  0 [] v168(VarCurr)| -v173(VarCurr).
% 94.99/94.34  0 [] -v173(VarCurr)|v174(VarCurr)|v175(VarCurr).
% 94.99/94.34  0 [] v173(VarCurr)| -v174(VarCurr).
% 94.99/94.34  0 [] v173(VarCurr)| -v175(VarCurr).
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.99/94.34  0 [] -v175(VarCurr)| -v158(VarCurr,bitIndex0)|$T.
% 94.99/94.34  0 [] -v175(VarCurr)|v158(VarCurr,bitIndex0)| -$T.
% 94.99/94.34  0 [] v175(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|$T|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v175(VarCurr)|v158(VarCurr,bitIndex6)|$F| -v158(VarCurr,bitIndex5)| -$T|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v175(VarCurr)| -v158(VarCurr,bitIndex6)| -$F|v158(VarCurr,bitIndex5)|$T| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v175(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -$T| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] -b0100001(bitIndex6).
% 94.99/94.34  0 [] b0100001(bitIndex5).
% 94.99/94.34  0 [] -b0100001(bitIndex4).
% 94.99/94.34  0 [] -b0100001(bitIndex3).
% 94.99/94.34  0 [] -b0100001(bitIndex2).
% 94.99/94.34  0 [] -b0100001(bitIndex1).
% 94.99/94.34  0 [] b0100001(bitIndex0).
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.99/94.34  0 [] -v174(VarCurr)| -v158(VarCurr,bitIndex0)|$T.
% 94.99/94.34  0 [] -v174(VarCurr)|v158(VarCurr,bitIndex0)| -$T.
% 94.99/94.34  0 [] v174(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0)|$T.
% 94.99/94.34  0 [] v174(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0)| -$T.
% 94.99/94.34  0 [] v174(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0)|$T.
% 94.99/94.34  0 [] v174(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0)| -$T.
% 94.99/94.34  0 [] -b0000001(bitIndex6).
% 94.99/94.34  0 [] -b0000001(bitIndex5).
% 94.99/94.34  0 [] -b0000001(bitIndex4).
% 94.99/94.34  0 [] -b0000001(bitIndex3).
% 94.99/94.34  0 [] -b0000001(bitIndex2).
% 94.99/94.34  0 [] -b0000001(bitIndex1).
% 94.99/94.34  0 [] b0000001(bitIndex0).
% 94.99/94.34  0 [] -v169(VarCurr)|v145(VarCurr,bitIndex0).
% 94.99/94.34  0 [] -v169(VarCurr)|v170(VarCurr).
% 94.99/94.34  0 [] v169(VarCurr)| -v145(VarCurr,bitIndex0)| -v170(VarCurr).
% 94.99/94.34  0 [] -v170(VarCurr)|v171(VarCurr)|v172(VarCurr).
% 94.99/94.34  0 [] v170(VarCurr)| -v171(VarCurr).
% 94.99/94.34  0 [] v170(VarCurr)| -v172(VarCurr).
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex5)|$T.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex5)| -$T.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.99/94.34  0 [] -v172(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.99/94.34  0 [] -v172(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.99/94.34  0 [] v172(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|$T|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.34  0 [] v172(VarCurr)|v158(VarCurr,bitIndex6)|$F| -v158(VarCurr,bitIndex5)| -$T|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.35  0 [] v172(VarCurr)| -v158(VarCurr,bitIndex6)| -$F|v158(VarCurr,bitIndex5)|$T| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.35  0 [] v172(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -$T| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.35  0 [] -b0100000(bitIndex6).
% 94.99/94.35  0 [] b0100000(bitIndex5).
% 94.99/94.35  0 [] -b0100000(bitIndex4).
% 94.99/94.35  0 [] -b0100000(bitIndex3).
% 94.99/94.35  0 [] -b0100000(bitIndex2).
% 94.99/94.35  0 [] -b0100000(bitIndex1).
% 94.99/94.35  0 [] -b0100000(bitIndex0).
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex6)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex6)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex5)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex5)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex4)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex4)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex3)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex3)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex2)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex2)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex1)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex1)| -$F.
% 94.99/94.35  0 [] -v171(VarCurr)| -v158(VarCurr,bitIndex0)|$F.
% 94.99/94.35  0 [] -v171(VarCurr)|v158(VarCurr,bitIndex0)| -$F.
% 94.99/94.35  0 [] v171(VarCurr)|v158(VarCurr,bitIndex6)|$F|v158(VarCurr,bitIndex5)|v158(VarCurr,bitIndex4)|v158(VarCurr,bitIndex3)|v158(VarCurr,bitIndex2)|v158(VarCurr,bitIndex1)|v158(VarCurr,bitIndex0).
% 94.99/94.35  0 [] v171(VarCurr)| -v158(VarCurr,bitIndex6)| -$F| -v158(VarCurr,bitIndex5)| -v158(VarCurr,bitIndex4)| -v158(VarCurr,bitIndex3)| -v158(VarCurr,bitIndex2)| -v158(VarCurr,bitIndex1)| -v158(VarCurr,bitIndex0).
% 94.99/94.35  0 [] -b0000000(bitIndex6).
% 94.99/94.35  0 [] -b0000000(bitIndex5).
% 94.99/94.35  0 [] -b0000000(bitIndex4).
% 94.99/94.35  0 [] -b0000000(bitIndex3).
% 94.99/94.35  0 [] -b0000000(bitIndex2).
% 94.99/94.35  0 [] -b0000000(bitIndex1).
% 94.99/94.35  0 [] -b0000000(bitIndex0).
% 94.99/94.35  0 [] -range_6_0(B)| -v158(VarCurr,B)|v160(VarCurr,B).
% 94.99/94.35  0 [] -range_6_0(B)|v158(VarCurr,B)| -v160(VarCurr,B).
% 94.99/94.35  0 [] -range_6_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B|bitIndex5=B|bitIndex6=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex0!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex1!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex2!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex3!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex4!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex5!=B.
% 94.99/94.35  0 [] range_6_0(B)|bitIndex6!=B.
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex6)|v149(VarCurr,bitIndex60).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex6)| -v149(VarCurr,bitIndex60).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex5)|v149(VarCurr,bitIndex59).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex5)| -v149(VarCurr,bitIndex59).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex4)|v149(VarCurr,bitIndex58).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex4)| -v149(VarCurr,bitIndex58).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex3)|v149(VarCurr,bitIndex57).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex3)| -v149(VarCurr,bitIndex57).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex2)|v149(VarCurr,bitIndex56).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex2)| -v149(VarCurr,bitIndex56).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex1)|v149(VarCurr,bitIndex55).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex1)| -v149(VarCurr,bitIndex55).
% 94.99/94.35  0 [] -v160(VarCurr,bitIndex0)|v149(VarCurr,bitIndex54).
% 94.99/94.35  0 [] v160(VarCurr,bitIndex0)| -v149(VarCurr,bitIndex54).
% 94.99/94.35  0 [] -range_60_54(B)| -v149(VarCurr,B)|v151(VarCurr,B).
% 94.99/94.35  0 [] -range_60_54(B)|v149(VarCurr,B)| -v151(VarCurr,B).
% 94.99/94.35  0 [] -range_60_54(B)| -v151(VarCurr,B)|v156(VarCurr,B).
% 94.99/94.35  0 [] -range_60_54(B)|v151(VarCurr,B)| -v156(VarCurr,B).
% 94.99/94.35  0 [] -range_60_54(B)|bitIndex54=B|bitIndex55=B|bitIndex56=B|bitIndex57=B|bitIndex58=B|bitIndex59=B|bitIndex60=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex54!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex55!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex56!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex57!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex58!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex59!=B.
% 94.99/94.35  0 [] range_60_54(B)|bitIndex60!=B.
% 94.99/94.35  0 [] -v145(VarCurr,bitIndex0)|v147(VarCurr,bitIndex0).
% 94.99/94.35  0 [] v145(VarCurr,bitIndex0)| -v147(VarCurr,bitIndex0).
% 94.99/94.35  0 [] -v147(VarCurr,bitIndex0)|v149(VarCurr,bitIndex12).
% 94.99/94.35  0 [] v147(VarCurr,bitIndex0)| -v149(VarCurr,bitIndex12).
% 94.99/94.35  0 [] -v149(VarCurr,bitIndex12)|v151(VarCurr,bitIndex12).
% 94.99/94.35  0 [] v149(VarCurr,bitIndex12)| -v151(VarCurr,bitIndex12).
% 94.99/94.35  0 [] -v151(VarCurr,bitIndex12)|v156(VarCurr,bitIndex12).
% 94.99/94.35  0 [] v151(VarCurr,bitIndex12)| -v156(VarCurr,bitIndex12).
% 94.99/94.35  0 [] -range_3_0(B)| -v155(constB0,B)|$F.
% 94.99/94.35  0 [] -range_3_0(B)|v155(constB0,B)| -$F.
% 94.99/94.35  0 [] -range_3_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B.
% 94.99/94.35  0 [] range_3_0(B)|bitIndex0!=B.
% 94.99/94.35  0 [] range_3_0(B)|bitIndex1!=B.
% 94.99/94.35  0 [] range_3_0(B)|bitIndex2!=B.
% 94.99/94.35  0 [] range_3_0(B)|bitIndex3!=B.
% 94.99/94.35  0 [] -b0000(bitIndex3).
% 94.99/94.35  0 [] -b0000(bitIndex2).
% 94.99/94.35  0 [] -b0000(bitIndex1).
% 94.99/94.35  0 [] -b0000(bitIndex0).
% 94.99/94.35  0 [] -v129(VarCurr)|v131(VarCurr).
% 94.99/94.35  0 [] v129(VarCurr)| -v131(VarCurr).
% 94.99/94.35  0 [] -v131(VarCurr)|v133(VarCurr).
% 94.99/94.35  0 [] v131(VarCurr)| -v133(VarCurr).
% 94.99/94.35  0 [] -v133(VarCurr)| -v135(VarCurr,bitIndex4)|$F.
% 94.99/94.35  0 [] -v133(VarCurr)|v135(VarCurr,bitIndex4)| -$F.
% 94.99/94.35  0 [] -v133(VarCurr)| -v135(VarCurr,bitIndex3)|$F.
% 94.99/94.35  0 [] -v133(VarCurr)|v135(VarCurr,bitIndex3)| -$F.
% 94.99/94.35  0 [] -v133(VarCurr)| -v135(VarCurr,bitIndex2)|$F.
% 94.99/94.35  0 [] -v133(VarCurr)|v135(VarCurr,bitIndex2)| -$F.
% 94.99/94.35  0 [] -v133(VarCurr)| -v135(VarCurr,bitIndex1)|$F.
% 94.99/94.35  0 [] -v133(VarCurr)|v135(VarCurr,bitIndex1)| -$F.
% 94.99/94.35  0 [] -v133(VarCurr)| -v135(VarCurr,bitIndex0)|$F.
% 94.99/94.35  0 [] -v133(VarCurr)|v135(VarCurr,bitIndex0)| -$F.
% 94.99/94.35  0 [] v133(VarCurr)|v135(VarCurr,bitIndex4)|$F|v135(VarCurr,bitIndex3)|v135(VarCurr,bitIndex2)|v135(VarCurr,bitIndex1)|v135(VarCurr,bitIndex0).
% 94.99/94.35  0 [] v133(VarCurr)| -v135(VarCurr,bitIndex4)| -$F| -v135(VarCurr,bitIndex3)| -v135(VarCurr,bitIndex2)| -v135(VarCurr,bitIndex1)| -v135(VarCurr,bitIndex0).
% 94.99/94.35  0 [] -range_4_0(B)| -v135(constB0,B)|$F.
% 94.99/94.35  0 [] -range_4_0(B)|v135(constB0,B)| -$F.
% 94.99/94.35  0 [] -range_4_0(B)|bitIndex0=B|bitIndex1=B|bitIndex2=B|bitIndex3=B|bitIndex4=B.
% 94.99/94.35  0 [] range_4_0(B)|bitIndex0!=B.
% 94.99/94.35  0 [] range_4_0(B)|bitIndex1!=B.
% 94.99/94.35  0 [] range_4_0(B)|bitIndex2!=B.
% 94.99/94.35  0 [] range_4_0(B)|bitIndex3!=B.
% 94.99/94.35  0 [] range_4_0(B)|bitIndex4!=B.
% 94.99/94.35  0 [] -b00000(bitIndex4).
% 94.99/94.35  0 [] -b00000(bitIndex3).
% 94.99/94.35  0 [] -b00000(bitIndex2).
% 94.99/94.35  0 [] -b00000(bitIndex1).
% 94.99/94.35  0 [] -b00000(bitIndex0).
% 94.99/94.35  0 [] -v123(VarCurr)|v125(VarCurr).
% 94.99/94.35  0 [] v123(VarCurr)| -v125(VarCurr).
% 94.99/94.35  0 [] -v125(VarCurr)|v14(VarCurr).
% 94.99/94.35  0 [] v125(VarCurr)| -v14(VarCurr).
% 94.99/94.35  0 [] -v58(VarCurr)|v60(VarCurr).
% 94.99/94.35  0 [] v58(VarCurr)| -v60(VarCurr).
% 94.99/94.35  0 [] -v60(VarCurr)|v62(VarCurr).
% 94.99/94.35  0 [] v60(VarCurr)| -v62(VarCurr).
% 94.99/94.35  0 [] -v62(VarCurr)|v64(VarCurr).
% 94.99/94.35  0 [] v62(VarCurr)| -v64(VarCurr).
% 94.99/94.35  0 [] -v64(VarCurr)|v16(VarCurr).
% 94.99/94.35  0 [] v64(VarCurr)| -v16(VarCurr).
% 94.99/94.35  0 [] -v33(VarCurr)|v12(VarCurr).
% 94.99/94.35  0 [] v33(VarCurr)| -v12(VarCurr).
% 94.99/94.35  0 [] -v10(VarCurr)|v12(VarCurr).
% 94.99/94.35  0 [] v10(VarCurr)| -v12(VarCurr).
% 94.99/94.35  0 [] -v12(VarCurr)|v14(VarCurr).
% 94.99/94.35  0 [] v12(VarCurr)| -v14(VarCurr).
% 94.99/94.35  0 [] -v14(VarCurr)|v16(VarCurr).
% 94.99/94.35  0 [] v14(VarCurr)| -v16(VarCurr).
% 94.99/94.35  0 [] -v16(VarCurr)|v18(VarCurr).
% 94.99/94.35  0 [] v16(VarCurr)| -v18(VarCurr).
% 94.99/94.35  end_of_list.
% 94.99/94.35  
% 94.99/94.35  SCAN INPUT: prop=0, horn=0, equality=1, symmetry=0, max_lits=68.
% 94.99/94.35  
% 94.99/94.35  This is a non-Horn set with equality.  The strategy will be
% 94.99/94.35  Knuth-Bendix, ordered hyper_res, factoring, and unit
% 94.99/94.35  deletion, with positive clauses in sos and nonpositive
% 94.99/94.35  clauses in usable.
% 94.99/94.35  
% 94.99/94.35     dependent: set(knuth_bendix).
% 94.99/94.35     dependent: set(anl_eq).
% 94.99/94.35     dependent: set(para_from).
% 94.99/94.35     dependent: set(para_into).
% 94.99/94.35     dependent: clear(para_from_right).
% 94.99/94.35     dependent: clear(para_into_right).
% 94.99/94.35     dependent: set(para_from_vars).
% 94.99/94.35     dependent: set(eq_units_both_ways).
% 94.99/94.35     dependent: set(dynamic_demod_all).
% 94.99/94.35     dependent: set(dynamic_demod).
% 94.99/94.35     dependent: set(order_eq).
% 94.99/94.35     dependent: set(back_demod).
% 94.99/94.35     dependent: set(lrpo).
% 94.99/94.35     dependent: set(hyper_res).
% 94.99/94.35     dependent: set(unit_deletion).
% 94.99/94.35     dependent: set(factor).
% 94.99/94.35  
% 94.99/94.35  ------------> process usable:
% 94.99/94.35  ** KEPT (pick-wt=5): 1 [] -nextState(A,B)|reachableState(A).
% 94.99/94.35  ** KEPT (pick-wt=5): 2 [] -nextState(A,B)|reachableState(B).
% 94.99/94.35  ** KEPT (pick-wt=65): 3 [] -reachableState(A)|constB0=A|constB1=A|constB2=A|constB3=A|constB4=A|constB5=A|constB6=A|constB7=A|constB8=A|constB9=A|constB10=A|constB11=A|constB12=A|constB13=A|constB14=A|constB15=A|constB16=A|constB17=A|constB18=A|constB19=A|constB20=A.
% 94.99/94.35  ** KEPT (pick-wt=7): 4 [] -nextState(A,B)| -v1(A)| -v1(B).
% 94.99/94.35  ** KEPT (pick-wt=7): 5 [] -nextState(A,B)|v1(A)|v1(B).
% 94.99/94.35  ** KEPT (pick-wt=2): 6 [] -v1(constB0).
% 94.99/94.35  ** KEPT (pick-wt=6): 7 [] -addressVal(v1019_range_3_to_0_address_term_bound_20,A)|v1019(constB20,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 8 [] addressVal(v1019_range_3_to_0_address_term_bound_20,A)| -v1019(constB20,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 9 [] -addressVal(v1019_range_3_to_0_address_term_bound_19,A)|v1019(constB19,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 10 [] addressVal(v1019_range_3_to_0_address_term_bound_19,A)| -v1019(constB19,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 11 [] -addressVal(v1019_range_3_to_0_address_term_bound_18,A)|v1019(constB18,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 12 [] addressVal(v1019_range_3_to_0_address_term_bound_18,A)| -v1019(constB18,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 13 [] -addressVal(v1019_range_3_to_0_address_term_bound_17,A)|v1019(constB17,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 14 [] addressVal(v1019_range_3_to_0_address_term_bound_17,A)| -v1019(constB17,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 15 [] -addressVal(v1019_range_3_to_0_address_term_bound_16,A)|v1019(constB16,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 16 [] addressVal(v1019_range_3_to_0_address_term_bound_16,A)| -v1019(constB16,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 17 [] -addressVal(v1019_range_3_to_0_address_term_bound_15,A)|v1019(constB15,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 18 [] addressVal(v1019_range_3_to_0_address_term_bound_15,A)| -v1019(constB15,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 19 [] -addressVal(v1019_range_3_to_0_address_term_bound_14,A)|v1019(constB14,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 20 [] addressVal(v1019_range_3_to_0_address_term_bound_14,A)| -v1019(constB14,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 21 [] -addressVal(v1019_range_3_to_0_address_term_bound_13,A)|v1019(constB13,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 22 [] addressVal(v1019_range_3_to_0_address_term_bound_13,A)| -v1019(constB13,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 23 [] -addressVal(v1019_range_3_to_0_address_term_bound_12,A)|v1019(constB12,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 24 [] addressVal(v1019_range_3_to_0_address_term_bound_12,A)| -v1019(constB12,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 25 [] -addressVal(v1019_range_3_to_0_address_term_bound_11,A)|v1019(constB11,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 26 [] addressVal(v1019_range_3_to_0_address_term_bound_11,A)| -v1019(constB11,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 27 [] -addressVal(v1019_range_3_to_0_address_term_bound_10,A)|v1019(constB10,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 28 [] addressVal(v1019_range_3_to_0_address_term_bound_10,A)| -v1019(constB10,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 29 [] -addressVal(v1019_range_3_to_0_address_term_bound_9,A)|v1019(constB9,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 30 [] addressVal(v1019_range_3_to_0_address_term_bound_9,A)| -v1019(constB9,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 31 [] -addressVal(v1019_range_3_to_0_address_term_bound_8,A)|v1019(constB8,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 32 [] addressVal(v1019_range_3_to_0_address_term_bound_8,A)| -v1019(constB8,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 33 [] -addressVal(v1019_range_3_to_0_address_term_bound_7,A)|v1019(constB7,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 34 [] addressVal(v1019_range_3_to_0_address_term_bound_7,A)| -v1019(constB7,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 35 [] -addressVal(v1019_range_3_to_0_address_term_bound_6,A)|v1019(constB6,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 36 [] addressVal(v1019_range_3_to_0_address_term_bound_6,A)| -v1019(constB6,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 37 [] -addressVal(v1019_range_3_to_0_address_term_bound_5,A)|v1019(constB5,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 38 [] addressVal(v1019_range_3_to_0_address_term_bound_5,A)| -v1019(constB5,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 39 [] -addressVal(v1019_range_3_to_0_address_term_bound_4,A)|v1019(constB4,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 40 [] addressVal(v1019_range_3_to_0_address_term_bound_4,A)| -v1019(constB4,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 41 [] -addressVal(v1019_range_3_to_0_address_term_bound_3,A)|v1019(constB3,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 42 [] addressVal(v1019_range_3_to_0_address_term_bound_3,A)| -v1019(constB3,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 43 [] -addressVal(v1019_range_3_to_0_address_term_bound_2,A)|v1019(constB2,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 44 [] addressVal(v1019_range_3_to_0_address_term_bound_2,A)| -v1019(constB2,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 45 [] -addressVal(v1019_range_3_to_0_address_term_bound_1,A)|v1019(constB1,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 46 [] addressVal(v1019_range_3_to_0_address_term_bound_1,A)| -v1019(constB1,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 47 [] -addressVal(v1019_range_3_to_0_address_term_bound_0,A)|v1019(constB0,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 48 [] addressVal(v1019_range_3_to_0_address_term_bound_0,A)| -v1019(constB0,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 49 [] -addressVal(v953_range_3_to_0_address_term_bound_20,A)|v953(constB20,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 50 [] addressVal(v953_range_3_to_0_address_term_bound_20,A)| -v953(constB20,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 51 [] -addressVal(v953_range_3_to_0_address_term_bound_19,A)|v953(constB19,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 52 [] addressVal(v953_range_3_to_0_address_term_bound_19,A)| -v953(constB19,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 53 [] -addressVal(v953_range_3_to_0_address_term_bound_18,A)|v953(constB18,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 54 [] addressVal(v953_range_3_to_0_address_term_bound_18,A)| -v953(constB18,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 55 [] -addressVal(v953_range_3_to_0_address_term_bound_17,A)|v953(constB17,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 56 [] addressVal(v953_range_3_to_0_address_term_bound_17,A)| -v953(constB17,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 57 [] -addressVal(v953_range_3_to_0_address_term_bound_16,A)|v953(constB16,A).
% 94.99/94.35  ** KEPT (pick-wt=6): 58 [] addressVal(v953_range_3_to_0_address_term_bound_16,A)| -v953(constB16,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 59 [] -addressVal(v953_range_3_to_0_address_term_bound_15,A)|v953(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 60 [] addressVal(v953_range_3_to_0_address_term_bound_15,A)| -v953(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 61 [] -addressVal(v953_range_3_to_0_address_term_bound_14,A)|v953(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 62 [] addressVal(v953_range_3_to_0_address_term_bound_14,A)| -v953(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 63 [] -addressVal(v953_range_3_to_0_address_term_bound_13,A)|v953(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 64 [] addressVal(v953_range_3_to_0_address_term_bound_13,A)| -v953(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 65 [] -addressVal(v953_range_3_to_0_address_term_bound_12,A)|v953(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 66 [] addressVal(v953_range_3_to_0_address_term_bound_12,A)| -v953(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 67 [] -addressVal(v953_range_3_to_0_address_term_bound_11,A)|v953(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 68 [] addressVal(v953_range_3_to_0_address_term_bound_11,A)| -v953(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 69 [] -addressVal(v953_range_3_to_0_address_term_bound_10,A)|v953(constB10,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 70 [] addressVal(v953_range_3_to_0_address_term_bound_10,A)| -v953(constB10,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 71 [] -addressVal(v953_range_3_to_0_address_term_bound_9,A)|v953(constB9,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 72 [] addressVal(v953_range_3_to_0_address_term_bound_9,A)| -v953(constB9,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 73 [] -addressVal(v953_range_3_to_0_address_term_bound_8,A)|v953(constB8,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 74 [] addressVal(v953_range_3_to_0_address_term_bound_8,A)| -v953(constB8,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 75 [] -addressVal(v953_range_3_to_0_address_term_bound_7,A)|v953(constB7,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 76 [] addressVal(v953_range_3_to_0_address_term_bound_7,A)| -v953(constB7,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 77 [] -addressVal(v953_range_3_to_0_address_term_bound_6,A)|v953(constB6,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 78 [] addressVal(v953_range_3_to_0_address_term_bound_6,A)| -v953(constB6,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 79 [] -addressVal(v953_range_3_to_0_address_term_bound_5,A)|v953(constB5,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 80 [] addressVal(v953_range_3_to_0_address_term_bound_5,A)| -v953(constB5,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 81 [] -addressVal(v953_range_3_to_0_address_term_bound_4,A)|v953(constB4,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 82 [] addressVal(v953_range_3_to_0_address_term_bound_4,A)| -v953(constB4,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 83 [] -addressVal(v953_range_3_to_0_address_term_bound_3,A)|v953(constB3,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 84 [] addressVal(v953_range_3_to_0_address_term_bound_3,A)| -v953(constB3,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 85 [] -addressVal(v953_range_3_to_0_address_term_bound_2,A)|v953(constB2,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 86 [] addressVal(v953_range_3_to_0_address_term_bound_2,A)| -v953(constB2,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 87 [] -addressVal(v953_range_3_to_0_address_term_bound_1,A)|v953(constB1,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 88 [] addressVal(v953_range_3_to_0_address_term_bound_1,A)| -v953(constB1,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 89 [] -addressVal(v953_range_3_to_0_address_term_bound_0,A)|v953(constB0,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 90 [] addressVal(v953_range_3_to_0_address_term_bound_0,A)| -v953(constB0,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 91 [] -addressVal(v869_range_3_to_0_address_term_bound_20,A)|v869(constB20,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 92 [] addressVal(v869_range_3_to_0_address_term_bound_20,A)| -v869(constB20,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 93 [] -addressVal(v869_range_3_to_0_address_term_bound_19,A)|v869(constB19,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 94 [] addressVal(v869_range_3_to_0_address_term_bound_19,A)| -v869(constB19,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 95 [] -addressVal(v869_range_3_to_0_address_term_bound_18,A)|v869(constB18,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 96 [] addressVal(v869_range_3_to_0_address_term_bound_18,A)| -v869(constB18,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 97 [] -addressVal(v869_range_3_to_0_address_term_bound_17,A)|v869(constB17,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 98 [] addressVal(v869_range_3_to_0_address_term_bound_17,A)| -v869(constB17,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 99 [] -addressVal(v869_range_3_to_0_address_term_bound_16,A)|v869(constB16,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 100 [] addressVal(v869_range_3_to_0_address_term_bound_16,A)| -v869(constB16,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 101 [] -addressVal(v869_range_3_to_0_address_term_bound_15,A)|v869(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 102 [] addressVal(v869_range_3_to_0_address_term_bound_15,A)| -v869(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 103 [] -addressVal(v869_range_3_to_0_address_term_bound_14,A)|v869(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 104 [] addressVal(v869_range_3_to_0_address_term_bound_14,A)| -v869(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 105 [] -addressVal(v869_range_3_to_0_address_term_bound_13,A)|v869(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 106 [] addressVal(v869_range_3_to_0_address_term_bound_13,A)| -v869(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 107 [] -addressVal(v869_range_3_to_0_address_term_bound_12,A)|v869(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 108 [] addressVal(v869_range_3_to_0_address_term_bound_12,A)| -v869(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 109 [] -addressVal(v869_range_3_to_0_address_term_bound_11,A)|v869(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 110 [] addressVal(v869_range_3_to_0_address_term_bound_11,A)| -v869(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 111 [] -addressVal(v869_range_3_to_0_address_term_bound_10,A)|v869(constB10,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 112 [] addressVal(v869_range_3_to_0_address_term_bound_10,A)| -v869(constB10,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 113 [] -addressVal(v869_range_3_to_0_address_term_bound_9,A)|v869(constB9,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 114 [] addressVal(v869_range_3_to_0_address_term_bound_9,A)| -v869(constB9,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 115 [] -addressVal(v869_range_3_to_0_address_term_bound_8,A)|v869(constB8,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 116 [] addressVal(v869_range_3_to_0_address_term_bound_8,A)| -v869(constB8,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 117 [] -addressVal(v869_range_3_to_0_address_term_bound_7,A)|v869(constB7,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 118 [] addressVal(v869_range_3_to_0_address_term_bound_7,A)| -v869(constB7,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 119 [] -addressVal(v869_range_3_to_0_address_term_bound_6,A)|v869(constB6,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 120 [] addressVal(v869_range_3_to_0_address_term_bound_6,A)| -v869(constB6,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 121 [] -addressVal(v869_range_3_to_0_address_term_bound_5,A)|v869(constB5,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 122 [] addressVal(v869_range_3_to_0_address_term_bound_5,A)| -v869(constB5,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 123 [] -addressVal(v869_range_3_to_0_address_term_bound_4,A)|v869(constB4,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 124 [] addressVal(v869_range_3_to_0_address_term_bound_4,A)| -v869(constB4,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 125 [] -addressVal(v869_range_3_to_0_address_term_bound_3,A)|v869(constB3,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 126 [] addressVal(v869_range_3_to_0_address_term_bound_3,A)| -v869(constB3,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 127 [] -addressVal(v869_range_3_to_0_address_term_bound_2,A)|v869(constB2,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 128 [] addressVal(v869_range_3_to_0_address_term_bound_2,A)| -v869(constB2,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 129 [] -addressVal(v869_range_3_to_0_address_term_bound_1,A)|v869(constB1,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 130 [] addressVal(v869_range_3_to_0_address_term_bound_1,A)| -v869(constB1,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 131 [] -addressVal(v869_range_3_to_0_address_term_bound_0,A)|v869(constB0,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 132 [] addressVal(v869_range_3_to_0_address_term_bound_0,A)| -v869(constB0,A).
% 94.99/94.36  ** KEPT (pick-wt=5): 133 [] -addressVal(b1110_address_term,A)|b1110(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 134 [] addressVal(b1110_address_term,A)| -b1110(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 135 [] -addressVal(b1101_address_term,A)|b1101(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 136 [] addressVal(b1101_address_term,A)| -b1101(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 137 [] -addressVal(b1100_address_term,A)|b1100(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 138 [] addressVal(b1100_address_term,A)| -b1100(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 139 [] -addressVal(b1011_address_term,A)|b1011(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 140 [] addressVal(b1011_address_term,A)| -b1011(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 141 [] -addressVal(b1010_address_term,A)|b1010(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 142 [] addressVal(b1010_address_term,A)| -b1010(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 143 [] -addressVal(b1001_address_term,A)|b1001(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 144 [] addressVal(b1001_address_term,A)| -b1001(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 145 [] -addressVal(b1000_address_term,A)|b1000(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 146 [] addressVal(b1000_address_term,A)| -b1000(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 147 [] -addressVal(b0111_address_term,A)|b0111(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 148 [] addressVal(b0111_address_term,A)| -b0111(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 149 [] -addressVal(b0100_address_term,A)|b0100(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 150 [] addressVal(b0100_address_term,A)| -b0100(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 151 [] -addressVal(b0011_address_term,A)|b0011(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 152 [] addressVal(b0011_address_term,A)| -b0011(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 153 [] -addressVal(b0010_address_term,A)|b0010(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 154 [] addressVal(b0010_address_term,A)| -b0010(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 155 [] -addressVal(b1111_address_term,A)|b1111(A).
% 94.99/94.36  ** KEPT (pick-wt=5): 156 [] addressVal(b1111_address_term,A)| -b1111(A).
% 94.99/94.36  ** KEPT (pick-wt=6): 157 [] -addressVal(v791_range_3_to_0_address_term_bound_20,A)|v791(constB20,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 158 [] addressVal(v791_range_3_to_0_address_term_bound_20,A)| -v791(constB20,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 159 [] -addressVal(v791_range_3_to_0_address_term_bound_19,A)|v791(constB19,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 160 [] addressVal(v791_range_3_to_0_address_term_bound_19,A)| -v791(constB19,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 161 [] -addressVal(v791_range_3_to_0_address_term_bound_18,A)|v791(constB18,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 162 [] addressVal(v791_range_3_to_0_address_term_bound_18,A)| -v791(constB18,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 163 [] -addressVal(v791_range_3_to_0_address_term_bound_17,A)|v791(constB17,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 164 [] addressVal(v791_range_3_to_0_address_term_bound_17,A)| -v791(constB17,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 165 [] -addressVal(v791_range_3_to_0_address_term_bound_16,A)|v791(constB16,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 166 [] addressVal(v791_range_3_to_0_address_term_bound_16,A)| -v791(constB16,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 167 [] -addressVal(v791_range_3_to_0_address_term_bound_15,A)|v791(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 168 [] addressVal(v791_range_3_to_0_address_term_bound_15,A)| -v791(constB15,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 169 [] -addressVal(v791_range_3_to_0_address_term_bound_14,A)|v791(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 170 [] addressVal(v791_range_3_to_0_address_term_bound_14,A)| -v791(constB14,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 171 [] -addressVal(v791_range_3_to_0_address_term_bound_13,A)|v791(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 172 [] addressVal(v791_range_3_to_0_address_term_bound_13,A)| -v791(constB13,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 173 [] -addressVal(v791_range_3_to_0_address_term_bound_12,A)|v791(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 174 [] addressVal(v791_range_3_to_0_address_term_bound_12,A)| -v791(constB12,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 175 [] -addressVal(v791_range_3_to_0_address_term_bound_11,A)|v791(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 176 [] addressVal(v791_range_3_to_0_address_term_bound_11,A)| -v791(constB11,A).
% 94.99/94.36  ** KEPT (pick-wt=6): 177 [] -addressVal(v791_range_3_to_0_address_term_bound_10,A)|v791(constB10,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 178 [] addressVal(v791_range_3_to_0_address_term_bound_10,A)| -v791(constB10,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 179 [] -addressVal(v791_range_3_to_0_address_term_bound_9,A)|v791(constB9,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 180 [] addressVal(v791_range_3_to_0_address_term_bound_9,A)| -v791(constB9,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 181 [] -addressVal(v791_range_3_to_0_address_term_bound_8,A)|v791(constB8,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 182 [] addressVal(v791_range_3_to_0_address_term_bound_8,A)| -v791(constB8,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 183 [] -addressVal(v791_range_3_to_0_address_term_bound_7,A)|v791(constB7,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 184 [] addressVal(v791_range_3_to_0_address_term_bound_7,A)| -v791(constB7,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 185 [] -addressVal(v791_range_3_to_0_address_term_bound_6,A)|v791(constB6,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 186 [] addressVal(v791_range_3_to_0_address_term_bound_6,A)| -v791(constB6,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 187 [] -addressVal(v791_range_3_to_0_address_term_bound_5,A)|v791(constB5,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 188 [] addressVal(v791_range_3_to_0_address_term_bound_5,A)| -v791(constB5,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 189 [] -addressVal(v791_range_3_to_0_address_term_bound_4,A)|v791(constB4,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 190 [] addressVal(v791_range_3_to_0_address_term_bound_4,A)| -v791(constB4,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 191 [] -addressVal(v791_range_3_to_0_address_term_bound_3,A)|v791(constB3,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 192 [] addressVal(v791_range_3_to_0_address_term_bound_3,A)| -v791(constB3,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 193 [] -addressVal(v791_range_3_to_0_address_term_bound_2,A)|v791(constB2,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 194 [] addressVal(v791_range_3_to_0_address_term_bound_2,A)| -v791(constB2,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 195 [] -addressVal(v791_range_3_to_0_address_term_bound_1,A)|v791(constB1,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 196 [] addressVal(v791_range_3_to_0_address_term_bound_1,A)| -v791(constB1,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 197 [] -addressVal(v791_range_3_to_0_address_term_bound_0,A)|v791(constB0,A).
% 95.04/94.37  ** KEPT (pick-wt=6): 198 [] addressVal(v791_range_3_to_0_address_term_bound_0,A)| -v791(constB0,A).
% 95.04/94.37  ** KEPT (pick-wt=5): 199 [] -addressVal(b0101_address_term,A)|b0101(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 200 [] addressVal(b0101_address_term,A)| -b0101(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 201 [] -addressVal(b0001_address_term,A)|b0001(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 202 [] addressVal(b0001_address_term,A)| -b0001(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 203 [] -addressVal(b0110_address_term,A)|b0110(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 204 [] addressVal(b0110_address_term,A)| -b0110(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 205 [] -addressVal(b0000_address_term,A)|b0000(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 206 [] addressVal(b0000_address_term,A)| -b0000(A).
% 95.04/94.37  ** KEPT (pick-wt=17): 207 [] -address(A)| -address(B)| -addressDiff(A,B,C)|A=B| -addressVal(A,C)| -addressVal(B,C).
% 95.04/94.37  ** KEPT (pick-wt=17): 208 [] -address(A)| -address(B)| -addressDiff(A,B,C)|A=B|addressVal(A,C)|addressVal(B,C).
% 95.04/94.37  ** KEPT (pick-wt=2): 209 [] -v4($c1).
% 95.04/94.37  ** KEPT (pick-wt=4): 210 [] -v4(A)| -v3674(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 211 [] -v3674(A)| -v3675(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 212 [] -v3675(A)|v3677(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 213 [] -v3675(A)|v3693(A).
% 95.04/94.37  ** KEPT (pick-wt=6): 214 [] v3675(A)| -v3677(A)| -v3693(A).
% 95.04/94.37  ** KEPT (pick-wt=8): 215 [] -v3693(A)|v3679(A,bitIndex0)|v3679(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=5): 216 [] v3693(A)| -v3679(A,bitIndex0).
% 95.04/94.37  ** KEPT (pick-wt=5): 217 [] v3693(A)| -v3679(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=4): 218 [] -v3677(A)| -v3678(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 219 [] -v3678(A)|v3679(A,bitIndex0).
% 95.04/94.37  ** KEPT (pick-wt=5): 220 [] -v3678(A)|v3679(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=8): 221 [] v3678(A)| -v3679(A,bitIndex0)| -v3679(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=5): 222 [] -v3679(A,bitIndex0)|v3680(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 223 [] v3679(A,bitIndex0)| -v3680(A).
% 95.04/94.37  ** KEPT (pick-wt=3): 225 [copy,224,propositional] v3679(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=4): 226 [] -v3680(A)|v3682(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 227 [] -v3680(A)|v3684(A,bitIndex5).
% 95.04/94.37  ** KEPT (pick-wt=7): 228 [] v3680(A)| -v3682(A)| -v3684(A,bitIndex5).
% 95.04/94.37  ** KEPT (pick-wt=4): 229 [] -v3682(A)|v3683(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 230 [] -v3682(A)|v3684(A,bitIndex4).
% 95.04/94.37  ** KEPT (pick-wt=7): 231 [] v3682(A)| -v3683(A)| -v3684(A,bitIndex4).
% 95.04/94.37  ** KEPT (pick-wt=7): 232 [] -v3683(A)|v3684(A,bitIndex3)|v3685(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 233 [] v3683(A)| -v3684(A,bitIndex3).
% 95.04/94.37  ** KEPT (pick-wt=4): 234 [] v3683(A)| -v3685(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 235 [] -v3685(A)|v3686(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 236 [] -v3685(A)|v3692(A).
% 95.04/94.37  ** KEPT (pick-wt=6): 237 [] v3685(A)| -v3686(A)| -v3692(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 238 [] -v3692(A)| -v3684(A,bitIndex3).
% 95.04/94.37  ** KEPT (pick-wt=7): 239 [] -v3686(A)|v3684(A,bitIndex2)|v3687(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 240 [] v3686(A)| -v3684(A,bitIndex2).
% 95.04/94.37  ** KEPT (pick-wt=4): 241 [] v3686(A)| -v3687(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 242 [] -v3687(A)|v3688(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 243 [] -v3687(A)|v3691(A).
% 95.04/94.37  ** KEPT (pick-wt=6): 244 [] v3687(A)| -v3688(A)| -v3691(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 245 [] -v3691(A)| -v3684(A,bitIndex2).
% 95.04/94.37  ** KEPT (pick-wt=7): 246 [] -v3688(A)|v3684(A,bitIndex1)|v3689(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 247 [] v3688(A)| -v3684(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=4): 248 [] v3688(A)| -v3689(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 249 [] -v3689(A)|v3684(A,bitIndex0).
% 95.04/94.37  ** KEPT (pick-wt=4): 250 [] -v3689(A)|v3690(A).
% 95.04/94.37  ** KEPT (pick-wt=7): 251 [] v3689(A)| -v3684(A,bitIndex0)| -v3690(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 252 [] -v3690(A)| -v3684(A,bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=3): 253 [] -v3684(A,bitIndex3).
% 95.04/94.37  ** KEPT (pick-wt=3): 254 [] -v3684(A,bitIndex4).
% 95.04/94.37  ** KEPT (pick-wt=3): 255 [] -v3684(A,bitIndex5).
% 95.04/94.37  ** KEPT (pick-wt=8): 256 [] -range_2_0(A)| -v3684(B,A)|v8(B,A).
% 95.04/94.37  ** KEPT (pick-wt=8): 257 [] -range_2_0(A)|v3684(B,A)| -v8(B,A).
% 95.04/94.37  ** KEPT (pick-wt=13): 258 [] -nextState(A,B)|v3660(B)| -range_2_0(C)| -v8(B,C)|v8(A,C).
% 95.04/94.37  ** KEPT (pick-wt=13): 259 [] -nextState(A,B)|v3660(B)| -range_2_0(C)|v8(B,C)| -v8(A,C).
% 95.04/94.37  ** KEPT (pick-wt=10): 260 [] -v3660(A)| -range_2_0(B)| -v8(A,B)|v3668(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 261 [] -v3660(A)| -range_2_0(B)|v8(A,B)| -v3668(A,B).
% 95.04/94.37  ** KEPT (pick-wt=11): 262 [] -nextState(A,B)| -range_2_0(C)| -v3668(B,C)|v3666(A,C).
% 95.04/94.37  ** KEPT (pick-wt=11): 263 [] -nextState(A,B)| -range_2_0(C)|v3668(B,C)| -v3666(A,C).
% 95.04/94.37  ** KEPT (pick-wt=10): 264 [] v3669(A)| -range_2_0(B)| -v3666(A,B)|v21(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 265 [] v3669(A)| -range_2_0(B)|v3666(A,B)| -v21(A,B).
% 95.04/94.37  ** KEPT (pick-wt=7): 267 [copy,266,propositional] -v3669(A)| -range_2_0(B)| -v3666(A,B).
% 95.04/94.37  ** KEPT (pick-wt=4): 268 [] -v3669(A)| -v10(A).
% 95.04/94.37  ** KEPT (pick-wt=7): 269 [] -nextState(A,B)| -v3660(B)|v3661(B).
% 95.04/94.37  ** KEPT (pick-wt=7): 270 [] -nextState(A,B)|v3660(B)| -v3661(B).
% 95.04/94.37  ** KEPT (pick-wt=7): 271 [] -nextState(A,B)| -v3661(B)|v3662(B).
% 95.04/94.37  ** KEPT (pick-wt=7): 272 [] -nextState(A,B)| -v3661(B)|v286(B).
% 95.04/94.37  ** KEPT (pick-wt=9): 273 [] -nextState(A,B)|v3661(B)| -v3662(B)| -v286(B).
% 95.04/94.37  ** KEPT (pick-wt=7): 274 [] -nextState(A,B)|v3662(B)|v295(B).
% 95.04/94.37  ** KEPT (pick-wt=7): 275 [] -nextState(A,B)| -v3662(B)| -v295(B).
% 95.04/94.37  ** KEPT (pick-wt=10): 276 [] v23(A)| -range_2_0(B)| -v21(A,B)|v8(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 277 [] v23(A)| -range_2_0(B)|v21(A,B)| -v8(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 278 [] -v23(A)| -range_2_0(B)| -v21(A,B)|v3643(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 279 [] -v23(A)| -range_2_0(B)|v21(A,B)| -v3643(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 280 [] v3644(A)| -range_2_0(B)| -v3643(A,B)|v3645(A,B).
% 95.04/94.37  ** KEPT (pick-wt=10): 281 [] v3644(A)| -range_2_0(B)|v3643(A,B)| -v3645(A,B).
% 95.04/94.37  ** KEPT (pick-wt=7): 283 [copy,282,propositional] -v3644(A)| -range_2_0(B)| -v3643(A,B).
% 95.04/94.37  ** KEPT (pick-wt=2): 284 [] -b000(bitIndex2).
% 95.04/94.37  ** KEPT (pick-wt=2): 285 [] -b000(bitIndex1).
% 95.04/94.37  ** KEPT (pick-wt=2): 286 [] -b000(bitIndex0).
% 95.04/94.37  ** KEPT (pick-wt=5): 287 [] -v3645(A,bitIndex0)|v3655(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 288 [] v3645(A,bitIndex0)| -v3655(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 289 [] -v3645(A,bitIndex1)|v3653(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 290 [] v3645(A,bitIndex1)| -v3653(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 291 [] -v3645(A,bitIndex2)|v3647(A).
% 95.04/94.37  ** KEPT (pick-wt=5): 292 [] v3645(A,bitIndex2)| -v3647(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 293 [] -v3653(A)|v3654(A).
% 95.04/94.37  ** KEPT (pick-wt=4): 294 [] -v3653(A)|v3657(A).
% 95.04/94.37  ** KEPT (pick-wt=6): 295 [] v3653(A)| -v3654(A)| -v3657(A).
% 95.04/94.37  ** KEPT (pick-wt=8): 296 [] -v3657(A)|v8(A,bitIndex0)|v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=5): 297 [] v3657(A)| -v8(A,bitIndex0).
% 95.04/94.38  ** KEPT (pick-wt=5): 298 [] v3657(A)| -v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=6): 299 [] -v3654(A)|v3655(A)|v3656(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 300 [] v3654(A)| -v3655(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 301 [] v3654(A)| -v3656(A).
% 95.04/94.38  ** KEPT (pick-wt=5): 302 [] -v3656(A)| -v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=5): 303 [] -v3655(A)| -v8(A,bitIndex0).
% 95.04/94.38  ** KEPT (pick-wt=4): 304 [] -v3647(A)|v3648(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 305 [] -v3647(A)|v3652(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 306 [] v3647(A)| -v3648(A)| -v3652(A).
% 95.04/94.38  ** KEPT (pick-wt=7): 307 [] -v3652(A)|v3650(A)|v8(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=4): 308 [] v3652(A)| -v3650(A).
% 95.04/94.38  ** KEPT (pick-wt=5): 309 [] v3652(A)| -v8(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=6): 310 [] -v3648(A)|v3649(A)|v3651(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 311 [] v3648(A)| -v3649(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 312 [] v3648(A)| -v3651(A).
% 95.04/94.38  ** KEPT (pick-wt=5): 313 [] -v3651(A)| -v8(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=4): 314 [] -v3649(A)| -v3650(A).
% 95.04/94.38  ** KEPT (pick-wt=5): 315 [] -v3650(A)|v8(A,bitIndex0).
% 95.04/94.38  ** KEPT (pick-wt=5): 316 [] -v3650(A)|v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=8): 317 [] v3650(A)| -v8(A,bitIndex0)| -v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=5): 319 [copy,318,propositional] -v3644(A)|v8(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=5): 321 [copy,320,propositional] -v3644(A)| -v8(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=5): 323 [copy,322,propositional] -v3644(A)|v8(A,bitIndex0).
% 95.04/94.38  ** KEPT (pick-wt=11): 325 [copy,324,propositional] v3644(A)| -v8(A,bitIndex2)|v8(A,bitIndex1)| -v8(A,bitIndex0).
% 95.04/94.38  ** KEPT (pick-wt=2): 326 [] -b101(bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=4): 327 [] -v23(A)|v25(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 328 [] v23(A)| -v25(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 329 [] -v25(A)|v27(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 330 [] v25(A)| -v27(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 331 [] -v27(A)|v29(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 332 [] v27(A)| -v29(A).
% 95.04/94.38  ** KEPT (pick-wt=5): 333 [] -v29(A)|v31(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=5): 334 [] v29(A)| -v31(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=6): 335 [] -v31(A,bitIndex7)|v3635(A,bitIndex6).
% 95.04/94.38  ** KEPT (pick-wt=6): 336 [] v31(A,bitIndex7)| -v3635(A,bitIndex6).
% 95.04/94.38  ** KEPT (pick-wt=11): 337 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex10)|v31(A,bitIndex11).
% 95.04/94.38  ** KEPT (pick-wt=11): 338 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex10)| -v31(A,bitIndex11).
% 95.04/94.38  ** KEPT (pick-wt=11): 339 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex9)|v31(A,bitIndex10).
% 95.04/94.38  ** KEPT (pick-wt=11): 340 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex9)| -v31(A,bitIndex10).
% 95.04/94.38  ** KEPT (pick-wt=11): 341 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex8)|v31(A,bitIndex9).
% 95.04/94.38  ** KEPT (pick-wt=11): 342 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex8)| -v31(A,bitIndex9).
% 95.04/94.38  ** KEPT (pick-wt=11): 343 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex7)|v31(A,bitIndex8).
% 95.04/94.38  ** KEPT (pick-wt=11): 344 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex7)| -v31(A,bitIndex8).
% 95.04/94.38  ** KEPT (pick-wt=11): 345 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex6)|v31(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=11): 346 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex6)| -v31(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=11): 347 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex5)|v31(A,bitIndex6).
% 95.04/94.38  ** KEPT (pick-wt=11): 348 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex5)| -v31(A,bitIndex6).
% 95.04/94.38  ** KEPT (pick-wt=11): 349 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex4)|v31(A,bitIndex5).
% 95.04/94.38  ** KEPT (pick-wt=11): 350 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex4)| -v31(A,bitIndex5).
% 95.04/94.38  ** KEPT (pick-wt=11): 351 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex3)|v31(A,bitIndex4).
% 95.04/94.38  ** KEPT (pick-wt=11): 352 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex3)| -v31(A,bitIndex4).
% 95.04/94.38  ** KEPT (pick-wt=11): 353 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex2)|v31(A,bitIndex3).
% 95.04/94.38  ** KEPT (pick-wt=11): 354 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex2)| -v31(A,bitIndex3).
% 95.04/94.38  ** KEPT (pick-wt=11): 355 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex1)|v31(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=11): 356 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex1)| -v31(A,bitIndex2).
% 95.04/94.38  ** KEPT (pick-wt=11): 357 [] -nextState(A,B)|v3636(B)| -v3635(B,bitIndex0)|v31(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=11): 358 [] -nextState(A,B)|v3636(B)|v3635(B,bitIndex0)| -v31(A,bitIndex1).
% 95.04/94.38  ** KEPT (pick-wt=10): 359 [] -v3636(A)| -range_10_0(B)| -v3635(A,B)|v1253(A,B).
% 95.04/94.38  ** KEPT (pick-wt=10): 360 [] -v3636(A)| -range_10_0(B)|v3635(A,B)| -v1253(A,B).
% 95.04/94.38  ** KEPT (pick-wt=7): 361 [] -nextState(A,B)| -v3636(B)|v3637(B).
% 95.04/94.38  ** KEPT (pick-wt=7): 362 [] -nextState(A,B)|v3636(B)| -v3637(B).
% 95.04/94.38  ** KEPT (pick-wt=7): 363 [] -nextState(A,B)| -v3637(B)|v3639(B).
% 95.04/94.38  ** KEPT (pick-wt=7): 364 [] -nextState(A,B)| -v3637(B)|v1240(B).
% 95.04/94.38  ** KEPT (pick-wt=9): 365 [] -nextState(A,B)|v3637(B)| -v3639(B)| -v1240(B).
% 95.04/94.38  ** KEPT (pick-wt=7): 366 [] -nextState(A,B)|v3639(B)|v1247(B).
% 95.04/94.38  ** KEPT (pick-wt=7): 367 [] -nextState(A,B)| -v3639(B)| -v1247(B).
% 95.04/94.38  ** KEPT (pick-wt=5): 369 [copy,368,propositional] v3611(A)| -v36(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=5): 371 [copy,370,propositional] -v3611(A)|v36(A,bitIndex7).
% 95.04/94.38  ** KEPT (pick-wt=6): 372 [] -v3611(A)|v3612(A)|v3632(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 373 [] v3611(A)| -v3612(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 374 [] v3611(A)| -v3632(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 375 [] -v3632(A)|v3633(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 376 [] -v3632(A)|v1323(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 377 [] v3632(A)| -v3633(A)| -v1323(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 378 [] -v3633(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 379 [] v3633(A)| -v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 380 [] -v3612(A)|v3613(A)|v3630(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 381 [] v3612(A)| -v3613(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 382 [] v3612(A)| -v3630(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 383 [] -v3630(A)|v3631(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 384 [] -v3630(A)|v1300(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 385 [] v3630(A)| -v3631(A)| -v1300(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 386 [] -v3631(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 387 [] -v3631(A)|v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 388 [] v3631(A)| -v3619(A)| -v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 389 [] -v3613(A)|v3614(A)|v3628(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 390 [] v3613(A)| -v3614(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 391 [] v3613(A)| -v3628(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 392 [] -v3628(A)|v3629(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 393 [] -v3628(A)|v1360(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 394 [] v3628(A)| -v3629(A)| -v1360(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 395 [] -v3629(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 396 [] v3629(A)| -v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 397 [] -v3614(A)|v3615(A)|v3626(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 398 [] v3614(A)| -v3615(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 399 [] v3614(A)| -v3626(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 400 [] -v3626(A)|v3627(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 401 [] -v3626(A)|v1278(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 402 [] v3626(A)| -v3627(A)| -v1278(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 403 [] -v3627(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 404 [] -v3627(A)|v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 405 [] v3627(A)| -v3619(A)| -v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 406 [] -v3615(A)|v3616(A)|v3624(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 407 [] v3615(A)| -v3616(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 408 [] v3615(A)| -v3624(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 409 [] -v3624(A)|v3625(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 410 [] -v3624(A)|v1355(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 411 [] v3624(A)| -v3625(A)| -v1355(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 412 [] -v3625(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 413 [] v3625(A)| -v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 414 [] -v3616(A)|v3617(A)|v3621(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 415 [] v3616(A)| -v3617(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 416 [] v3616(A)| -v3621(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 417 [] -v3621(A)|v3622(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 418 [] -v3621(A)|v1238(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 419 [] v3621(A)| -v3622(A)| -v1238(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 420 [] -v3622(A)|v3619(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 421 [] -v3622(A)|v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 422 [] v3622(A)| -v3619(A)| -v1180(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 423 [] -v3619(A)|v3620(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 424 [] -v3619(A)|v1347(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 425 [] v3619(A)| -v3620(A)| -v1347(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 426 [] -v3617(A)|v3618(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 427 [] -v3617(A)|v1348(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 428 [] v3617(A)| -v3618(A)| -v1348(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 429 [] -v3618(A)|v3620(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 430 [] -v3618(A)|v1347(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 431 [] v3618(A)| -v3620(A)| -v1347(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 432 [] -v3620(A)|v1673(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 433 [] -v3620(A)|v903(A).
% 95.04/94.38  ** KEPT (pick-wt=6): 434 [] v3620(A)| -v1673(A)| -v903(A).
% 95.04/94.38  ** KEPT (pick-wt=4): 435 [] -v38(A)|v40(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 436 [] v38(A)| -v40(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 437 [] -v40(A)|v42(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 438 [] v40(A)| -v42(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 439 [] -v42(A)|v44(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 440 [] v42(A)| -v44(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 441 [] -v44(A)|v46(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 442 [] v44(A)| -v46(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 443 [] -v46(A)|v48(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 444 [] v46(A)| -v48(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 445 [] -v48(A)|v50(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 446 [] v48(A)| -v50(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 447 [] -v50(A)|v52(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 448 [] v50(A)| -v52(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 449 [] -v52(A)|v54(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 450 [] v52(A)| -v54(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 451 [] -v54(A)|v56(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=5): 452 [] v54(A)| -v56(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 453 [] -v56(A,bitIndex2)|v3601(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 454 [] v56(A,bitIndex2)| -v3601(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=13): 455 [] -nextState(A,B)|v3602(B)| -range_3_0(C)| -v3601(B,C)|v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=13): 456 [] -nextState(A,B)|v3602(B)| -range_3_0(C)|v3601(B,C)| -v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=10): 457 [] -v3602(A)| -range_3_0(B)| -v3601(A,B)|v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 458 [] -v3602(A)| -range_3_0(B)|v3601(A,B)| -v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=7): 459 [] -nextState(A,B)| -v3602(B)|v3603(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 460 [] -nextState(A,B)|v3602(B)| -v3603(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 461 [] -nextState(A,B)| -v3603(B)|v3605(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 462 [] -nextState(A,B)| -v3603(B)|v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=9): 463 [] -nextState(A,B)|v3603(B)| -v3605(B)| -v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 464 [] -nextState(A,B)|v3605(B)|v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 465 [] -nextState(A,B)| -v3605(B)| -v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=6): 466 [] -v67(A,bitIndex2)|v3558(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 467 [] v67(A,bitIndex2)| -v3558(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 468 [] -v3555(A,bitIndex2)|v3556(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 469 [] v3555(A,bitIndex2)| -v3556(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 470 [] -v56(A,bitIndex1)|v3593(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 471 [] v56(A,bitIndex1)| -v3593(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=13): 472 [] -nextState(A,B)|v3594(B)| -range_3_0(C)| -v3593(B,C)|v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=13): 473 [] -nextState(A,B)|v3594(B)| -range_3_0(C)|v3593(B,C)| -v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=10): 474 [] -v3594(A)| -range_3_0(B)| -v3593(A,B)|v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 475 [] -v3594(A)| -range_3_0(B)|v3593(A,B)| -v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=7): 476 [] -nextState(A,B)| -v3594(B)|v3595(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 477 [] -nextState(A,B)|v3594(B)| -v3595(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 478 [] -nextState(A,B)| -v3595(B)|v3597(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 479 [] -nextState(A,B)| -v3595(B)|v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=9): 480 [] -nextState(A,B)|v3595(B)| -v3597(B)| -v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 481 [] -nextState(A,B)|v3597(B)|v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 482 [] -nextState(A,B)| -v3597(B)| -v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=6): 483 [] -v67(A,bitIndex1)|v3558(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 484 [] v67(A,bitIndex1)| -v3558(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 485 [] -v3555(A,bitIndex1)|v3556(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 486 [] v3555(A,bitIndex1)| -v3556(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 487 [] -v56(A,bitIndex3)|v3577(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=6): 488 [] v56(A,bitIndex3)| -v3577(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=13): 489 [] -nextState(A,B)|v3578(B)| -range_3_0(C)| -v3577(B,C)|v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=13): 490 [] -nextState(A,B)|v3578(B)| -range_3_0(C)|v3577(B,C)| -v56(A,C).
% 95.04/94.39  ** KEPT (pick-wt=10): 491 [] -v3578(A)| -range_3_0(B)| -v3577(A,B)|v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 492 [] -v3578(A)| -range_3_0(B)|v3577(A,B)| -v3588(A,B).
% 95.04/94.39  ** KEPT (pick-wt=11): 493 [] -nextState(A,B)| -range_3_0(C)| -v3588(B,C)|v3586(A,C).
% 95.04/94.39  ** KEPT (pick-wt=11): 494 [] -nextState(A,B)| -range_3_0(C)|v3588(B,C)| -v3586(A,C).
% 95.04/94.39  ** KEPT (pick-wt=10): 495 [] v3589(A)| -range_3_0(B)| -v3586(A,B)|v67(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 496 [] v3589(A)| -range_3_0(B)|v3586(A,B)| -v67(A,B).
% 95.04/94.39  ** KEPT (pick-wt=7): 498 [copy,497,propositional] -v3589(A)| -range_3_0(B)| -v3586(A,B).
% 95.04/94.39  ** KEPT (pick-wt=4): 499 [] -v3589(A)| -v58(A).
% 95.04/94.39  ** KEPT (pick-wt=7): 500 [] -nextState(A,B)| -v3578(B)|v3579(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 501 [] -nextState(A,B)|v3578(B)| -v3579(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 502 [] -nextState(A,B)| -v3579(B)|v3580(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 503 [] -nextState(A,B)| -v3579(B)|v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=9): 504 [] -nextState(A,B)|v3579(B)| -v3580(B)| -v3573(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 505 [] -nextState(A,B)|v3580(B)|v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 506 [] -nextState(A,B)| -v3580(B)| -v3582(B).
% 95.04/94.39  ** KEPT (pick-wt=7): 507 [] -nextState(A,B)| -v3582(B)|v3573(A).
% 95.04/94.39  ** KEPT (pick-wt=7): 508 [] -nextState(A,B)|v3582(B)| -v3573(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 509 [] -v3573(A)|v3575(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 510 [] v3573(A)| -v3575(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 511 [] -v3575(A)|v3531(A).
% 95.04/94.39  ** KEPT (pick-wt=4): 512 [] v3575(A)| -v3531(A).
% 95.04/94.39  ** KEPT (pick-wt=6): 513 [] -v67(A,bitIndex3)|v3558(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=6): 514 [] v67(A,bitIndex3)| -v3558(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=10): 515 [] v3559(A)| -range_3_0(B)| -v3558(A,B)|v3560(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 516 [] v3559(A)| -range_3_0(B)|v3558(A,B)| -v3560(A,B).
% 95.04/94.39  ** KEPT (pick-wt=7): 518 [copy,517,propositional] -v3559(A)| -range_3_0(B)| -v3558(A,B).
% 95.04/94.39  ** KEPT (pick-wt=14): 519 [] v3561(A)|v3563(A)|v3567(A)| -range_3_0(B)| -v3560(A,B)|v56(A,B).
% 95.04/94.39  ** KEPT (pick-wt=14): 520 [] v3561(A)|v3563(A)|v3567(A)| -range_3_0(B)|v3560(A,B)| -v56(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 521 [] -v3567(A)| -range_3_0(B)| -v3560(A,B)|v3569(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 522 [] -v3567(A)| -range_3_0(B)|v3560(A,B)| -v3569(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 523 [] -v3563(A)| -range_3_0(B)| -v3560(A,B)|v3565(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 524 [] -v3563(A)| -range_3_0(B)|v3560(A,B)| -v3565(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 525 [] -v3561(A)| -range_3_0(B)| -v3560(A,B)|v56(A,B).
% 95.04/94.39  ** KEPT (pick-wt=10): 526 [] -v3561(A)| -range_3_0(B)|v3560(A,B)| -v56(A,B).
% 95.04/94.39  ** KEPT (pick-wt=5): 528 [copy,527,propositional] -v3570(A)|v3571(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=5): 530 [copy,529,propositional] -v3570(A)|v3571(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=8): 532 [copy,531,propositional] v3570(A)| -v3571(A,bitIndex1)| -v3571(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=5): 533 [] -v3571(A,bitIndex0)|v3447(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 534 [] v3571(A,bitIndex0)| -v3447(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 535 [] -v3571(A,bitIndex1)|v69(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 536 [] v3571(A,bitIndex1)| -v69(A).
% 95.04/94.39  ** KEPT (pick-wt=3): 538 [copy,537,propositional] v3569(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=8): 539 [] -range_3_1(A)| -v3569(B,A)|v3555(B,A).
% 95.04/94.39  ** KEPT (pick-wt=8): 540 [] -range_3_1(A)|v3569(B,A)| -v3555(B,A).
% 95.04/94.39  ** KEPT (pick-wt=11): 541 [] -range_3_1(A)|bitIndex1=A|bitIndex2=A|bitIndex3=A.
% 95.04/94.39  ** KEPT (pick-wt=5): 542 [] range_3_1(A)|bitIndex1!=A.
% 95.04/94.39  ** KEPT (pick-wt=5): 543 [] range_3_1(A)|bitIndex2!=A.
% 95.04/94.39  ** KEPT (pick-wt=5): 544 [] range_3_1(A)|bitIndex3!=A.
% 95.04/94.39  ** KEPT (pick-wt=5): 546 [copy,545,propositional] -v3567(A)|v3568(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=5): 548 [copy,547,propositional] -v3567(A)| -v3568(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=8): 550 [copy,549,propositional] v3567(A)| -v3568(A,bitIndex1)|v3568(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=5): 551 [] -v3568(A,bitIndex0)|v3447(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 552 [] v3568(A,bitIndex0)| -v3447(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 553 [] -v3568(A,bitIndex1)|v69(A).
% 95.04/94.39  ** KEPT (pick-wt=5): 554 [] v3568(A,bitIndex1)| -v69(A).
% 95.04/94.39  ** KEPT (pick-wt=6): 555 [] -v3565(A,bitIndex2)|v56(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=6): 556 [] v3565(A,bitIndex2)| -v56(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=6): 557 [] -v3565(A,bitIndex1)|v56(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 558 [] v3565(A,bitIndex1)| -v56(A,bitIndex2).
% 95.04/94.39  ** KEPT (pick-wt=6): 559 [] -v3565(A,bitIndex0)|v56(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=6): 560 [] v3565(A,bitIndex0)| -v56(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=3): 562 [copy,561,propositional] -v3565(A,bitIndex3).
% 95.04/94.39  ** KEPT (pick-wt=5): 564 [copy,563,propositional] -v3563(A)| -v3564(A,bitIndex1).
% 95.04/94.39  ** KEPT (pick-wt=5): 566 [copy,565,propositional] -v3563(A)|v3564(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=8): 568 [copy,567,propositional] v3563(A)|v3564(A,bitIndex1)| -v3564(A,bitIndex0).
% 95.04/94.39  ** KEPT (pick-wt=5): 569 [] -v3564(A,bitIndex0)|v3447(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 570 [] v3564(A,bitIndex0)| -v3447(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 571 [] -v3564(A,bitIndex1)|v69(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 572 [] v3564(A,bitIndex1)| -v69(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 574 [copy,573,propositional] -v3561(A)| -v3562(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=5): 576 [copy,575,propositional] -v3561(A)| -v3562(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=5): 577 [] -v3562(A,bitIndex0)|v3447(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 578 [] v3562(A,bitIndex0)| -v3447(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 579 [] -v3562(A,bitIndex1)|v69(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 580 [] v3562(A,bitIndex1)| -v69(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 581 [] -v3559(A)| -v58(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 582 [] -v3555(A,bitIndex3)|v3556(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=6): 583 [] v3555(A,bitIndex3)| -v3556(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=3): 585 [copy,584,propositional] -v3556(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=6): 586 [] -v3556(A,bitIndex3)|v56(A,bitIndex2).
% 95.04/94.40  ** KEPT (pick-wt=6): 587 [] v3556(A,bitIndex3)| -v56(A,bitIndex2).
% 95.04/94.40  ** KEPT (pick-wt=6): 588 [] -v3556(A,bitIndex2)|v56(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=6): 589 [] v3556(A,bitIndex2)| -v56(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=6): 590 [] -v3556(A,bitIndex1)|v56(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=6): 591 [] v3556(A,bitIndex1)| -v56(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=5): 593 [copy,592,propositional] -range_3_0(A)| -v56(constB0,A).
% 95.04/94.40  ** KEPT (pick-wt=4): 594 [] -v3447(A)|v3449(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 595 [] v3447(A)| -v3449(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 596 [] -v3449(A)|v3451(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 597 [] v3449(A)| -v3451(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 599 [copy,598,propositional] v3551(A)|v3552(A)| -v3451(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 601 [copy,600,propositional] -v3552(A)|v3451(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 603 [copy,602,propositional] -v3551(A)| -v3451(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 605 [copy,604,propositional] -v3552(A)| -v3453(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=5): 607 [copy,606,propositional] -v3552(A)|v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=8): 609 [copy,608,propositional] v3552(A)|v3453(A,bitIndex1)| -v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=5): 611 [copy,610,propositional] -v3551(A)| -v3453(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=5): 613 [copy,612,propositional] -v3551(A)| -v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=13): 614 [] -nextState(A,B)|v3536(B)| -range_1_0(C)| -v3453(B,C)|v3453(A,C).
% 95.04/94.40  ** KEPT (pick-wt=13): 615 [] -nextState(A,B)|v3536(B)| -range_1_0(C)|v3453(B,C)| -v3453(A,C).
% 95.04/94.40  ** KEPT (pick-wt=10): 616 [] -v3536(A)| -range_1_0(B)| -v3453(A,B)|v3546(A,B).
% 95.04/94.40  ** KEPT (pick-wt=10): 617 [] -v3536(A)| -range_1_0(B)|v3453(A,B)| -v3546(A,B).
% 95.04/94.40  ** KEPT (pick-wt=11): 618 [] -nextState(A,B)| -range_1_0(C)| -v3546(B,C)|v3544(A,C).
% 95.04/94.40  ** KEPT (pick-wt=11): 619 [] -nextState(A,B)| -range_1_0(C)|v3546(B,C)| -v3544(A,C).
% 95.04/94.40  ** KEPT (pick-wt=10): 620 [] v3547(A)| -range_1_0(B)| -v3544(A,B)|v3455(A,B).
% 95.04/94.40  ** KEPT (pick-wt=10): 621 [] v3547(A)| -range_1_0(B)|v3544(A,B)| -v3455(A,B).
% 95.04/94.40  ** KEPT (pick-wt=7): 623 [copy,622,propositional] -v3547(A)| -range_1_0(B)| -v3544(A,B).
% 95.04/94.40  ** KEPT (pick-wt=4): 625 [copy,624,propositional] -v3547(A)| -v62(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 626 [] -nextState(A,B)| -v3536(B)|v3537(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 627 [] -nextState(A,B)|v3536(B)| -v3537(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 628 [] -nextState(A,B)| -v3537(B)|v3538(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 629 [] -nextState(A,B)| -v3537(B)|v3531(B).
% 95.04/94.40  ** KEPT (pick-wt=9): 630 [] -nextState(A,B)|v3537(B)| -v3538(B)| -v3531(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 631 [] -nextState(A,B)|v3538(B)|v3540(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 632 [] -nextState(A,B)| -v3538(B)| -v3540(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 633 [] -nextState(A,B)| -v3540(B)|v3531(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 634 [] -nextState(A,B)|v3540(B)| -v3531(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 635 [] -v3531(A)|v3533(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 636 [] v3531(A)| -v3533(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 637 [] -v3533(A)|v1(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 638 [] v3533(A)| -v1(A).
% 95.04/94.40  ** KEPT (pick-wt=9): 640 [copy,639,propositional] v3520(A)|v3529(A)| -range_1_0(B)| -v3455(A,B).
% 95.04/94.40  ** KEPT (pick-wt=7): 642 [copy,641,propositional] -v3529(A)| -range_1_0(B)| -v3455(A,B).
% 95.04/94.40  ** KEPT (pick-wt=10): 643 [] -v3520(A)| -range_1_0(B)| -v3455(A,B)|v3521(A,B).
% 95.04/94.40  ** KEPT (pick-wt=10): 644 [] -v3520(A)| -range_1_0(B)|v3455(A,B)| -v3521(A,B).
% 95.04/94.40  ** KEPT (pick-wt=5): 646 [copy,645,propositional] -v3529(A)| -v3453(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=5): 648 [copy,647,propositional] -v3529(A)|v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=8): 650 [copy,649,propositional] v3529(A)|v3453(A,bitIndex1)| -v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=10): 651 [] v3522(A)| -range_1_0(B)| -v3521(A,B)|v3524(A,B).
% 95.04/94.40  ** KEPT (pick-wt=10): 652 [] v3522(A)| -range_1_0(B)|v3521(A,B)| -v3524(A,B).
% 95.04/94.40  ** KEPT (pick-wt=7): 654 [copy,653,propositional] -v3522(A)| -range_1_0(B)| -v3521(A,B).
% 95.04/94.40  ** KEPT (pick-wt=9): 655 [] v3525(A)| -range_1_0(B)| -v3524(A,B)|b01(B).
% 95.04/94.40  ** KEPT (pick-wt=9): 656 [] v3525(A)| -range_1_0(B)|v3524(A,B)| -b01(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 658 [copy,657,propositional] -v3525(A)| -range_1_0(B)| -v3524(A,B).
% 95.04/94.40  ** KEPT (pick-wt=4): 660 [copy,659,propositional] -v3527(A)| -v3528(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 661 [] -v3528(A)|v3500(A)|v3502(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 662 [] v3528(A)| -v3500(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 663 [] v3528(A)| -v3502(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 665 [copy,664,propositional] -v3525(A)|v3526(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 667 [copy,666,propositional] v3525(A)| -v3526(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 668 [] -v3526(A)|v3500(A)|v3502(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 669 [] v3526(A)| -v3500(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 670 [] v3526(A)| -v3502(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 672 [copy,671,propositional] -v3500(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 674 [copy,673,propositional] -v3523(A)| -v3457(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 676 [copy,675,propositional] -v3522(A)|v3457(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 678 [copy,677,propositional] v3522(A)| -v3457(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 680 [copy,679,propositional] -v3520(A)| -v3453(A,bitIndex1).
% 95.04/94.40  ** KEPT (pick-wt=5): 682 [copy,681,propositional] -v3520(A)| -v3453(A,bitIndex0).
% 95.04/94.40  ** KEPT (pick-wt=5): 684 [copy,683,propositional] -range_1_0(A)| -v3453(constB0,A).
% 95.04/94.40  ** KEPT (pick-wt=4): 685 [] -v3502(A)|v3504(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 686 [] v3502(A)| -v3504(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 687 [] -v3504(A)|v3506(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 688 [] v3504(A)| -v3506(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 689 [] -v3506(A)|v3508(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 690 [] v3506(A)| -v3508(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 691 [] -v3508(A)|v3510(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 692 [] v3508(A)| -v3510(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 693 [] -v3510(A)|v3512(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 694 [] v3510(A)| -v3512(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 695 [] -v3512(A)|v3514(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 696 [] v3512(A)| -v3514(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 697 [] -v3514(A)|v3516(A,bitIndex6).
% 95.04/94.40  ** KEPT (pick-wt=5): 698 [] v3514(A)| -v3516(A,bitIndex6).
% 95.04/94.40  ** KEPT (pick-wt=3): 699 [] -v3516(constB0,bitIndex6).
% 95.04/94.40  ** KEPT (pick-wt=2): 700 [] -bx0xxxxxx(bitIndex6).
% 95.04/94.40  ** KEPT (pick-wt=4): 701 [] -v3457(A)|v3459(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 702 [] v3457(A)| -v3459(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 703 [] -v3459(A)|v3493(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 704 [] -v3459(A)|v3489(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 705 [] v3459(A)| -v3493(A)| -v3489(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 706 [] -v3493(A)|v3494(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 707 [] -v3493(A)|v3485(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 708 [] v3493(A)| -v3494(A)| -v3485(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 709 [] -v3494(A)|v3495(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 710 [] -v3494(A)|v3481(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 711 [] v3494(A)| -v3495(A)| -v3481(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 712 [] -v3495(A)|v3496(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 713 [] -v3495(A)|v3477(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 714 [] v3495(A)| -v3496(A)| -v3477(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 715 [] -v3496(A)|v3497(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 716 [] -v3496(A)|v3473(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 717 [] v3496(A)| -v3497(A)| -v3473(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 718 [] -v3497(A)|v3498(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 719 [] -v3497(A)|v3469(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 720 [] v3497(A)| -v3498(A)| -v3469(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 721 [] -v3498(A)|v3461(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 722 [] -v3498(A)|v3465(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 723 [] v3498(A)| -v3461(A)| -v3465(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 724 [] -v3489(A)|v3491(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 725 [] v3489(A)| -v3491(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 727 [copy,726,propositional] v3491(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 728 [] -v3485(A)|v3487(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 729 [] v3485(A)| -v3487(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 731 [copy,730,propositional] v3487(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 732 [] -v3481(A)|v3483(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 733 [] v3481(A)| -v3483(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 735 [copy,734,propositional] v3483(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 736 [] -v3477(A)|v3479(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 737 [] v3477(A)| -v3479(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 739 [copy,738,propositional] v3479(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 740 [] -v3473(A)|v3475(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 741 [] v3473(A)| -v3475(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 743 [copy,742,propositional] v3475(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 744 [] -v3469(A)|v3471(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 745 [] v3469(A)| -v3471(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 747 [copy,746,propositional] v3471(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 748 [] -v3465(A)|v3467(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 749 [] v3465(A)| -v3467(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 751 [copy,750,propositional] v3467(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 752 [] -v3461(A)|v3463(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 753 [] v3461(A)| -v3463(A).
% 95.04/94.40  ** KEPT (pick-wt=2): 755 [copy,754,propositional] v3463(constB0).
% 95.04/94.40  ** KEPT (pick-wt=4): 756 [] -v69(A)|v71(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 757 [] v69(A)| -v71(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 758 [] -v71(A)|v73(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 759 [] v71(A)| -v73(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 760 [] -v73(A)|v75(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 761 [] v73(A)| -v75(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 762 [] -v75(A)|v77(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 763 [] v75(A)| -v77(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 764 [] -v77(A)|v79(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 765 [] v77(A)| -v79(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 766 [] -v79(A)|v81(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 767 [] v79(A)| -v81(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 768 [] -v81(A)|v83(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 769 [] v81(A)| -v83(A).
% 95.04/94.40  ** KEPT (pick-wt=9): 770 [] -nextState(A,B)|v3426(B)| -v83(B)|v83(A).
% 95.04/94.40  ** KEPT (pick-wt=9): 771 [] -nextState(A,B)|v3426(B)|v83(B)| -v83(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 772 [] -v3426(A)| -v83(A)|v3434(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 773 [] -v3426(A)|v83(A)| -v3434(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 774 [] -nextState(A,B)| -v3434(B)|v3432(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 775 [] -nextState(A,B)|v3434(B)| -v3432(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 776 [] v3435(A)| -v3432(A)|v3436(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 777 [] v3435(A)|v3432(A)| -v3436(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 779 [copy,778,propositional] -v3435(A)| -v3432(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 781 [copy,780,propositional] v3437(A)| -v3436(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 783 [copy,782,propositional] -v3437(A)|v3436(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 784 [] -v3437(A)|v3438(A)|v3442(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 785 [] v3437(A)| -v3438(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 786 [] v3437(A)| -v3442(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 787 [] -v3442(A)|v31(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=4): 788 [] -v3442(A)|v3443(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 789 [] v3442(A)| -v31(A,bitIndex9)| -v3443(A).
% 95.04/94.40  ** KEPT (pick-wt=5): 790 [] -v3443(A)| -v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=6): 791 [] -v3438(A)|v3439(A)|v3420(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 792 [] v3438(A)| -v3439(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 793 [] v3438(A)| -v3420(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 794 [] -v3439(A)|v3440(A)|v3415(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 795 [] v3439(A)| -v3440(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 796 [] v3439(A)| -v3415(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 797 [] -v3440(A)|v3441(A)|v879(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 798 [] v3440(A)| -v3441(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 799 [] v3440(A)| -v879(A).
% 95.04/94.40  ** KEPT (pick-wt=6): 800 [] -v3441(A)|v85(A)|v3410(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 801 [] v3441(A)| -v85(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 802 [] v3441(A)| -v3410(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 803 [] -v3435(A)| -v33(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 804 [] -nextState(A,B)| -v3426(B)|v3427(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 805 [] -nextState(A,B)|v3426(B)| -v3427(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 806 [] -nextState(A,B)| -v3427(B)|v3428(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 807 [] -nextState(A,B)| -v3427(B)|v1240(B).
% 95.04/94.40  ** KEPT (pick-wt=9): 808 [] -nextState(A,B)|v3427(B)| -v3428(B)| -v1240(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 809 [] -nextState(A,B)|v3428(B)|v1247(B).
% 95.04/94.40  ** KEPT (pick-wt=7): 810 [] -nextState(A,B)| -v3428(B)| -v1247(B).
% 95.04/94.40  ** KEPT (pick-wt=5): 811 [] -v3420(A)|v31(A,bitIndex8).
% 95.04/94.40  ** KEPT (pick-wt=4): 812 [] -v3420(A)|v3422(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 813 [] v3420(A)| -v31(A,bitIndex8)| -v3422(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 814 [] -v3422(A)| -v3423(A).
% 95.04/94.40  ** KEPT (pick-wt=8): 815 [] -v3423(A)|v36(A,bitIndex8)|v36(A,bitIndex9).
% 95.04/94.40  
% 95.04/94.40  Search stopped in tp_alloc by max_mem option.
% 95.04/94.40  ** KEPT (pick-wt=5): 816 [] v3423(A)| -v36(A,bitIndex8).
% 95.04/94.40  ** KEPT (pick-wt=5): 817 [] v3423(A)| -v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=5): 818 [] -v3415(A)|v31(A,bitIndex5).
% 95.04/94.40  ** KEPT (pick-wt=4): 819 [] -v3415(A)|v3417(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 820 [] v3415(A)| -v31(A,bitIndex5)| -v3417(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 821 [] -v3417(A)| -v3418(A).
% 95.04/94.40  ** KEPT (pick-wt=8): 822 [] -v3418(A)|v36(A,bitIndex5)|v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=5): 823 [] v3418(A)| -v36(A,bitIndex5).
% 95.04/94.40  ** KEPT (pick-wt=5): 824 [] v3418(A)| -v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=5): 825 [] -v3410(A)|v31(A,bitIndex2).
% 95.04/94.40  ** KEPT (pick-wt=4): 826 [] -v3410(A)|v3412(A).
% 95.04/94.40  ** KEPT (pick-wt=7): 827 [] v3410(A)| -v31(A,bitIndex2)| -v3412(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 828 [] -v3412(A)| -v3413(A).
% 95.04/94.40  ** KEPT (pick-wt=8): 829 [] -v3413(A)|v36(A,bitIndex2)|v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=5): 830 [] v3413(A)| -v36(A,bitIndex2).
% 95.04/94.40  ** KEPT (pick-wt=5): 831 [] v3413(A)| -v36(A,bitIndex9).
% 95.04/94.40  ** KEPT (pick-wt=5): 832 [] -v85(A)|v36(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=5): 833 [] v85(A)| -v36(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=5): 835 [copy,834,propositional] v3398(A)| -v36(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=5): 837 [copy,836,propositional] -v3398(A)|v36(A,bitIndex3).
% 95.04/94.40  ** KEPT (pick-wt=6): 838 [] -v3398(A)|v3399(A)|v3407(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 839 [] v3398(A)| -v3399(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 840 [] v3398(A)| -v3407(A).
% 95.04/94.40  ** KEPT (pick-wt=4): 841 [] -v3407(A)|v3408(A).
% 95.04/94.40  
% 95.04/94.40  Search stopped in tp_alloc by max_mem option.
% 95.04/94.40  
% 95.04/94.40  ============ end of search ============
% 95.04/94.40  
% 95.04/94.40  -------------- statistics -------------
% 95.04/94.40  clauses given                  0
% 95.04/94.40  clauses generated              0
% 95.04/94.40  clauses kept                 775
% 95.04/94.40  clauses forward subsumed       0
% 95.04/94.40  clauses back subsumed          0
% 95.04/94.40  Kbytes malloced            11718
% 95.04/94.40  
% 95.04/94.40  ----------- times (seconds) -----------
% 95.04/94.40  user CPU time          2.44          (0 hr, 0 min, 2 sec)
% 95.04/94.40  system CPU time        0.00          (0 hr, 0 min, 0 sec)
% 95.04/94.40  wall-clock time       94             (0 hr, 1 min, 34 sec)
% 95.04/94.41  
% 95.04/94.41  Process 8614 finished Wed Jul 27 06:35:42 2022
% 95.04/94.41  Otter interrupted
% 95.04/94.41  PROOF NOT FOUND
%------------------------------------------------------------------------------