Build stability fixes - part 1 #236

Merged: 6 commits into main from issue/build-stability-fixes on Dec 18, 2023

Conversation

@msmygit msmygit commented Dec 15, 2023

No description provided.

@msmygit msmygit self-assigned this Dec 15, 2023
@@ -81,7 +81,7 @@ _testDockerNetwork() {
}

_testDockerCassandra() {
-dockerPs=$(docker ps -a | awk '{if ($NF == "'${DOCKER_CASS}'") {print "yes"}}')
+dockerPs=$(docker ps --all --filter "name=${DOCKER_CASS}" --format "{{.Status}}" | awk '{if ($1 == "Up") {print "yes"}}')
msmygit (Collaborator, Author) commented:

This change ensures that only valid and running Docker containers are considered.
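
For context, here is a minimal sketch of how the status-based check behaves; the container name (taken from the test logs), the error message, and the surrounding function body are illustrative assumptions, not the repository's exact script:

DOCKER_CASS="cdm-sit-cass"   # assumed container name, borrowed from the SIT logs

_testDockerCassandra() {
  # Docker reports a status beginning with "Up" only for running containers;
  # stopped or stale containers report "Exited"/"Created" and no longer pass.
  dockerPs=$(docker ps --all --filter "name=${DOCKER_CASS}" --format "{{.Status}}" | awk '{if ($1 == "Up") {print "yes"}}')
  if [ "${dockerPs}" != "yes" ]; then
    echo "Container ${DOCKER_CASS} is not running" >&2
    return 1
  fi
}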

@@ -154,7 +154,7 @@ _dropKeyspaces() {
}

_testDockerCDM() {
-dockerPs=$(docker ps -a | awk '{if ($NF == "'${DOCKER_CDM}'") {print "yes"}}')
+dockerPs=$(docker ps --all --filter "name=${DOCKER_CDM}" --format "{{.Status}}" | awk '{if ($1 == "Up") {print "yes"}}')
msmygit (Collaborator, Author) commented Dec 15, 2023:

This change ensures that only valid and running Docker containers are considered.

@@ -124,17 +128,17 @@ errors=0
for testDir in $(ls -d ${PHASE}/*); do
export testDir
_info ${testDir} Executing test
-docker exec ${DOCKER_CDM} bash -e $testDir/execute.sh /$testDir > $testDir/output/execute.out 2>$testDir/output/execute.err
+docker exec ${DOCKER_CDM} bash -e -c "$testDir/execute.sh /$testDir > $testDir/output/execute.out 2>$testDir/output/execute.err"
msmygit (Collaborator, Author) commented:

Thanks to Arvy for this tip!
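
As a hedged illustration of what the quoting changes (the container name and test directory below are placeholders): in the unquoted form the host shell parses the redirections, so execute.out and execute.err are opened on the machine running the harness; with bash -e -c "..." the entire command line, redirections included, is evaluated by the shell inside the container.

DOCKER_CDM="cdm-sit-cdm"            # placeholder container name
testDir="smoke/02_autocorrect_kvp"  # example test directory from this PR's logs

# Redirections handled by the host shell: the output files are created on the host.
docker exec ${DOCKER_CDM} bash -e $testDir/execute.sh /$testDir > $testDir/output/execute.out 2>$testDir/output/execute.err

# Redirections handled by bash inside the container: the files are created in the container.
docker exec ${DOCKER_CDM} bash -e -c "$testDir/execute.sh /$testDir > $testDir/output/execute.out 2>$testDir/output/execute.err"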

errors=1
continue
fi

_info "PASS: ${testDir} returned expected results"
done
if [ $errors -ne 0 ]; then
_captureOutput
msmygit (Collaborator, Author) commented:

Having it here captured only the last run's output. Now, output is captured whenever there is a failure.
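
A minimal sketch of one way to arrange the capture-on-failure behavior described above; it is not necessarily the exact placement used in the script. PHASE, _info, and _captureOutput are the harness's own variables and helpers, while _compareExpectedOutput is a hypothetical stand-in for the result check:

errors=0
for testDir in $(ls -d ${PHASE}/*); do
  export testDir
  _info ${testDir} Executing test
  # ... docker exec invocation as shown above ...
  if ! _compareExpectedOutput "${testDir}"; then
    _captureOutput          # capture this failing test's output right away
    errors=1
    continue
  fi
  _info "PASS: ${testDir} returned expected results"
done
if [ $errors -ne 0 ]; then
  exit 1                    # fail the overall run only after every test has been attempted
fi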

@@ -117,7 +117,6 @@ public boolean initializeAndValidate(CqlTable originTable, CqlTable targetTable)
logger.error("Counter table cannot specify TTL or WriteTimestamp columns as they cannot set on write");
isValid = false;
isEnabled = false;
-return false;
msmygit (Collaborator, Author) commented Dec 15, 2023:

This is an attempt, for now, to let counter tables skip the automatic TTL & writetime handling. At present this produces errors with smoke/02_autocorrect_kvp (unrelated); triage is in progress.

msmygit (Collaborator, Author) commented:

This is the output of the execute.out file:

% cat smoke/02_autocorrect_kvp/output/execute.out 
2023-12-15 21:51:22 INFO  [] Migrate$:139 - ################################################################################################
2023-12-15 21:51:22 INFO  [] Migrate$:140 - ###                                  Migrate Job - Starting                                  ###
2023-12-15 21:51:22 INFO  [] Migrate$:141 - ################################################################################################
2023-12-15 21:51:22 WARN  [] NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-12-15 21:51:23 INFO  [] PropertyHelper:225 - Processing explicitly set and known sparkConf properties
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.schema.origin.keyspaceTable] is configured with value [origin.smoke_autocorrect_kvp] and is type [STRING]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.schema.target.keyspaceTable] is configured with value [target.smoke_autocorrect_kvp] and is type [STRING]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.connect.target.host] is configured with value [cdm-sit-cass] and is type [STRING]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.autocorrect.missing] is configured with value [true] and is type [BOOLEAN]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.perfops.numParts] is configured with value [1] and is type [NUMBER]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.connect.origin.host] is configured with value [cdm-sit-cass] and is type [STRING]
2023-12-15 21:51:23 INFO  [] PropertyHelper:238 - Known property [spark.cdm.autocorrect.mismatch] is configured with value [true] and is type [BOOLEAN]
2023-12-15 21:51:23 INFO  [] PropertyHelper:243 - Adding any missing known properties that have default values
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.ratelimit.target] with default value [40000]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.password] with default value [cassandra]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.consistency.write] with default value [LOCAL_QUORUM]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.printStatsPerPart] with default value [false]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.port] with default value [9042]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.username] with default value [cassandra]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.enabledAlgorithms] with default value [TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.codecs.timestamp.string.format] with default value [yyyyMMddHHmmss]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.custom.writetime] with default value [0]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [test.numberList] with default value [1,2]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.username] with default value [cassandra]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.schema.origin.column.writetime.automatic] with default value [true]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.filter.java.token.percent] with default value [100]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.enabledAlgorithms] with default value [TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.custom.writetime.incrementBy] with default value [0]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.printStatsAfter] with default value [100000]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.ratelimit.origin] with default value [20000]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.batchSize] with default value [5]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [test.stringList] with default value [text1,text2]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.port] with default value [9042]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.consistency.read] with default value [LOCAL_QUORUM]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.feature.guardrail.colSizeInKB] with default value [0]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.password] with default value [cassandra]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.enabled] with default value [false]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [test.number] with default value [1]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [test.boolean] with default value [true]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.feature.constantColumns.splitRegex] with default value [,]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.codecs.timestamp.string.zone] with default value [UTC]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.trustStore.type] with default value [JKS]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.schema.origin.column.ttl.automatic] with default value [true]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.trustStore.type] with default value [JKS]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.fetchSizeInRows] with default value [1000]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.autocorrect.missing.counter] with default value [false]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [test.string] with default value [text]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.enabled] with default value [false]
2023-12-15 21:51:23 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.errorLimit] with default value [0]
2023-12-15 21:51:23 INFO  [] ConnectionFetcher:66 - PARAM --  SSL Enabled: false
2023-12-15 21:51:23 INFO  [] ConnectionFetcher:100 - Connecting to ORIGIN at cdm-sit-cass:9042
2023-12-15 21:51:24 INFO  [] ConnectionFetcher:66 - PARAM --  SSL Enabled: false
2023-12-15 21:51:24 INFO  [] ConnectionFetcher:100 - Connecting to TARGET at cdm-sit-cass:9042
2023-12-15 21:51:25 WARN  [] PlainTextAuthProviderBase:81 - [] cdm-sit-cass/172.16.242.2:9042 did not send an authentication challenge; This is suspicious because the driver expects authentication
2023-12-15 21:51:25 WARN  [] PlainTextAuthProviderBase:81 - [] cdm-sit-cass/172.16.242.2:9042 did not send an authentication challenge; This is suspicious because the driver expects authentication
2023-12-15 21:51:26 INFO  [] Migrate$:85 - PARAM -- Min Partition: -9223372036854775808
2023-12-15 21:51:26 INFO  [] Migrate$:86 - PARAM -- Max Partition: 9223372036854775807
2023-12-15 21:51:26 INFO  [] Migrate$:87 - PARAM -- Number of Splits : 1
2023-12-15 21:51:26 INFO  [] Migrate$:88 - PARAM -- Coverage Percent: 100
2023-12-15 21:51:26 INFO  [] SplitPartitions:39 - ThreadID: 1 Splitting min: -9223372036854775808 max: 9223372036854775807
2023-12-15 21:51:26 INFO  [] Migrate$:91 - PARAM Calculated -- Total Partitions: 1
2023-12-15 21:51:26 INFO  [] Migrate$:92 - Spark parallelize created : 1 slices!
2023-12-15 21:51:26 INFO  [] CopyJobSession:64 - PARAM -- Max Retries: 0
2023-12-15 21:51:26 INFO  [] CopyJobSession:65 - PARAM -- Partition file: ./origin.smoke_autocorrect_kvp_partitions.csv
2023-12-15 21:51:26 INFO  [] CopyJobSession:66 - PARAM -- Origin Rate Limit: 20000.0
2023-12-15 21:51:26 INFO  [] CopyJobSession:67 - PARAM -- Target Rate Limit: 40000.0
2023-12-15 21:51:26 DEBUG [] CqlConversion:60 - originColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value]-[0, 1]
2023-12-15 21:51:26 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:60 - originColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlTable:335 - Corresponding index for target: [key, value]-[0, 1]
2023-12-15 21:51:26 DEBUG [] CqlConversion:117 - getConversions() - From target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:118 - getConversions() -   To origin columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] Guardrail:94 - Guardrail is disabled. colSizeInKB=0.0
2023-12-15 21:51:26 INFO  [] WritetimeTTL:134 - PARAM -- Automatic TTLCols: [value]
2023-12-15 21:51:26 INFO  [] WritetimeTTL:139 - PARAM -- Automatic WriteTimestampCols: [value]
2023-12-15 21:51:26 DEBUG [] CqlConversion:60 - originColumnNames: [key, value, TTL(value)]
2023-12-15 21:51:26 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value, TTL(value)]-[0, 1, -1]
2023-12-15 21:51:26 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value, TTL(value)] of types [TEXT, TEXT, INT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:60 - originColumnNames: [key, value, TTL(value), WRITETIME(value)]
2023-12-15 21:51:26 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:26 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value, TTL(value), WRITETIME(value)]-[0, 1, -1, -1]
2023-12-15 21:51:26 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value, TTL(value), WRITETIME(value)] of types [TEXT, TEXT, INT, BIGINT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:26 INFO  [] WritetimeTTL:157 - Feature WritetimeTTL is enabled
2023-12-15 21:51:26 INFO  [] CopyJobSession:55 - CQL -- origin select: SELECT key,value,TTL(value),WRITETIME(value) FROM origin.smoke_autocorrect_kvp WHERE TOKEN(key) >= ? AND TOKEN(key) <= ? ALLOW FILTERING
2023-12-15 21:51:26 INFO  [] CopyJobSession:56 - CQL -- target select: SELECT key,value FROM target.smoke_autocorrect_kvp WHERE key=?
2023-12-15 21:51:26 INFO  [] CopyJobSession:57 - CQL -- target upsert: INSERT INTO target.smoke_autocorrect_kvp (key,value) VALUES (?,?) USING TTL ? AND TIMESTAMP ?
2023-12-15 21:51:27 INFO  [-9223372036854775808: 9223372036854775807] CopyJobSession:67 - ThreadID: 67 Processing min: -9223372036854775808 max: 9223372036854775807
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key1]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key1], ttl=0, writeTimestamp=1702677059832149, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] TargetInsertStatement:56 - bind using conversions: [CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, null, null]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key3]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key3], ttl=0, writeTimestamp=1702677059835401, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] TargetInsertStatement:56 - bind using conversions: [CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, null, null]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key2]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key2], ttl=0, writeTimestamp=1702677059833807, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:27 DEBUG [-9223372036854775808: 9223372036854775807] TargetInsertStatement:56 - bind using conversions: [CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, null, null]
2023-12-15 21:51:27 INFO  [] JobCounter:179 - ################################################################################################
2023-12-15 21:51:27 INFO  [] JobCounter:180 - Final Read Record Count: 3
2023-12-15 21:51:27 INFO  [] JobCounter:186 - Final Skipped Record Count: 0
2023-12-15 21:51:27 INFO  [] JobCounter:187 - Final Write Record Count: 3
2023-12-15 21:51:27 INFO  [] JobCounter:188 - Final Error Record Count: 0
2023-12-15 21:51:27 INFO  [] JobCounter:190 - ################################################################################################
2023-12-15 21:51:27 INFO  [] Migrate$:139 - ################################################################################################
2023-12-15 21:51:27 INFO  [] Migrate$:140 - ###                                  Migrate Job - Stopped                                   ###
2023-12-15 21:51:27 INFO  [] Migrate$:141 - ################################################################################################
2023-12-15 21:51:29 INFO  [] DiffData$:139 - ################################################################################################
2023-12-15 21:51:29 INFO  [] DiffData$:140 - ###                              Data Validation Job - Starting                              ###
2023-12-15 21:51:29 INFO  [] DiffData$:141 - ################################################################################################
2023-12-15 21:51:29 WARN  [] NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-12-15 21:51:30 INFO  [] PropertyHelper:225 - Processing explicitly set and known sparkConf properties
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.schema.origin.keyspaceTable] is configured with value [origin.smoke_autocorrect_kvp] and is type [STRING]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.schema.target.keyspaceTable] is configured with value [target.smoke_autocorrect_kvp] and is type [STRING]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.connect.target.host] is configured with value [cdm-sit-cass] and is type [STRING]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.autocorrect.missing] is configured with value [true] and is type [BOOLEAN]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.perfops.numParts] is configured with value [1] and is type [NUMBER]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.connect.origin.host] is configured with value [cdm-sit-cass] and is type [STRING]
2023-12-15 21:51:30 INFO  [] PropertyHelper:238 - Known property [spark.cdm.autocorrect.mismatch] is configured with value [true] and is type [BOOLEAN]
2023-12-15 21:51:30 INFO  [] PropertyHelper:243 - Adding any missing known properties that have default values
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.ratelimit.target] with default value [40000]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.password] with default value [cassandra]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.consistency.write] with default value [LOCAL_QUORUM]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.printStatsPerPart] with default value [false]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.port] with default value [9042]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.username] with default value [cassandra]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.enabledAlgorithms] with default value [TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.codecs.timestamp.string.format] with default value [yyyyMMddHHmmss]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.custom.writetime] with default value [0]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [test.numberList] with default value [1,2]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.username] with default value [cassandra]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.schema.origin.column.writetime.automatic] with default value [true]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.filter.java.token.percent] with default value [100]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.enabledAlgorithms] with default value [TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.custom.writetime.incrementBy] with default value [0]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.printStatsAfter] with default value [100000]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.ratelimit.origin] with default value [20000]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.batchSize] with default value [5]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [test.stringList] with default value [text1,text2]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.port] with default value [9042]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.consistency.read] with default value [LOCAL_QUORUM]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.feature.guardrail.colSizeInKB] with default value [0]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.password] with default value [cassandra]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.enabled] with default value [false]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [test.number] with default value [1]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [test.boolean] with default value [true]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.feature.constantColumns.splitRegex] with default value [,]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.transform.codecs.timestamp.string.zone] with default value [UTC]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.target.tls.trustStore.type] with default value [JKS]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.schema.origin.column.ttl.automatic] with default value [true]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.trustStore.type] with default value [JKS]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.fetchSizeInRows] with default value [1000]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.autocorrect.missing.counter] with default value [false]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [test.string] with default value [text]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.connect.origin.tls.enabled] with default value [false]
2023-12-15 21:51:30 DEBUG [] PropertyHelper:248 - Setting known property [spark.cdm.perfops.errorLimit] with default value [0]
2023-12-15 21:51:30 INFO  [] ConnectionFetcher:66 - PARAM --  SSL Enabled: false
2023-12-15 21:51:30 INFO  [] ConnectionFetcher:100 - Connecting to ORIGIN at cdm-sit-cass:9042
2023-12-15 21:51:31 INFO  [] ConnectionFetcher:66 - PARAM --  SSL Enabled: false
2023-12-15 21:51:31 INFO  [] ConnectionFetcher:100 - Connecting to TARGET at cdm-sit-cass:9042
2023-12-15 21:51:32 WARN  [] PlainTextAuthProviderBase:81 - [] cdm-sit-cass/172.16.242.2:9042 did not send an authentication challenge; This is suspicious because the driver expects authentication
2023-12-15 21:51:33 WARN  [] PlainTextAuthProviderBase:81 - [] cdm-sit-cass/172.16.242.2:9042 did not send an authentication challenge; This is suspicious because the driver expects authentication
2023-12-15 21:51:33 INFO  [] DiffData$:85 - PARAM -- Min Partition: -9223372036854775808
2023-12-15 21:51:33 INFO  [] DiffData$:86 - PARAM -- Max Partition: 9223372036854775807
2023-12-15 21:51:33 INFO  [] DiffData$:87 - PARAM -- Number of Splits : 1
2023-12-15 21:51:33 INFO  [] DiffData$:88 - PARAM -- Coverage Percent: 100
2023-12-15 21:51:33 INFO  [] SplitPartitions:39 - ThreadID: 1 Splitting min: -9223372036854775808 max: 9223372036854775807
2023-12-15 21:51:33 INFO  [] DiffData$:91 - PARAM Calculated -- Total Partitions: 1
2023-12-15 21:51:33 INFO  [] DiffData$:92 - Spark parallelize created : 1 slices!
2023-12-15 21:51:34 INFO  [] DiffJobSession:64 - PARAM -- Max Retries: 0
2023-12-15 21:51:34 INFO  [] DiffJobSession:65 - PARAM -- Partition file: ./origin.smoke_autocorrect_kvp_partitions.csv
2023-12-15 21:51:34 INFO  [] DiffJobSession:66 - PARAM -- Origin Rate Limit: 20000.0
2023-12-15 21:51:34 INFO  [] DiffJobSession:67 - PARAM -- Target Rate Limit: 40000.0
2023-12-15 21:51:34 DEBUG [] CqlConversion:60 - originColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value]-[0, 1]
2023-12-15 21:51:34 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:60 - originColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlTable:335 - Corresponding index for target: [key, value]-[0, 1]
2023-12-15 21:51:34 DEBUG [] CqlConversion:117 - getConversions() - From target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:118 - getConversions() -   To origin columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] Guardrail:94 - Guardrail is disabled. colSizeInKB=0.0
2023-12-15 21:51:34 INFO  [] WritetimeTTL:134 - PARAM -- Automatic TTLCols: [value]
2023-12-15 21:51:34 INFO  [] WritetimeTTL:139 - PARAM -- Automatic WriteTimestampCols: [value]
2023-12-15 21:51:34 DEBUG [] CqlConversion:60 - originColumnNames: [key, value, TTL(value)]
2023-12-15 21:51:34 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value, TTL(value)]-[0, 1, -1]
2023-12-15 21:51:34 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value, TTL(value)] of types [TEXT, TEXT, INT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:60 - originColumnNames: [key, value, TTL(value), WRITETIME(value)]
2023-12-15 21:51:34 DEBUG [] CqlConversion:61 - targetColumnNames: [key, value]
2023-12-15 21:51:34 DEBUG [] CqlTable:335 - Corresponding index for origin: [key, value, TTL(value), WRITETIME(value)]-[0, 1, -1, -1]
2023-12-15 21:51:34 DEBUG [] CqlConversion:117 - getConversions() - From origin columns [key, value, TTL(value), WRITETIME(value)] of types [TEXT, TEXT, INT, BIGINT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:118 - getConversions() -   To target columns [key, value] of types [TEXT, TEXT]
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 DEBUG [] CqlConversion:64 - CqlConversion() - fromDataType: TEXT/PRIMITIVE toDataType: TEXT/PRIMITIVE
2023-12-15 21:51:34 INFO  [] WritetimeTTL:157 - Feature WritetimeTTL is enabled
2023-12-15 21:51:34 INFO  [] DiffJobSession:55 - CQL -- origin select: SELECT key,value,TTL(value),WRITETIME(value) FROM origin.smoke_autocorrect_kvp WHERE TOKEN(key) >= ? AND TOKEN(key) <= ? ALLOW FILTERING
2023-12-15 21:51:34 INFO  [] DiffJobSession:56 - CQL -- target select: SELECT key,value FROM target.smoke_autocorrect_kvp WHERE key=?
2023-12-15 21:51:34 INFO  [] DiffJobSession:57 - CQL -- target upsert: INSERT INTO target.smoke_autocorrect_kvp (key,value) VALUES (?,?) USING TTL ? AND TIMESTAMP ?
2023-12-15 21:51:34 INFO  [] DiffJobSession:69 - PARAM -- Autocorrect Missing: true
2023-12-15 21:51:34 INFO  [] DiffJobSession:72 - PARAM -- Autocorrect Mismatch: true
2023-12-15 21:51:34 INFO  [] DiffJobSession:101 - CQL -- origin select: SELECT key,value,TTL(value),WRITETIME(value) FROM origin.smoke_autocorrect_kvp WHERE TOKEN(key) >= ? AND TOKEN(key) <= ? ALLOW FILTERING
2023-12-15 21:51:34 INFO  [] DiffJobSession:102 - CQL -- target select: SELECT key,value FROM target.smoke_autocorrect_kvp WHERE key=?
2023-12-15 21:51:34 INFO  [] DiffJobSession:103 - CQL -- target upsert: INSERT INTO target.smoke_autocorrect_kvp (key,value) VALUES (?,?) USING TTL ? AND TIMESTAMP ?
2023-12-15 21:51:34 INFO  [-9223372036854775808: 9223372036854775807] DiffJobSession:113 - ThreadID: 67 Processing min: -9223372036854775808 max: 9223372036854775807
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key1]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key1], ttl=0, writeTimestamp=1702677059832149, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key3]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key3], ttl=0, writeTimestamp=1702677059835401, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] PKFactory:99 - getTargetPK: newValues: [key2]; WritetimeTTL{loaded:true/valid:true/enabled:true}; explodeMapTargetKeyColumnIndex=-1
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:191 - getLargestWriteTimeStamp: customWritetime=0, writetimeSelectColumnIndexes=[3]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] WritetimeTTL:202 - getLargestTTL: ttlSelectColumnIndexes=[2]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] EnhancedPK:44 - EnhancedPK: values=[key2], ttl=0, writeTimestamp=1702677059833807, explodeMapKey=null, explodeMapValue=null
2023-12-15 21:51:34 DEBUG [[key1]:value] DiffJobSession:260 - Diff PK [key1], target/origin index: 1/1 target/origin column: value/value target/origin value: valueA/valueA
2023-12-15 21:51:34 DEBUG [[key1]:key] DiffJobSession:260 - Diff PK [key1], target/origin index: 0/0 target/origin column: key/key target/origin value: key1/key1
2023-12-15 21:51:34 DEBUG [[key3]:key] DiffJobSession:260 - Diff PK [key3], target/origin index: 0/0 target/origin column: key/key target/origin value: key3/key3
2023-12-15 21:51:34 DEBUG [[key3]:value] DiffJobSession:260 - Diff PK [key3], target/origin index: 1/1 target/origin column: value/value target/origin value: value999/valueC
2023-12-15 21:51:34 ERROR [-9223372036854775808: 9223372036854775807] DiffJobSession:213 - Mismatch row found for key: [key3] Mismatch: Target column:value-origin[valueC]-target[value999]; 
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] TargetInsertStatement:56 - bind using conversions: [CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, null, null]
2023-12-15 21:51:34 ERROR [-9223372036854775808: 9223372036854775807] DiffJobSession:219 - Corrected mismatch row in target: [key3]
2023-12-15 21:51:34 ERROR [-9223372036854775808: 9223372036854775807] DiffJobSession:194 - Missing target row found for key: [key2]
2023-12-15 21:51:34 DEBUG [-9223372036854775808: 9223372036854775807] TargetInsertStatement:56 - bind using conversions: [CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, CqlData{fromDataTypeList=[TEXT], toDataTypeList=[TEXT], conversionTypeList=[NONE]}, null, null]
2023-12-15 21:51:34 ERROR [-9223372036854775808: 9223372036854775807] DiffJobSession:205 - Inserted missing row in target: [key2]
2023-12-15 21:51:34 INFO  [] JobCounter:179 - ################################################################################################
2023-12-15 21:51:34 INFO  [] JobCounter:180 - Final Read Record Count: 3
2023-12-15 21:51:34 INFO  [] JobCounter:181 - Final Mismatch Record Count: 1
2023-12-15 21:51:34 INFO  [] JobCounter:182 - Final Corrected Mismatch Record Count: 1
2023-12-15 21:51:34 INFO  [] JobCounter:183 - Final Missing Record Count: 1
2023-12-15 21:51:34 INFO  [] JobCounter:184 - Final Corrected Missing Record Count: 1
2023-12-15 21:51:34 INFO  [] JobCounter:185 - Final Valid Record Count: 1
2023-12-15 21:51:34 INFO  [] JobCounter:186 - Final Skipped Record Count: 0
2023-12-15 21:51:34 INFO  [] JobCounter:190 - ################################################################################################
2023-12-15 21:51:34 INFO  [] DiffData$:139 - ################################################################################################
2023-12-15 21:51:34 INFO  [] DiffData$:140 - ###                              Data Validation Job - Stopped                               ###
2023-12-15 21:51:34 INFO  [] DiffData$:141 - ################################################################################################

msmygit (Collaborator, Author) commented:

% docker exec cdm-sit-cass cqlsh -e "select * from origin.smoke_autocorrect_kvp;select * from target.smoke_autocorrect_kvp;" 

 key  | value
------+--------
 key1 | valueA
 key3 | valueC
 key2 | valueB

(3 rows)

 key  | value
------+----------
 key1 |   valueA
 key3 | value999

(2 rows)

pravinbhat (Collaborator) commented:

Yes, I think this is a bug in the 4.x release in how it handles writetimes for default runs. The last fix, which handles the default writetimes correctly, breaks it for counter tables. We likely need to fix both (this was working in 3.x).

pravinbhat (Collaborator) commented:

After the latest commit, all smoke, regression & feature tests should work consistently, and the writetime/TTL feature should behave correctly.

@msmygit msmygit force-pushed the issue/build-stability-fixes branch from b83de4f to 15878dd on December 15, 2023 at 21:51
pravinbhat (Collaborator) left a review:

Approving, as it will make the build more stable even though it does not resolve all issues. We should merge this, since it's better than the current state.

@pravinbhat pravinbhat marked this pull request as ready for review December 16, 2023 05:59
@pravinbhat pravinbhat requested a review from a team as a code owner December 16, 2023 05:59

@pravinbhat pravinbhat merged commit 025b73e into main Dec 18, 2023
2 checks passed
@msmygit msmygit deleted the issue/build-stability-fixes branch December 18, 2023 14:40