I was recently curious about what sort of SSL/TLS connections were being used to access our Exchange cluster via IIS. We have systems spanning twenty years, from Windows XP all the way to Windows Server 2019. Who knows what “solutions” developers or power users have come up with over the years to integrate with our mail system?
Previously, I’d followed Microsoft’s guidance in “Enabling TLS 1.2” and “New IIS Functionality to Help Identify Weak TLS Usage”. The IIS blog explains how to record four server variables as custom fields in the IIS logs to track security protocols and cipher suites (a sketch of registering them follows the list):
- CRYPT_PROTOCOL
- CRYPT_CIPHER_ALG_ID
- CRYPT_HASH_ALG_ID
- CRYPT_KEYEXCHANGE_ALG_ID
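For reference, here’s roughly how those four fields can be registered from PowerShell with the WebAdministration module. This is a sketch, not the exact method from the Microsoft post (which walks through the IIS Manager UI), and “Default Web Site” is an assumption - substitute your Exchange frontend site name.

# Sketch: map the four SChannel server variables to custom W3C log fields.
# "Default Web Site" is a placeholder for your actual site.
Import-Module WebAdministration

$filter = "system.applicationHost/sites/site[@name='Default Web Site']/logFile/customFields"
$fields = @{
    'crypt-protocol'    = 'CRYPT_PROTOCOL'
    'crypt-cipher'      = 'CRYPT_CIPHER_ALG_ID'
    'crypt-hash'        = 'CRYPT_HASH_ALG_ID'
    'crypt-keyexchange' = 'CRYPT_KEYEXCHANGE_ALG_ID'
}
ForEach ($field in $fields.GetEnumerator()) {
    Add-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Filter $filter -Name '.' -Value @{
        logFieldName = $field.Key
        sourceName   = $field.Value
        sourceType   = 'ServerVariable'
    }
}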
First, I needed to figure out what was actually being recorded by the logs.
#Date: 2021-03-04 00:00:03
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status sc-substatus sc-win32-status time-taken crypt-protocol crypt-cipher crypt-hash crypt-keyexchange OriginalClientIP
2021-03-03 23:59:55 10.10.1.103 POST /EWS/Exchange.asmx &CorrelationID=<empty>;&cafeReqId=7a4501f0-3d61-425d-8d1f-0c3d5c9e6d13; 443 DOMAIN\webmail 10.10.1.58 ExchangeServicesClient/15.00.0913.015 - 200 0 0 38 400 660e 800c ae06 -
2021-03-03 23:59:55 10.10.1.103 POST /EWS/Exchange.asmx &CorrelationID=<empty>;&cafeReqId=42acb956-ea0a-4a32-a097-0bc73ae92860; 443 - 10.10.12.120 ExchangeServicesClient/15.00.0913.015 - 401 1 2148074254 1 400 660e 800c ae06 -
2021-03-03 23:59:55 10.10.1.103 POST /EWS/Exchange.asmx &CorrelationID=<empty>;&cafeReqId=0e312297-90fc-4b4d-8a28-24c3c60bcb7e; 443 - 10.10.12.120 ExchangeServicesClient/15.00.0913.015 - 401 1 2148074254 2 400 660e 800c ae06 -
The information I needed was in the crypt-protocol field - the 16th column, or index 15 once a line is split, since PowerShell arrays are zero-based. A value of ‘400’ corresponds to TLS 1.2, ‘40’ to TLS 1.0, ‘10’ to SSLv3, and so forth. There’s Microsoft documentation that explains the various values.
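Those values are hexadecimal SChannel protocol flags. A small lookup table makes the later output easier to read; the mapping below covers the common server-side values, but treat it as an assumption and verify it against the Microsoft documentation:

# Common crypt-protocol values (hex SChannel SP_PROT_*_SERVER flags).
# Verify against Microsoft's documentation before relying on this mapping.
$ProtocolName = @{
    '4'   = 'SSL 2.0'
    '10'  = 'SSL 3.0'
    '40'  = 'TLS 1.0'
    '100' = 'TLS 1.1'
    '400' = 'TLS 1.2'
}
$ProtocolName['400']   # -> TLS 1.2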
So, let’s write some sloppy proof-of-concept code to parse the IIS log files on one server. We can then wrap it in a foreach statement, or something like that, to query all the servers once it’s working. Let’s do some rough calculations on how long this will take by wrapping it in Measure-Command:
# Naive first pass: parse the whole log as a space-delimited CSV and
# collect every crypt-protocol value into an array.
$Protocol = @()
Measure-Command {
    Import-Csv C:\inetpub\logs\LogFiles\W3SVC1\u_ex210713_x.log -Delimiter ' ' -Header 'date', 'time', 's-ip', 'cs-method', 'cs-uri-stem', 'cs-uri-query', 's-port', 'cs-username', 'c-ip', 'cs(User-Agent)', 'cs(Referer)', 'sc-status', 'sc-substatus', 'sc-win32-status', 'time-taken', 'crypt-protocol', 'crypt-cipher', 'crypt-hash', 'crypt-keyexchange', 'OriginalClientIP' | ForEach-Object { $Protocol += $_.'crypt-protocol' }
}
"$((Get-Content C:\inetpub\logs\LogFiles\W3SVC1\u_ex210713_x.log).Count) lines processed."
Days : 0
Hours : 0
Minutes : 17
Seconds : 1
Milliseconds : 253
Ticks : 10212530494
TotalDays : 0.0118200584421296
TotalHours : 0.283681402611111
TotalMinutes : 17.0208841566667
TotalSeconds : 1021.2530494
TotalMilliseconds : 1021253.0494
This command took 17 minutes to run against one file on one server! I keep 10 log files on each server (a separate script automatically purges anything older, to save disk space). Aside from this script seeming ridiculously slow, at this rate it’ll be a whole day before my curiosity is satisfied - this simply won’t do!
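As an aside, that purge job amounts to a one-liner. Here’s a hypothetical reconstruction of it, assuming daily rollover and that only the ten newest files should survive:

# Keep the ten newest IIS logs, delete the rest (hypothetical sketch).
Get-ChildItem C:\inetpub\logs\LogFiles\W3SVC1\*.log |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 10 |
    Remove-Item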
Instead of the slow process of reading the whole file into memory, enumerating the columns, and appending to an array that we’re constantly resizing, let’s use a faster method. I’ll read in 1000 rows at a time, split each row to reach the crypt-protocol column (index 15), exclude the TLS 1.2 traffic, header lines, and localhost requests, and add the client IPs of whatever remains to an array.
Measure-Command {
    $Protocol = @()
    $nlines = 0
    Get-Content C:\inetpub\logs\LogFiles\W3SVC1\u_ex210713_x.log -ReadCount 1000 | ForEach-Object {
        ForEach ($line in $_) {
            # Keep the client IP (field 8) when the protocol (field 15) isn't
            # TLS 1.2, a header line, or blank, and the request isn't local.
            If ($line.Split(' ')[15] -ne "400" -and $line.Split(' ')[15] -ne "-" -and $line.Split(' ')[15] -ne "time-taken" -and $line.Split(' ')[8] -ne "::1" -and $line.Split(' ')[8] -ne "127.0.0.1") {
                $Protocol += $line.Split(' ')[8]
            }
        }
        $nlines += $_.Count
    }
}
$Protocol | Group-Object
[string]::Format("{0} lines", $nlines)
Days : 0
Hours : 0
Minutes : 12
Seconds : 14
Milliseconds : 149
Ticks : 7341497722
TotalDays : 0.00849710384490741
TotalHours : 0.203930492277778
TotalMinutes : 12.2358295366667
TotalSeconds : 734.1497722
TotalMilliseconds : 734149.7722
Well, 12 minutes is somewhat better - a 30% improvement - but still not good enough. Let’s get rid of the tremendously slow array addition. Arrays are fixed in size: when you use the += operator, PowerShell actually creates a new array containing the values of the original array plus the added value. Each time I add a value, the entire existing array is copied into a new array with room for one more, so the cost keeps growing as the array does. It’s terribly inefficient when you have large amounts of data like the 129,102 entries in this file.
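To see the difference in isolation, here’s a hypothetical micro-benchmark comparing += against the .NET collections (the iteration count is arbitrary):

# Hypothetical micro-benchmark: growing a collection 100,000 times.
$n = 100000

# += copies the whole array on every addition: roughly O(n^2) in total.
(Measure-Command {
    $array = @()
    ForEach ($i in 1..$n) { $array += $i }
}).TotalSeconds

# ArrayList.Add() grows in amortized constant time.
(Measure-Command {
    $list = [System.Collections.ArrayList]@()
    ForEach ($i in 1..$n) { $null = $list.Add($i) }
}).TotalSeconds

# A typed generic List is the usual modern alternative.
(Measure-Command {
    $generic = [System.Collections.Generic.List[int]]::new()
    ForEach ($i in 1..$n) { $generic.Add($i) }
}).TotalSeconds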
Let’s use a .NET collection to fix this speed issue. An ArrayList with its .Add() method is far more efficient for what I’m doing.
Measure-Command {
    $Protocol = [System.Collections.ArrayList]@()
    Get-Content C:\inetpub\logs\LogFiles\W3SVC1\u_ex210713_x.log -ReadCount 1000 | ForEach-Object {
        ForEach ($line in $_) {
            If ($line.Split(' ')[15] -ne "400" -and $line.Split(' ')[15] -ne "-" -and $line.Split(' ')[15] -ne "time-taken" -and $line.Split(' ')[8] -ne "::1" -and $line.Split(' ')[8] -ne "127.0.0.1") {
                # ArrayList.Add() returns the new index; discard it with $null.
                $null = $Protocol.Add($line.Split(' ')[8])
            }
        }
    }
}
Days : 0
Hours : 0
Minutes : 0
Seconds : 8
Milliseconds : 314
Ticks : 83145768
TotalDays : 9.62335277777778E-05
TotalHours : 0.00230960466666667
TotalMinutes : 0.13857628
TotalSeconds : 8.3145768
TotalMilliseconds : 8314.5768
A 12,180% speed increase (roughly 120 times faster) over my first code is much better. Now let’s run this against all 10 files in the directory, then dump the statistics I’m interested in:
Measure-Command {
    $Protocol = [System.Collections.ArrayList]@()
    Get-Content C:\W3SVC1\*.log -ReadCount 1000 | ForEach-Object {
        ForEach ($line in $_) {
            If ($line.Split(' ')[15] -ne "400" -and $line.Split(' ')[15] -ne "-" -and $line.Split(' ')[15] -ne "time-taken" -and $line.Split(' ')[8] -ne "::1" -and $line.Split(' ')[8] -ne "127.0.0.1") {
                $null = $Protocol.Add($line.Split(' ')[8])
            }
        }
    }
}
$Protocol | Group-Object | Select-Object Count, Name
"$($Protocol.count) matching entries in $((Get-Content C:\W3SVC1\*.log).count) processed lines."
Days : 0
Hours : 0
Minutes : 1
Seconds : 8
Milliseconds : 7
Ticks : 680073775
TotalDays : 0.000787122424768519
TotalHours : 0.0188909381944444
TotalMinutes : 1.13345629166667
TotalSeconds : 68.0073775
TotalMilliseconds : 68007.3775
PS C:\> $Protocol | Group-Object | Select-Object Count, Name
Count Name
----- ----
4 10.1.0.32
1340 10.1.1.250
8 169.254.3.178
85 172.16.0.12
PS C:\> "$($Protocol.count) matching entries in $((Get-Content C:\W3SVC1\*.log).count) processed lines."
1445 matching entries in 1589761 processed lines.
This code, covering all 10 files, is 1,400% faster than the code I started with to read a single file. Four IPs are connecting to Exchange with something other than TLS 1.2, but they’re all private IPs belonging to the Exchange cluster itself or the replication network.
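From here, fanning the search out to every server is straightforward with PowerShell remoting. A sketch follows; the server names are hypothetical, and I’m assuming the logs live in the same path on each machine:

# Hypothetical fan-out: run the fast parser on every Exchange server at once.
$servers = 'EXCH01', 'EXCH02', 'EXCH03'   # placeholder names
Invoke-Command -ComputerName $servers -ScriptBlock {
    $Protocol = [System.Collections.ArrayList]@()
    Get-Content C:\inetpub\logs\LogFiles\W3SVC1\*.log -ReadCount 1000 | ForEach-Object {
        ForEach ($line in $_) {
            If ($line.Split(' ')[15] -ne "400" -and $line.Split(' ')[15] -ne "-" -and $line.Split(' ')[15] -ne "time-taken" -and $line.Split(' ')[8] -ne "::1" -and $line.Split(' ')[8] -ne "127.0.0.1") {
                $null = $Protocol.Add($line.Split(' ')[8])
            }
        }
    }
    $Protocol | Group-Object | Select-Object Count, Name
} | Sort-Object PSComputerName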