Bug in OpenSSL stream socket filter which causes memory leaks

There's a bug in the OpenSSL stream socket filter which results in I/O buffers being leaked on both the inbound and outbound data paths. This causes the memory used by the client or server to grow throughout the life of the process. The bug has been present since release 6.2, when the structure of the filtering code was redesigned.

Note that since all of the filtering code was restructured at that time, the same bug is likely to be present in the SChannel, SSPI and compression filters as well.

I've fixed the problem in the OpenSSL code and will be running this through my test harnesses over the next few days. If you want the fix now then please get in touch; if you can wait, I expect 6.3.1 will be released next week.

This shows that my pre-release testing could still do with some work. Automated unit tests and server stress tests can show that the code works, but they don't necessarily catch everything, and they're only as good as their coverage in any case. I discovered this issue whilst building some new performance tests which use perfmon to monitor the process under test and produce reports at the end. I guess I now have to extend the scope of this project so that I can monitor expected max and min values for the counters I'm tracking - so I can see if the memory is ballooning during my automated tests...




I usually write about the development of The Server Framework, a super scalable, high performance, C++, I/O Completion Port based framework for writing servers and clients on Windows platforms.
