
We have a ton of security features to limit abuse of the system. A site that cannot handle the impact of a single basic (free) test can be DoS'd once, yes. But we limit how often we will generate load to (or from, really) individual IP addresses, so the second or third time in a day someone tries to "load test" the same site, the test will be denied. This means you can at most DoS-attack a small site for a few minutes per day. We hope that will be enough to make would-be abusers go somewhere else, i.e. make it not worth their time.
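That per-target rate limit could be sketched roughly like this (a hypothetical implementation; the threshold of two tests per day and the rolling-window approach are assumptions, not the service's actual code). Keying the log on domain instead of IP would give the per-domain variant suggested elsewhere in this thread:

```python
import time
from collections import defaultdict

# Hypothetical limiter: allow at most MAX_TESTS_PER_DAY load tests
# against any single target IP within a rolling 24-hour window.
MAX_TESTS_PER_DAY = 2
WINDOW_SECONDS = 24 * 60 * 60

_test_log = defaultdict(list)  # target IP -> timestamps of past tests

def may_test(target_ip, now=None):
    """Return True and record the test if target_ip is under its
    daily quota; return False (deny the test) otherwise."""
    now = time.time() if now is None else now
    # Drop entries older than the rolling window.
    recent = [t for t in _test_log[target_ip] if now - t < WINDOW_SECONDS]
    _test_log[target_ip] = recent
    if len(recent) >= MAX_TESTS_PER_DAY:
        return False
    recent.append(now)
    return True
```

A rolling window avoids the midnight-reset loophole of a fixed calendar-day counter, at the cost of keeping per-target timestamps around.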


You may want to put a per-domain-per-day limit in there somewhere, otherwise you are going to have a very stressful day the first time someone on SomethingAwful hears of this wonderful new toy.

(It's a DDoS in a can if you have a few dozen people capable of reading directions.)


Or simply add a way to claim a server by uploading a specific file to verify ownership.

You probably don't want to do this because it would make the process much harder, but you could allow it in reverse: let owners claim a domain in order to deny load tests against it.


We actually have that feature also :-) Although it doesn't kick in until you try to run a test with higher load levels. Then you need to have a "loadimpact.txt" file in the root of your server, or the test won't start.
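The ownership check amounts to fetching that file and matching it against the account, something like the sketch below. The URL path follows the comment above, but the token format and the idea that the file must contain an account-specific token are assumptions; the real service's file contents aren't specified here:

```python
from urllib.request import urlopen
from urllib.error import URLError

def body_verifies(body, expected_token):
    """Return True if the fetched loadimpact.txt body contains the
    account token. (Hypothetical: the real file format is unknown.)"""
    return expected_token in body

def domain_is_claimed(domain, expected_token, timeout=5):
    """Fetch http://<domain>/loadimpact.txt and verify it. Any network
    or HTTP error counts as an unclaimed domain (test denied)."""
    url = f"http://{domain}/loadimpact.txt"
    try:
        with urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (URLError, OSError):
        return False
    return body_verifies(body, expected_token)
```

Failing closed on any fetch error is the safe default: if ownership can't be proven, the high-load test doesn't run.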


Like others have suggested, please require that before any testing. I like the concept of your site, but I really don't want people DoS'ing my site for any amount of time, even a few minutes. I know they can do it manually, but giving even noobs a nice, easy interface for it makes me a little scared of your site.

I'm guessing that you want to give a "live demo" to people, but I'm sure you could just show a demo report of a couple of sites that have already been tested.


We need to be more open about what security measures we have built into the service, I think. I can tell you right now that the free tests most people are running have a built-in cutoff feature that aborts the test if it detects that the server on the other end is starting to respond significantly slower than it did at the start of the test.

Example: the test starts by simulating 10 clients. The average page load time measured is 500 ms. The test then moves on to test using 20 clients. If the average page load time goes above 1 second (twice what it was originally), the test will be aborted.
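The ramp-and-abort rule in that example can be sketched as a simple loop (a hypothetical sketch; the measurement callback, the client levels, and the 2x slowdown factor are taken from the example above, not from the service's actual code):

```python
# Abort the ramp-up if the average page load time exceeds twice
# the baseline measured at the lowest load level.
SLOWDOWN_FACTOR = 2.0

def run_ramp(measure_avg_load_ms, levels=(10, 20, 30, 40, 50)):
    """measure_avg_load_ms(clients) -> average page load time in ms
    at that number of simulated clients (supplied by the caller).
    Returns (levels_completed, aborted)."""
    baseline = measure_avg_load_ms(levels[0])
    completed = [levels[0]]
    for clients in levels[1:]:
        avg = measure_avg_load_ms(clients)
        if avg > baseline * SLOWDOWN_FACTOR:
            return completed, True  # server is slowing down: abort
        completed.append(clients)
    return completed, False
```

With the numbers from the comment (500 ms baseline at 10 clients), the test aborts at the first level where the average exceeds 1000 ms.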

We have a general security philosophy that is based on different levels of trust. An anonymous user is trusted the least, a registered but non-paying user is trusted a little bit more, and a paying user is trusted even more. The level of trust determines how often you can run tests, how much data your tests are allowed to transfer, how often you can place load on individual destination servers (IPs), what settings you can make for your tests, etc.
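Those tiered limits could be represented as a simple policy table along these lines. Every number here is invented for illustration; the comment above only describes the tiers, not the actual quotas:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrustPolicy:
    tests_per_day: int    # how often you can run tests
    max_transfer_mb: int  # how much data your tests may move
    max_clients: int      # highest load level you may configure

# Hypothetical quotas -- the real service's limits are not public here.
POLICIES = {
    "anonymous":  TrustPolicy(tests_per_day=2,   max_transfer_mb=50,   max_clients=10),
    "registered": TrustPolicy(tests_per_day=10,  max_transfer_mb=500,  max_clients=50),
    "paying":     TrustPolicy(tests_per_day=100, max_transfer_mb=5000, max_clients=500),
}

def policy_for(user_tier):
    """Look up the policy for a tier; unknown tiers fall back to the
    least-trusted (anonymous) policy, failing safe."""
    return POLICIES.get(user_tier, POLICIES["anonymous"])
```

Defaulting unknown tiers to the anonymous policy keeps the system fail-safe if a new tier name slips through unhandled.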


Thanks for the reply, but I still think there's a huge disconnect.

First of all, you're admitting that it's okay for anonymous people to inflict a significant, measurable delay in page load time for my entire website. That is completely unacceptable.

Also, while I'm sure your system is well designed, it probably isn't perfect (because nothing is). If your monitoring service malfunctions then the tests might never cut off.

There should only be two levels of trust: people who own the server (and can test it), and people who don't own the server (and can do nothing to it).



