If your servers are all publicly reachable, they are exposed, not safe. If you are worried about your infrastructure being compromised, don't run your public servers straight off your ISP connection; have your network administrator run them behind properly managed infrastructure.
But even if your servers are not supposed to be reachable from the outside world, knowing exactly which of them actually are is always a good thing. When someone does get at your servers, as happened with the DDoS attack on Reddit, the servers you care about are precisely the ones at risk.
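One way to find out which of your services are actually reachable is a simple TCP connect with a short timeout. Here is a minimal sketch in Python; it tests against a throwaway listener on localhost so the example is self-contained, but in practice you would point it at your servers' public addresses from a host outside your own network.

```python
# Minimal reachability check: attempt a TCP connection with a short timeout.
import socket

def is_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Throwaway listener standing in for a "publicly reachable" service.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))        # let the OS pick a free port
listener.listen(1)
open_port = listener.getsockname()[1]

print(is_reachable("127.0.0.1", open_port))  # port is open -> True
listener.close()
print(is_reachable("127.0.0.1", open_port))  # nothing listening -> False
```

The timeout matters: without one, a firewalled host that silently drops packets can stall the check for minutes instead of failing fast.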
In the video, the game's developers explain that the "public" servers are not really locked down: anyone who can reach them is capable of performing a DDoS attack against them. So even if those servers are not meant to be reachable from the outside world, once you know they are publicly reachable, you still have to care about them. It's like caring about Reddit: you want your subreddits to stay online, so you don't block the subreddits you like.
And as a result, other people care about them too. It's almost a reverse-DDoS dynamic: because Reddit is public, you have to take care of it.
The same goes for Facebook: if you're on Facebook, you have to take care of your presence there. Google does the same thing. It gives you a bunch of features that make your work easier, but you still have to pay for them, one way or another.
I think this is a big missed opportunity. Google, Facebook, Reddit, and the other "social networks" were built to let people collaborate and to make information freely available to everyone, not to make collaboration impossible. It would have been great if the big tech companies had focused much more on keeping information genuinely public, so people could collaborate, or at least keep control of what they got.
I see this sort of thing happening all the time with public search engines. It's good that the big tech companies are starting to pay attention to the "social" aspect of their websites; Google wants to do this, but isn't doing it very well. When you search for something, the engine doesn't magically read your mind: you still have to go and type a concrete query into the search box before it can show you results.
I know I'm in a minority on this, but one thing Google and the other search engines do is make sure their crawlers actually see your site. If a crawler finds something on your site that it considers important, it can't ignore it; in other words, it treats your site as a potential entry point into the search engine results.
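Whether a crawler is even allowed to see a page is governed by the site's robots.txt rules. As a quick sanity check, you can evaluate those rules offline with Python's standard-library parser; the robots.txt content below is a made-up example for illustration, not taken from any real site.

```python
# Sketch: check whether a given crawler may fetch a given URL,
# using the stdlib robots.txt parser on an inline (made-up) policy.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group, which allows everything.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # True
# Any other bot falls back to the "*" group, which blocks /private/.
print(rp.can_fetch("SomeBot", "https://example.com/private/page"))    # False
```

In production you would load the live file with `rp.set_url(...)` and `rp.read()` instead of parsing an inline string, but the rule evaluation is identical.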
That entry point is often the only search result people ever see for a given term. It isn't that people type in the first few letters of a name and the results page magically appears; there's a reason it works the way it does. If the crawler has indexed nothing that matches your query, the engine shows something like "This site has no pages. Your search did not find any results."
I know this might sound weird, but the reason it's so hard for a search engine to surface your site is that so many different sites are competing for the same query. It's like a "sneak attack" on your site. If search engines (including Google) knew which sites you were actually on, you could theoretically have a better chance of ranking higher in search.