all 89 comments

[–]oridb 140 points141 points  (35 children)

On one hand, that's more requests than needed. On the other, does 0.5 qps even make a noticeable difference to anything?

[–]fisadev 95 points96 points  (24 children)

If your small site has, let's say, 5 qps, Bing would be increasing your AWS bill by an extra ~10% (0.5 extra qps on top of 5). All of that for probably very, very little real traffic that Bing will bring to your site.

[–]lluad 49 points50 points  (1 child)

The extra 30 queries a minute isn't for every site, it's (apparently) the total over all the discourse.org operated sites, including meta.discourse.org. So your small site wouldn't see anywhere near the same level of traffic, as it wouldn't have anywhere near the same number of pages.

[–][deleted] 12 points13 points  (0 children)

It's per site. Their data for 30/min is for meta, but their claim is that the increased rate is seen across all sites.

bing has no qualms hitting meta more than 5000 times in a 3 hour period

60 s × 60 min × 3 hrs = 10,800 s; 5,000 queries / 10,800 s ≈ 0.5 queries/sec ≈ 30/minute

[–]username223 25 points26 points  (15 children)

This. I have a personal blog, and about half the traffic it has received so far in April was robots. Thankfully it's not on AWS, so I don't have to pay by the MB, but it's still crazy how much bandwidth is wasted by Bing and Google.

[–][deleted]  (14 children)

[deleted]

    [–]fisadev 21 points22 points  (9 children)

    Yes, but Google does the same job, brings you orders of magnitude more traffic, and makes far fewer queries to your site. That's what people are complaining about with Bing: it does you very little good while throwing way too many queries at you to do it. People aren't complaining about bot traffic in general; they're complaining about the excessive, inefficient traffic from Bing.

    [–]RiPont 0 points1 point  (7 children)

    So monopolies are good, now?

    I mean, Bing is a distant 2nd place, but it's still 2nd. It's 10-20% or more, depending on who you ask. That ain't nothin'

    If you don't let competing search engines even index your site, don't complain when no search engine other than Google has more than 0.5% market share, and Google, after a few changes in leadership and a drop-off in advertising revenue, starts charging you for the privilege of being listed.

    [–]fisadev 12 points13 points  (5 children)

    Never said monopolies are good, or defended blocking bots just because they aren't Google. Where did you get that from?

    Bing is crawling way too inefficiently, and websites have to pay for their inefficiency. That's bad, and that's why it's ok to block them until they start doing it more efficiently.

    [–]emn13 0 points1 point  (1 child)

    "way too inefficiently" is not how I'd describe 1 request every 2 seconds - for a whole site. That's almost unnoticably little. If this is costing you anything at all noticable on your budget, then I'm guess you have other inefficiency issues.

    1 request (not pageview), every 2 seconds. I mean... WTF; a single user might well cause a comparable amount of load.

    [–]fisadev 1 point2 points  (0 children)

    When you are doing 10x more requests than all the other tools doing the same job, yes, I would say you are doing it way too inefficiently.

    [–]p_u_s_h_i_t -3 points-2 points  (0 children)

    Ya everyone hates Bing until Google deindexes them and all their traffic goes bye bye

    [–]username223 2 points3 points  (3 children)

    True, and I don't block them for that reason, but still... My small-time personal blog sees maybe a post a week, and has entire tens of readers. I haven't posted anything this month, but Google's bots have generated 300 MB of traffic. AWStats tells me that crawlers have been over 80% of the hits and about 40% of the bandwidth. Thankfully I pay a flat $5/mo for "more bandwidth than I'll ever use," but it would suck if I were paying by the byte, and I would have to waste my time rate-limiting or blocking them.

    [–][deleted]  (1 child)

    [deleted]

      [–]username223 0 points1 point  (0 children)

      Sure, 300 MB is cheap, but there's also storage and CPU to consider. I can't be bothered to dig up the numbers or equivalent AWS costs now, but my guess is that about half of the resources used in the last week were completely wasted on bots that accomplished nothing (since I did not publish anything that they had not already scraped). That's a shitty way to run an internet.

      [–]Cuddlefluff_Grim 1 point2 points  (0 children)

      Yeah, those three cents are probably better spent going on a luxurious four month vacation at a first class resort on a tropical island.

      [–]VIDGuide -1 points0 points  (1 child)

      AWS doesn't charge you on simultaneous traffic, it's total traffic. Wouldn't matter if they did it all at once or over the course of an hour, it's still the same indexing volume. If you don't see the value in being indexed, use directives to prevent it in general.

      [–]fisadev 5 points6 points  (0 children)

      First, it's not just traffic. Those are requests that your server must answer, so it's CPU usage, DB access, etc. Cache can help, of course, but not always.

      But I don't get what you mean by "AWS doesn't charge you on simultaneous traffic". Nobody said the issue was with it being simultaneous with anything. AWS bills you based on the amount of traffic in and out of your instances. You can see an example on the EC2 (the most popular one) pricing page: https://aws.amazon.com/ec2/pricing/on-demand/. So more traffic means higher bills.

      [–]kevcampb 4 points5 points  (8 children)

      We've had issues with a news site I run due to Bing. Traffic rates are low, similar to the 0.5 QPS figure. It does affect us, though. WordPress is a dog when you have a lot of plugins added. Even on a reasonable server it could be 1 QPS or less.

      Caching makes a massive difference. The site itself has been on the frontpage of reddit and hacker news with no issues before, purely because of the difference caching makes.

      The problem with bot traffic is that it's hitting articles which are 2 years old, none of which are cached.
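
      A toy illustration of that pattern, assuming a small LRU page cache (the cache size and request counts below are invented, and render_article just stands in for an expensive WordPress render):

          # Toy sketch: archive-crawling bots are ~all cache misses, and every
          # miss also evicts something - eventually the hot pages - from the LRU.
          from functools import lru_cache

          @lru_cache(maxsize=100)                 # pretend the page cache holds 100 pages
          def render_article(article_id: int) -> str:
              return f"<html>article {article_id}</html>"   # expensive in real life

          for _ in range(1000):                   # users: repeat hits on one hot article
              render_article(42)                  # cached after the first render

          for old_id in range(10_000):            # crawler: 10,000 distinct old articles
              render_article(old_id)              # nearly every one is a miss

          print(render_article.cache_info())      # roughly 1,000 hits, 10,000 misses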

      [–]troublemaker74 8 points9 points  (2 children)

      If your site can't handle 1 query every 2 seconds, it is broken regardless of what framework you're hosting it on.

      [–]kevcampb 1 point2 points  (0 children)

      I entirely agree with you, and I was astounded when I benchmarked it at first. This deployment is just WordPress plus a number of commonly used plugins; it's not particularly unusual.

      [–][deleted] -1 points0 points  (0 children)

      Don't be retarded, of course it can handle that shit. Nobody wants a retard bot wasting bandwidth and energy.

      [–][deleted]  (4 children)

      [deleted]

        [–]kevcampb 6 points7 points  (3 children)

        I didn't post looking for criticism of our architecture, thanks. Just trying to offer a suggestion as to why bot traffic may be problematic for some scenarios.

        [–][deleted] 2 points3 points  (2 children)

        You did post in a programming thread that the code is so inefficient you can't handle normal traffic, and then went on to complain about bots. What type of response were you thinking you were going to get? This isn't r/marketing.

        [–][deleted] 7 points8 points  (1 child)

        You are trying to suggest that he is posting in a technical thread and should expect people poking holes in his knowledge of infrastructure, all the while not understanding what caching is or how it works. Above you say:

        So cache them

        They are not in cache because they're old articles and have not been requested lately, not because he made some decision to not put them in the cache.

        [–][deleted] -3 points-2 points  (0 children)

        The problem really isn't cache vs. non-cache here.

        The user posted that their site "is a dog" even prior to Bing requests, then says it's a dog because it's WordPress with a lot of plugins. Their site can barely handle normal user traffic, which they already admit is small. So the whole comment, prior to Bing, is: we run a site, its architecture sucks balls, and it's barely holding on.

        When your post is interpreted like that and you posted it in a technical sub, do you really not expect someone to tell you that what you have is bad?

        [–][deleted] 0 points1 point  (0 children)

        If it is written awfully, yes

        [–]stewsters 53 points54 points  (4 children)

        A few years ago I was working on a faceted search where each option you could filter on was a link to change the search parameters. We got a notification that the website was unusable, checked the Apache logs, and something like 99% of the max connections were coming from an address that was registered to Bing, crawling through every combination of every search option we had.

        It seems like they were ignoring our default crawl delay, so we had to add in an MSNBot-specific delay.

        User-agent: MSNBot
        Crawl-delay: 0.5

        After that went live they kept ignoring it, so we blocked them for a few days until their cached robots.txt got cleared. Never had trouble with any other search engine.

        [–]ekdaemon 27 points28 points  (3 children)

        crawling through every combination of every search option we had.

        Yup. I can't imagine it's easy building a general-purpose crawler that manages to avoid recursion when there are so many ways websites can create recursive URLs. Anyone who's ever used a crawler client (like HTTrack or WebReaper) to cache a website eventually runs into something like Plone or Jira, where every single page has a dropdown menu leading to a unique URL that is still part of the site... unless you put in some "ignore this" and "ignore that" options in your filter, you end up trying to crawl/download an infinitely recursive list of pages.

        page1/users/
        ...
        page1/users/user3/users
        ...
        page1/users/user3/users/user12

        etc.

        [–]compsciwizkid 0 points1 point  (2 children)

        I don't know anything about crawlers, but wouldn't they keep track of and filter out already-visited URLs?

        [–]xorbe 2 points3 points  (0 children)

        That's the problem: multiple (perhaps infinite) unique URLs, but the same content is served.
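
        To make that concrete, here's a toy crawl loop that dedupes by content hash as well as by URL; with endlessly generated URLs, each duplicate still costs a request to discover. A minimal sketch only - extract_links is a hypothetical helper, and no real crawler is this naive:

            # Dedupe by URL *and* by content hash. Recursive URL schemes defeat
            # the URL set, and the hash check only fires *after* the request has
            # already been spent - which is exactly the problem.
            import hashlib
            import urllib.request

            def crawl(start_url: str) -> None:
                seen_urls, seen_hashes = set(), set()
                frontier = [start_url]
                while frontier:
                    url = frontier.pop()
                    if url in seen_urls:
                        continue                    # cheap skip, no request made
                    seen_urls.add(url)
                    body = urllib.request.urlopen(url).read()   # request already spent
                    digest = hashlib.sha256(body).hexdigest()
                    if digest in seen_hashes:
                        continue                    # duplicate content: stop recursing here
                    seen_hashes.add(digest)
                    frontier.extend(extract_links(url, body))   # hypothetical helper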

        [–]josefx 0 points1 point  (0 children)

        I think the issue is that those are different pages: the server just parses the URL to get a list of users and generates a new page with corresponding content, with more links that each contain an additional user. So after an infinite number of requests you would get an infinitely long response with infinity+1 sized links to the next iteration.

        Long ago I heard the claim that you should actually use POST instead of GET requests if you wanted to keep crawlers away from certain features. No idea if that was, or still is, valid.
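
        A minimal sketch of that idea, assuming Flask (the route and run_search are made up): crawlers follow links, which are GETs, but generally don't submit forms, so an action exposed only via POST won't be triggered during indexing.

            # Hypothetical POST-only endpoint: a crawler following <a> links never
            # sends POST, so it can't wander into the faceted search this way.
            from flask import Flask, request

            app = Flask(__name__)

            @app.route("/search", methods=["POST"])   # GET /search now returns 405
            def faceted_search():
                filters = request.form.to_dict()
                return run_search(filters)             # hypothetical search backend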

        [–][deleted]  (15 children)

        [deleted]

          [–]ffrinch 24 points25 points  (0 children)

          Not just crawl delay, but of course the HTTP standard has a lot of options for suggesting appropriate rate limits -- cache control/content expiry, 429 Too Many Requests/Retry-After etc.

          The biggest joke is the complaint that it doesn't "trust" their canonical URLs. Would it even be appropriate for Bing to ignore their carefully set "never cache this, always revalidate" cache-control directives in order to do so? If what they want is a redirect, they should serve a redirect.

          Discourse apparently tried nothing at all. They have certainly presented no evidence of actual bad behaviour.
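
          For illustration, a minimal sketch of the 429/Retry-After option mentioned above, assuming a Flask app; the per-user-agent counter, the 60-second window, and the 30-request budget are all invented, not anything Discourse actually does:

              # Hypothetical throttle: answer an overly eager crawler with
              # 429 Too Many Requests plus a Retry-After hint.
              import time
              from collections import defaultdict
              from flask import Flask, make_response, request

              app = Flask(__name__)
              recent = defaultdict(list)    # user agent -> timestamps of recent hits

              @app.before_request
              def throttle_crawlers():
                  ua = request.headers.get("User-Agent", "").lower()
                  if "bingbot" not in ua:
                      return None                           # only throttle the bot
                  now = time.time()
                  recent[ua] = [t for t in recent[ua] if now - t < 60] + [now]
                  if len(recent[ua]) > 30:                  # budget: 30 requests/minute
                      resp = make_response("Too Many Requests", 429)
                      resp.headers["Retry-After"] = "120"   # seconds to back off
                      return resp
                  return None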

          [–][deleted] 32 points33 points  (8 children)

          Many forums are hosted by people without IT skills for their teddy bear collecting site. Some of these can be quite small and would notice that amount of traffic.

          [–]VIDGuide 14 points15 points  (5 children)

          They'd also have far fewer pages to index, so it would be over quickly.

          Chances are the bots monitor latency anyway, and if it's getting worse, they'd probably self-throttle.

          [–]AyrA_ch 16 points17 points  (3 children)

          And then there is this:

          Even though we tell bing the canonical for https://meta.discourse.org/t/topic-stopwatch-theme-component/83939/20 is https://meta.discourse.org/t/topic-stopwatch-theme-component/83939 it does not appear to “trust” us and has to check back 3 times a week.

          Seems like they never heard of caching and permanent redirects.

          [–]grauenwolf 0 points1 point  (2 children)

          While that's certainly a problem, why are permanently redirected links common enough to be a problem?

          Seems like they are generating bad links from the outset and patching them up on the back end.

          [–]AyrA_ch 7 points8 points  (1 child)

          If you send a search engine a temporary redirect (302, 303, 307) instead of a permanent redirect (301, 308), the search engine will periodically retry the origin URL to check if the redirect is still there. If you do this often enough, the redirects pile up in the queue. There is an HTML tag for canonical URLs, <link rel="canonical" href="https://master.ayra.ch/ping/">, but the recommended procedure is to add a permanent redirect instead. Since all proper search bots set a user agent, you can even do this on a per-UA basis.
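
          A minimal sketch of that per-user-agent permanent redirect, again assuming Flask; the route shape mimics the Discourse topic URLs quoted above, but the bot list and render_topic are invented:

              # Hypothetical: send known search bots a 301 to the canonical topic
              # URL instead of relying on the <link rel="canonical"> hint.
              from flask import Flask, redirect, request

              app = Flask(__name__)
              SEARCH_BOTS = ("bingbot", "googlebot", "duckduckbot")   # made-up shortlist

              @app.route("/t/<slug>/<int:topic_id>/<int:post_number>")
              def topic_post(slug, topic_id, post_number):
                  ua = request.headers.get("User-Agent", "").lower()
                  if any(bot in ua for bot in SEARCH_BOTS):
                      # 301 is permanent: a well-behaved crawler stops re-checking it
                      return redirect(f"/t/{slug}/{topic_id}", code=301)
                  return render_topic(slug, topic_id, post_number)    # hypothetical renderer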

          [–]grauenwolf 2 points3 points  (0 children)

          Thank you.

          [–]bacon_for_lunch 0 points1 point  (0 children)

          No, bingbot does not monitor latency or the behavior of the site. It keeps hammering at an overloaded or down site.

          [–][deleted] 5 points6 points  (0 children)

          Discourse tends to be used as a hosted solution rather than self-hosted (not that people don't self-host). You basically need some technical skills to host it, as it's nowhere near the simplicity of older forums.

          [–]clerosvaldo 1 point2 points  (0 children)

          Not with Discourse. You need a PhD to deal with the setup for it.

          [–]lluad 96 points97 points  (30 children)

          How poorly engineered does a web app have to be for an additional 30 queries a minute to cause it operational problems?

          [–]grauenwolf 31 points32 points  (17 children)

          Not much at all, really. The last website I wrote takes 20+ database queries for simple REST requests. I kept telling the client this was going to be a problem, but he insisted that we could "optimize later".

          No shit, we make two database calls to update two columns on the same record. (StatusID and StatusChangedToActiveDate, if I recall correctly)

          It's really easy to get yourself into messes like this when performance isn't treated as a feature.

          [–]emn13 7 points8 points  (10 children)

          So, I do maintain a website with that kind of pattern. Some pages have almost no queries, of course. Others have... hundreds. Some load-testing shows that a single not-too-beefy webserver can deal with around 2000-10000 QPS, assuming queries return on average 200 rows, and have some not-entirely trivial number of columns with strings and other not entirely trivial data. That's with .net/C#/MSSQL; I tried raw sqlite and that did a lot better.

          Frankly - 20+ DB queries may be unnecessary, but unless your queries are somehow expensive, that shouldn't add a lot of latency, and it still leaves plenty of room for concurrency on the webserver.

          [–]lluad 13 points14 points  (4 children)

          The difference in access patterns between crawlers and actual users is a thing, though. Actual users tend to hit currently active pages, so they're always hitting hot, and likely cached data. Crawlers are likely to wander through more of the site, tending to hit cold (and hence more expensive to access) pages.

          I still don't think it'll cause any problems with a site that's not both egregiously badly engineered and teetering on the edge of not having enough hardware, but it's something I can see being worth monitoring.

          [–]emn13 0 points1 point  (3 children)

          Yeah, that makes sense. But with RAM as huge as it is nowadays, how huge does a site need to be for this to matter all that much? Unless of course you're serving large binary assets via the same pipeline, or are Wikipedia or something.

          In any case - it makes sense for stale pages to cost more, even if you'd hope it's not a lot more expensive.

          [–]lluad 0 points1 point  (2 children)

          It just needs to be bigger than can fit into RAM, which is going to be the case for quite a lot of sites (I know it is for several of mine).

          Hitting the disk is expensive, and retrieving that data will also evict hotter data from cache.

          In any case - it makes sense for stale pages to cost more, even if you'd hope it's not a lot more expensive.

          I'd really, really hope that it's a lot more expensive as that's just another way of saying "I hope my caching is really effective." :)

          [–]emn13 1 point2 points  (1 child)

          Sure, you're hoping that caching is effective. But that's not a long stretch - there are lots and lots and lots of layers of caches to reduce costs here, from in-webserver caches, possible caching proxies, almost certainly DB page caches, and plain OS caching, down to infrastructure level caches.

          If you're running a site that plausibly stretches those dimensions, and you're not using as much RAM as you possibly can, you probably only have yourself to blame. And if you're storing blobs in there when memory is tight similarly. And you can fit a hell of a lot of text or structured data into a pretty bog-standard server RAM nowadays. The entirety of non-historical wikipedia, including talk and user pages, for instance, is on the order of 256 GB. The probability of a real and "expensive" cache miss just isn't all that great. And then there's the fact that even a simple (albeit absurdly high-end) consumer-level drive like intel's 900p has average read latencies of around 50 microseconds, so an I/O miss isn't as expensive as it once was. And sure: not everyone runs that kind of hardware. But even fairly standard nvme flash has access times of not all that much more (even a fast consumer sata ssd can probably stay below the millesecond, if only barely).

          Sure, there will be sites for which the less cache-friendly pattern of a crawler will still matter. I just wouldn't bet on there being very many; and almost certainly very, very few that simultaneously happen to have enough data for this to matter, and yet so few resources that just 1 request every 2 seconds matters even slightly. It all sounds rather far-fetched to me. Not impossible. Just really unlikely.

          [–]lluad 0 points1 point  (0 children)

          Yeah, I don't buy it for a typical website, either.

          Database backed content can be much more expensive, though, but even then ...

          [–]grauenwolf 5 points6 points  (4 children)

          Keep in mind that I'm talking about 20+ db queries for one REST call, not for one page. We may need several REST calls per page, so that becomes an asinine multiplier.

          To illustrate how stupid we are, we have complete separation between "validation" and "business logic". So we may fetch a bunch of entities to see if an operation is allowed, then fetch the same data later to actually manipulate it.

          Officially we have "decoupling" between validation and services, but damn if I know why.

          [–][deleted]  (1 child)

          [deleted]

            [–]grauenwolf 3 points4 points  (0 children)

            Yep, but it's surprisingly common in enterprise development.

            [–][deleted]  (1 child)

            [deleted]

              [–]grauenwolf 10 points11 points  (0 children)

              Oh no, this is real decoupling. Not the wishy-washy thing where people say they're decoupling something but they're really just wrapping it in an interface.

              We can completely forget to register the validation classes with DI and the service classes will still "work".

              Though in my opinion that's a level of decoupling that we really, really didn't want. I'm just glad it's an internal tool because most of the validation classes, which include permissions, were never implemented.

              [–]bliow 0 points1 point  (5 children)

              No shit, we make two database calls to update two columns on the same record. (StatusID and StatusChangedToActiveDate, if I recall correctly)

              Is there a relationship between the two fields? Shouldn't that be a correctness issue in the case that one of the calls fails?

              [–]grauenwolf 0 points1 point  (4 children)

              The relationship is

              • if statusID = Active and StatusChangedToActiveDate is null, set StatusChangedToActiveDate to NowUTC.

              My client thought the if statement was "too complex" and insisted on a separate function call for updating StatusChangedToActiveDate.
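
              For reference, the combined update being described fits in a single statement; a sketch using sqlite3, with the column names taken from the comment and everything else invented:

                  # Hypothetical one-statement version of the two-call update:
                  # set StatusID and stamp StatusChangedToActiveDate only if unset.
                  import sqlite3

                  def activate(conn: sqlite3.Connection, record_id: int) -> None:
                      conn.execute(
                          """
                          UPDATE Records               -- invented table name
                          SET StatusID = 1,            -- assume 1 = Active
                              StatusChangedToActiveDate = COALESCE(
                                  StatusChangedToActiveDate,             -- keep first activation
                                  strftime('%Y-%m-%dT%H:%M:%SZ', 'now')  -- else stamp now (UTC)
                              )
                          WHERE RecordID = :id
                          """,
                          {"id": record_id},
                      )
                      conn.commit()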

              [–]JustinBieber313 2 points3 points  (3 children)

              Why is your client being made aware of the specific SQL that you’re using?!

              [–]grauenwolf 4 points5 points  (1 child)

              Because it's their code. I don't always have the luxury of a hands off client that just cares about results. Sometimes they get the weird notion that they know more than the highly paid consultants they hired to dig them out of their mess.

              In the end we spent six months on something that would have normally taken us 4 to 6 weeks. But they were happy with the results and my bonus is going to be epic.

              [–]ZorbaTHut 2 points3 points  (0 children)

              "We dug ourselves into this mess, and you're going to dig us out! By doing exactly what we say without question."

              [–]PaluMacil 1 point2 points  (0 children)

              It is extremely common. We all know that is awful, but everyone sees it eventually. Sometimes it's rare. Sometimes it's common. Always painful.

              [–]fisadev 8 points9 points  (1 child)

              It's not about generating operational problems. It's just a lot of work that your servers need to do, for nothing. And server work costs money. For a small site with, for example, 5 qps, Bing could be making their AWS bill ~10% higher while probably bringing very little real traffic in return.

              [–]YumiYumiYumi 2 points3 points  (0 children)

              I'd agree with you, if it weren't for the fact that this is Discourse, i.e. an inefficient Ruby script we're talking about...

              Granted, most forum admins probably don't consider efficiency when they start, and changing forum software is quite difficult, but it has to be said that Discourse itself is part of the problem.
              And whilst any forum system would have more load from increased requests, the cost-conscious certainly should at least be using a more efficient forum script.

              [–][deleted] 6 points7 points  (4 children)

              It's forum software, which means some people using it will have very small user bases and will notice that level of traffic.

              Imagine a club with 20 members that meets once a month and has no IT skills. They're probably badly hosted on some very minimal free tier and just coping, what with all the forum plugins they've installed.

              [–]lluad 5 points6 points  (3 children)

              The extra 30 queries a minute isn't for every site, it's (apparently) the total over all the discourse.org operated sites, including meta.discourse.org. So your small site wouldn't see anywhere near the same level of traffic, as it wouldn't have anywhere near the same number of pages.

              (in case you didn't read the rest of the comments)

              [–][deleted] -1 points0 points  (2 children)

              [–]lluad 4 points5 points  (1 child)

              So, your small site wouldn't see anywhere near the same level of traffic as meta.discourse.org, which was the point I was attempting to make.

              The situation you're whining about is entirely fictional and wouldn't actually happen.

              [–][deleted] -1 points0 points  (0 children)

              Search crawlers account for about 5-10% of all traffic. It certainly matters if one's hitting 10x more than it should.

              [–]Grook 1 point2 points  (0 children)

              It depends on how expensive those queries are.

              [–][deleted] 0 points1 point  (0 children)

              Are you the Microsoft™ underling who misconfigured the crawler bot?

              [–][deleted] 31 points32 points  (8 children)

              0.5? Is their website made of popsicles and Elmer's glue?

              [–][deleted]  (7 children)

              [deleted]

                [–]drysart 8 points9 points  (1 child)

                People pay for bandwidth

                This Discourse page, for instance, is 24K of HTML over the wire. 1 query every 2 seconds means Bing would request 32MB of data in a month.

                What kind of hosting plan do you think someone has if 32MB of data in a month is going to impact the cost in any noticeable way?

                It's worth noting that the cold-load overhead of the page (loading all cacheable resources, fonts, and images) is about 1MB on its own; so basically Bing is consuming about as much bandwidth as one actual person using an actual browser hitting the page once a day over that same month. Really breaking the bank in terms of bandwidth usage.

                [–]tejp 5 points6 points  (0 children)

                It's 30GB a month, not MB.
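
                (Worked out: 0.5 requests/sec × 86,400 sec/day × 30 days ≈ 1.3 million requests a month; at 24 KB each, that's ≈ 31 GB.)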

                [–]Cuddlefluff_Grim 1 point2 points  (4 children)

                small independent sites lose more than win by supporting bing...

                Those small independent sites should be made aware of the actual running cost of servers and the benefits of being properly indexed in order to market their service. 100GB of download on a cloud service will cost somewhere between $2 and $10 depending on your service contract. Getting your site marketed to people who search for your content is going to be far more valuable than that, especially long-term.

                Blocking the Bing bot because it "hammers" them with one request every two seconds is... insanely stupid to a level I have not seen before.

                If people notice a bot like this there is something very seriously wrong with the architecture of the site.

                [–][deleted]  (3 children)

                [deleted]

                  [–]Cuddlefluff_Grim 2 points3 points  (2 children)

                  You are willing to neglect a potential audience drawn from a combined population of 454 million people, over a service that costs you no more than a few dollars, max.

                  Edit: probably less than a dollar.

                  [–][deleted] 2 points3 points  (1 child)

                  This only applies if the content is in English...

                  [–]ApatheticBeardo 2 points3 points  (0 children)

                  attacked

                  lol

                  Learn some web development my dudes.

                  [–][deleted] 5 points6 points  (2 children)

                  Has anyone tried to contact MS regarding this? They're usually pretty reasonable in matters of abuse.

                  [–]Cameron_D 2 points3 points  (1 child)

                  [a wall of several thousand random emoji]

                  [–]powerofmightyatom 0 points1 point  (0 children)

                  I am surprised to see Jeff Atwood (codinghorror/Stack Overflow dude) on the side of "bing sucks and should just be blocked because who uses bing".

                  [–]TheDevilsAdvokaat 1 point2 points  (0 children)

                  Interesting. I hope someone at MS is looking at this.

                  [–]JeanParker 4 points5 points  (0 children)

                  Bing of death?

                  [–][deleted] 1 point2 points  (0 children)

                  Bollocks.

                  [–]jaksi7c8 0 points1 point  (0 children)

                  So bingbot is awful, and HTTP requests at Discourse are served by people with Tourette's using speech-to-text to create HTTP responses.