
Top Crawler Bot IP Ranges for Search Engine Optimization

In the world of Search Engine Optimization (SEO), understanding the behavior of search engine crawlers is crucial. These crawlers, also known as bots or spiders, are automated programs used by search engines like Google, Bing, and others to scan and index the content of websites. By identifying the IP ranges of these crawlers, webmasters can optimize their websites more effectively. This article delves into the top crawlers, their IP ranges, and how this knowledge benefits SEO.

Engine endpoints:

  • Google IP Ranges: https://www.gstatic.com/ipranges/goog.json
  • Google Bots: https://developers.google.com/static/search/apis/ipranges/googlebot.json
  • Google Special Crawlers: https://developers.google.com/static/search/apis/ipranges/special-crawlers.json
  • Google User-Triggered Fetchers: https://developers.google.com/static/search/apis/ipranges/user-triggered-fetchers.json
  • Google Cloud (global and regional external IP address ranges for customers’ resources): https://www.gstatic.com/ipranges/cloud.json
  • Bingbot IP Ranges: https://www.bing.com/toolbox/bingbot.json
  • DuckDuckGo Bots: https://duckduckgo.com/duckduckgo-help-pages/results/duckduckbot/
  • Ahrefs Crawler IP Ranges: https://api.ahrefs.com/v3/public/crawler-ip-ranges
  • Yandex IP Ranges: https://yandex.com/ips
  • Facebook IP Ranges: https://developers.facebook.com/docs/sharing/webmasters/crawler/
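
Most of these endpoints return machine-readable JSON, so the published ranges can be pulled straight into scripts or firewall allowlists. A minimal sketch using curl and jq (assuming both are installed; the .prefixes structure below matches Google’s googlebot.json, while other vendors use their own schemas):

# Fetch the published Googlebot ranges and print the IPv4 CIDR blocks.
curl -s https://developers.google.com/static/search/apis/ipranges/googlebot.json \
  | jq -r '.prefixes[].ipv4Prefix // empty'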

References:

  • All Crawlers User Agents: https://gist.github.com/josuamarcelc/6bfbdc14c6292e195844032bea7211d1
  • Google Crawler Indexing: https://developers.google.com/search/docs/crawling-indexing/verifying-googlebot
  • Yandex Robots: https://yandex.com/support/webmaster/robot-workings/check-yandex-robots.html
  • Moz RogerBot: https://moz.com/help/moz-procedures/crawlers/rogerbot
  • Verify Bingbot: https://www.bing.com/webmasters/help/verify-bingbot-2195837f

Cloud IPs:

  • Cloudflare IP Ranges v4: https://www.cloudflare.com/ips-v4/
  • Cloudflare IP Ranges v6: https://www.cloudflare.com/ips-v6/
  • Cloudflare API IP Ranges: https://api.cloudflare.com/client/v4/ips
  • Yandex Cloud IPs: https://cloud.yandex.com/en/docs/vpc/concepts/ips

Understanding Search Engine Crawlers

What Are Crawlers?

Crawlers are automated programs that visit websites to read and index their content. They follow links from one page to another, thereby creating a map of the web that search engines use to provide relevant search results.

Importance in SEO

Recognizing crawlers is essential in SEO as it ensures that your website is indexed correctly. Proper indexing increases the chances of your website appearing in search results, thereby driving organic traffic.

Top Search Engine Crawlers and Their IP Ranges

Googlebot

  • Primary Role: Indexing websites for Google Search.
  • IP Range: Googlebot IPs typically fall within the range owned by Google. However, due to the vast number of IP addresses Google owns, it’s more efficient to verify Googlebot by using the reverse DNS lookup method.
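
The reverse DNS check works in both directions: resolve the visiting IP address to a hostname, confirm the hostname ends in googlebot.com or google.com, then resolve that hostname back and confirm it returns the original IP. A minimal sketch with dig (66.249.66.1 is only an illustrative address; substitute an IP from your own server logs, and use the hostname that step 1 actually returns):

# Step 1: reverse lookup - should return a *.googlebot.com or *.google.com name.
dig -x 66.249.66.1 +short

# Step 2: forward lookup of the returned hostname - should match the original IP.
dig crawl-66-249-66-1.googlebot.com +short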

Bingbot

  • Primary Role: Crawling for Microsoft’s Bing search engine.
  • IP Range: Bingbot also uses a range of IP addresses. Similar to Googlebot, it’s advisable to use reverse DNS lookups to confirm the legitimacy of Bingbot.

Baiduspider

  • Primary Role: Indexing for the Baidu search engine, predominantly used in China.
  • IP Range: Baiduspider’s IP ranges are published by Baidu and can be found in their webmaster tools documentation.

Yandex Bot

  • Primary Role: Crawling for Russia’s Yandex search engine.
  • IP Range: Yandex provides a list of IP addresses for its crawlers, which can be found in their official documentation.

Why Knowing IP Ranges Matters

  1. Security: Distinguishing between legitimate crawlers and malicious bots is crucial for website security.
  2. Accurate Analytics: Identifying crawler traffic helps in obtaining more accurate analytics data, as it separates human traffic from bot traffic.
  3. SEO Optimization: Understanding crawler behavior helps in optimizing websites for better indexing and ranking.
  4. Resource Management: It helps in managing server resources effectively, as crawlers can consume significant bandwidth.

Best Practices for Managing Crawler Traffic

  • Robots.txt File: Use this to guide crawlers on which parts of your site to scan and which to ignore.
  • Monitoring Server Logs: Regularly check server logs for crawler activity to ensure that your site is being indexed properly (see the sketch after this list).
  • Updating Sitemaps: Keep your sitemaps updated to aid crawlers in efficient website navigation.
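
For the log-monitoring point above, even a one-liner can show how much of your traffic claims to be a crawler. A minimal sketch, assuming an nginx access log at the default path (adjust the path and the user-agent string for your own setup):

# Count requests per IP address among clients claiming to be Googlebot.
grep -i "googlebot" /var/log/nginx/access.log \
  | awk '{print $1}' | sort | uniq -c | sort -rn | head

Since user agents can be spoofed, pair this with the reverse DNS verification described earlier before trusting any of these IPs.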

Conclusion

Recognizing and understanding the IP ranges of top search engine crawlers is a vital aspect of SEO. It helps in distinguishing between genuine search engine bots and potential security threats, enhances website performance, and contributes to more effective SEO strategies. As search engines evolve, staying informed about crawler activities and best practices is essential for maintaining and improving your website’s search engine visibility.

100 Reasons To Stay Alive

“Reasons to Stay Alive” is a memoir by Matt Haig, published in 2015, that explores the author’s personal experiences with depression and anxiety. The book combines Haig’s own journey through mental health struggles with insights into how he found reasons to keep living, offering a mix of personal narrative, reflection, and thoughts on mental well-being.

Throughout the book, Matt Haig shares his struggles with mental health and offers a message of hope and resilience. The title suggests that within the challenges and darkness of life, there are reasons to find joy, purpose, and meaning.

It’s important to note that if you or someone you know is struggling with mental health issues, seeking support from mental health professionals, friends, or family is crucial. “Reasons to Stay Alive” is just one person’s perspective, and individual experiences with mental health can vary.

  1. to make your parents proud
  2. to conquer your fears
  3. to see your family again
  4. to see your favourite artist live
  5. to listen to music again
  6. to experience a new culture
  7. to make new friends
  8. to inspire
  9. to have your own children
  10. to adopt your own pet
  11. to make yourself proud
  12. to meet your idols
  13. to laugh until you cry
  14. to feel tears of happiness
  15. to eat your favorite food
  16. to see your siblings grow
  17. to pass school
  18. to get a tattoo
  19. to smile until your cheeks hurt
  20. to meet your internet friends
  21. to find someone who loves you like you deserve
  22. to eat ice cream on a hot day
  23. to drink hot chocolate on a cold day
  24. to see untouched snow in the morning
  25. to see a sunset that sets the sky on fire
  26. to see stars light up the sky
  27. to read a book that changes your life
  28. to see the flowers in the spring
  29. to see the leaves change from green to brown
  30. to travel abroad
  31. to learn a new language
  32. to learn to draw
  33. to tell others your story in the hopes of helping them
  34. Puppy kisses.
  35. Baby kisses (the open mouthed kind when they smack their lips on your cheek).
  36. Swear words and the release you feel when you say them.
  37. Trampolines.
  38. Ice cream.
  39. Stargazing.
  40. Cloud watching.
  41. Taking a shower and then sleeping in clean sheets.
  42. Receiving thoughtful gifts.
  43. “I saw this and thought of you.”
  44. The feeling you get when someone you love says, “I love you.”
  45. The relief you feel after crying.
  46. Sunshine.
  47. The feeling you get when someone is listening to you/giving you their full attention.
  48. Your future wedding.
  49. Your favorite candy bar.
  50. New clothes.
  51. Witty puns.
  52. Really good bread.
  53. Holding your child in your arms for the first time.
  54. Completing a milestone (aka going to college, graduating college, getting married, getting your dream job.)
  55. The kind of dreams where you wake up and can’t stop smiling.
  56. The smell before and after it rains
  57. The sound of rain against a rooftop.
  58. The feeling you get when you’re dancing.
  59. The person (or people) that mean the most to you. Stay alive for them.
  60. Trying out new recipes.
  61. The feeling you get when your favorite song comes on the radio.
  62. The rush you get when you step onto a stage.
  63. You have to share your voice and talents and knowledge with the world because they are so valuable.
  64. Breakfast in bed.
  65. Getting a middle seat in the movie theater.
  66. Breakfast for dinner (because it’s so much better at night than in the morning).
  67. Prayer (if you are religious).
  68. Forgiveness.
  69. Water balloon fights.
  70. New books by your favorite authors.
  71. Fireflies.
  72. Birthdays.
  73. Realizing that someone loves you.
  74. Spending the day with someone you love.
  75. Opportunity to create meaningful and lasting relationships.
  76. Potential to learn, grow, and evolve as a person.
  77. Joy and happiness in the little things.
  78. The power to inspire others.
  79. The ability to create art, music, and other forms of self-expression.
  80. To explore different cultures, traditions, and ways of life.
  81. To make a positive impact on the environment and help protect the planet.
  82. Experience the joys of parenthood and raise a family.
  83. Learn new things and develop new skills.
  84. Create a legacy that will outlive you.
  85. Being wrapped up in a warm bed.
  86. Cuddles
  87. Holding hands.
  88. The kind of hugs when you can feel a weight being lifted off your shoulders. The kind of hug where your breath syncs with the other person’s, and you feel like the only two people in the world.
  89. Singing off key with your best friends.
  90. Road trips.
  91. Spontaneous adventures.
  92. The feeling of sand beneath your toes.
  93. The feeling when the first ocean wave rolls up and envelops your toes and ankles and knees.
  94. Thunderstorms.
  95. Your first (or hundredth) trip to Disneyland.
  96. The taste of your favorite food.
  97. The child-like feeling you get on Christmas morning.
  98. The day when everything finally goes your way.
  99. Compliments and praise.
  100. to look on this moment in 10 years time and realize you did it.

P.S.: Never forget that you are a beautiful person 💕 Life is so beautiful, so live. Live like no one else exists, live for yourself, and don’t mind the bad people. You are strong, and I love you.

Quota Exceeded While Requesting Indexing in Google Search Console

Search Analytics

Search Analytics quota falls into two types: load limits and QPS limits. The “quota exceeded” error is the same for all quota-exceeded events.

Load quota

Load represents the internal resources consumed by a query. Most users will not exceed load limits, but if you do, you will receive a “quota exceeded” error message. The Search Analytics resource enforces the following load limits:

  • Short-term load quota: Short-term quota is measured in 10-minute chunks. To fix:
    • If you exceed your quota, wait 15 minutes and try again. If you still exceed quota, you are exceeding long-term quota.
    • If you are exceeding short-term quota only, spread out your queries throughout the day.
  • Long-term load quota: Long-term quota is measured in 1-day chunks. If you exceed quota when running only a single query inside a 10-minute period, you are exceeding your long-term quota. To fix:
    • Queries are expensive when you group and/or filter by either page or query string. Queries grouped/filtered by page AND query string are the most expensive. To reduce the load of these queries, remove the grouping and/or filtering for the page and/or query string.
    • Query load increases with the date range queried, so a query with a six-month range is much more expensive than one with a one-day range.
    • Avoid requerying the same data (for example, querying all data for last month over and over).
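
If you hit these limits programmatically, the usual remedy is to back off and retry. A minimal sketch against the Search Analytics query endpoint, assuming an OAuth access token in $TOKEN and a URL-encoded property in $SITE (both are placeholders, and error handling is trimmed down to the retry loop):

# Retry a Search Analytics query when the API signals quota exhaustion.
# Quota errors typically surface as HTTP 429 (sometimes 403); adjust as needed.
for attempt in 1 2 3; do
  code=$(curl -s -o response.json -w '%{http_code}' \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"startDate": "2023-12-01", "endDate": "2023-12-07"}' \
    "https://www.googleapis.com/webmasters/v3/sites/${SITE}/searchAnalytics/query")
  [ "$code" != "429" ] && break  # stop on success or on a non-quota error
  sleep 900                      # the short-term window is 10 minutes; wait 15
done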

QPS quota

The Search Analytics resource enforces the following QPS (queries per second), QPM (queries per minute), and QPD (queries per day) limits:

  • Per-site quota (calls querying the same site):
    • 1,200 QPM
  • Per-user quota (calls made by the same user):
    • 1,200 QPM
  • Per-project quota (calls made using the same Developer Console key):
    • 30,000,000 QPD
    • 40,000 QPM

Example

  • User A can make up to 1,200 QPM combined across her 3 websites.
  • Users A and B can make up to 1,200 QPM combined across their one shared website.

URL inspection

  • Per-site quota (calls querying the same site):
    • 2,000 QPD
    • 600 QPM
  • Per-project quota (calls made using the same Developer Console key):
    • 10,000,000 QPD
    • 15,000 QPM

All other resources

  • Per-user limit (calls made by the same user):
    • 20 QPS
    • 200 QPM
  • Per-project limit (calls made using the same Developer Console key):
    • 100,000,000 QPD

Reference: https://developers.google.com/webmaster-tools/limits

Yoast SEO Certifications at the End of 2023

Is SEO certification worth it?

SEO certifications are a fantastic way for people who are new to the industry to learn the basics of SEO and to prove to current and future employers that you have a solid understanding of the field. They can also teach website owners and marketing managers how to do simple SEO tasks.

Yoast SEO Certification

Yoast offers the following courses:

  • All-around SEO – In this course, you’ll learn practical SEO skills on every key aspect of SEO, to make your site stand out.
  • Yoast SEO for WordPress – In this course, you’ll learn how to set up and use the Yoast SEO for WordPress plugin so it makes SEO even easier.
  • SEO copywriting – In this course, you’ll learn how to write awesome copy that is optimized for ranking in search engines.
  • Understanding structured data – Do you want to take a deep dive into structured data? In this course, you’ll learn the theory related to structured data in detail.
  • Keyword research – Do you know the essential first step of good SEO? It’s keyword research. In this training, you’ll learn how to research and select the keywords that will guide searchers to your pages.
  • Local SEO – Do you own a local business? This course will teach you how to make sure your local audience can find you in the search results and on Google Maps!
  • International SEO – Are you selling in countries all over the world? In this course, you’ll learn all about setting up and managing a site that targets people in different languages and locales.
  • Ecommerce SEO – Learn how to optimize your online shop for your customers and for search engines!
  • Block editor training – Start creating block-tastic content with the new WordPress block editor! Learn all about the block editor and what you can do with it.
  • Technical SEO: Crawlability and indexability – You have to make it possible for search engines to find your site, so they can display it in the search results. We’ll tell you all about how that works in this course!
  • Technical SEO: Hosting and server configuration – Choosing the right type of hosting for your site is the basis of a solid technical SEO strategy. Learn all about it in this course!
  • Structured data for beginners – Learn how to make your site stand out from the crowd by adding structured data!
  • SEO for beginners – In this free course, you’ll get quick wins to make your site rank higher in Google, Bing, and Yahoo.
  • WordPress for beginners – Do you want to set up your own WordPress site? This course will teach you the ins and outs of creating and maintaining a WordPress website!
  • Technical SEO (deprecated) – How can you detect and solve technical issues that prevent your site from ranking well? No technical background? No problem!

Google crawlability and indexability are fundamental concepts in search engine optimization (SEO) that determine how well a website’s content can be discovered and included in Google’s search index. Let’s explore these concepts:

  1. Crawlability:
    • Definition: Crawlability refers to the ability of search engine bots (like Googlebot) to access and crawl the pages of a website.
    • Importance: If a webpage is not crawlable, search engines won’t be able to discover its content. Factors that affect crawlability include the website’s robots.txt file, the structure of URLs, and the use of navigation elements.
    • Robots.txt File: Websites often use a robots.txt file to provide instructions to search engine crawlers. This file can specify which parts of the site should not be crawled. However, it’s crucial to ensure that important content is not unintentionally blocked.
    • XML Sitemap: Creating and submitting an XML sitemap is a best practice. The sitemap provides a list of URLs on the site, helping search engines understand the structure and prioritize crawling.
    • Website Architecture: A well-organized website architecture with clear navigation paths aids crawlability. Internal links between pages also contribute to effective crawling.
  2. Indexability:
    • Definition: Indexability refers to whether the content crawled by search engines is eligible and suitable for inclusion in the search index.
    • Importance: Even if a page is crawled, it may not necessarily be indexed. Factors that affect indexability include the quality of content, the presence of duplicate content, and the use of canonical tags.
    • Content Quality: High-quality and unique content is more likely to be indexed. Google aims to provide users with valuable and relevant information in its search results.
    • Canonicalization: Duplicate content issues can be addressed using canonical tags. These tags specify the preferred version of a page, consolidating signals for similar or identical content.
    • Meta Robots Tags: HTML meta tags such as <meta name="robots" content="index, follow"> can be used to explicitly indicate that a page should be indexed.
    • Noindex and Nofollow: Conversely, pages can include meta tags like <meta name="robots" content="noindex, nofollow"> to instruct search engines not to index or follow links on the page.
    • 404 Errors: Pages returning a “404 Not Found” status code are generally not indexed. Regularly address broken links and 404 errors.

Understanding and managing crawlability and indexability are essential for effective SEO. Regularly monitoring these factors, using tools like Google Search Console, and following best practices help ensure that your website’s content is properly crawled, indexed, and made available in search results.

Cornerstone content refers to the most important and foundational pieces of content on a website. This content is typically comprehensive, authoritative, and serves as a cornerstone for the rest of the site. Cornerstone content plays a crucial role in providing a solid foundation for both visitors and search engines.

Key characteristics of cornerstone content include:

  1. Comprehensive and In-Depth: Cornerstone content is usually more extensive and in-depth than regular articles or blog posts. It covers a broad topic relevant to the website’s main theme or niche.
  2. Authoritative: Cornerstone content establishes the website as an authority on the chosen topic. It showcases the expertise of the content creator and provides valuable information to the audience.
  3. Evergreen: Ideally, cornerstone content is evergreen, meaning it remains relevant over time. While regular blog posts may focus on current events or trends, cornerstone content addresses fundamental aspects of a subject that do not quickly become outdated.
  4. Interlinked: Cornerstone content is often interlinked with other pages on the website. It acts as a hub that connects to and supports related articles and posts, creating a cohesive structure.
  5. SEO-Focused: Cornerstone content is essential for search engine optimization (SEO). When well-optimized, it can attract organic traffic by targeting key search terms and providing valuable information that search engines recognize as authoritative.

Examples of cornerstone content may include comprehensive guides, tutorials, or in-depth analyses related to the central themes of a website. For instance, a fitness website might have a cornerstone article on “The Ultimate Guide to Building Muscle,” while a travel blog could have a cornerstone piece on “Essential Tips for Budget Travelers.”

Creating and maintaining cornerstone content is a strategic approach for improving a website’s visibility, authority, and user experience. It helps the site establish a strong presence in search engine results and provides valuable resources for visitors seeking comprehensive information.

Error – Invalid sitemap URL detected; syntax not understood

Per Google’s robots.txt documentation, sitemap: [Optional, zero or more per file] specifies the location of a sitemap for this site. The sitemap URL must be a fully-qualified URL; Google doesn’t assume or check http/https/www/non-www alternates. Sitemaps are a good way to indicate which content Google should crawl, as opposed to which content it can or cannot crawl.

Before

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /refer/
Disallow: /tag/
Disallow: /feed/
Disallow: /author/admin/

Sitemap: https://josuamarcelc.com/sitemap_index.xml
Sitemap: https://josuamarcelc.com/image_sitemap_20221010.xml
Sitemap: https://josuamarcelc.com/page-sitemap.xml
Sitemap: https://josuamarcelc.com/post-sitemap.xml
Sitemap: https://josuamarcelc.com/web-story-sitemap.xml

The fix for this invalid sitemap error is simply to add Allow: before each Sitemap: line.

Update robots.txt, clear the domain cache, then request a recrawl under Settings -> robots.txt -> OPEN REPORT.

(Already tested, and it works.)

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /refer/
Disallow: /tag/
Disallow: /feed/
Disallow: /author/admin/

Allow: Sitemap: https://josuamarcelc.com/sitemap_index.xml
Allow: Sitemap: https://josuamarcelc.com/image_sitemap_20221010.xml
Allow: Sitemap: https://josuamarcelc.com/page-sitemap.xml
Allow: Sitemap: https://josuamarcelc.com/post-sitemap.xml
Allow: Sitemap: https://josuamarcelc.com/web-story-sitemap.xml
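
After updating, it is worth confirming that the live robots.txt really serves the new lines and that each listed sitemap still returns HTTP 200. A quick check (the URLs match the example above):

# Confirm the live robots.txt contains the updated sitemap lines.
curl -s https://josuamarcelc.com/robots.txt

# Confirm a listed sitemap is still reachable (expect 200).
curl -s -o /dev/null -w '%{http_code}\n' https://josuamarcelc.com/sitemap_index.xml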

Reference (how to fix robots.txt): https://support.google.com/webmasters/thread/246133059/error-invalid-sitemap-url-detected-syntax-not-understood-line-3?hl=en

Bluescreen wdiwifi.sys on Z490 Aorus Master Motherboard

Have you ever encountered the dreaded Blue Screen of Death (BSoD) on your Windows computer? If so, you may have come across an error related to the wdiwifi.sys file. This article will delve into what exactly the wdiwifi.sys BSoD is, its causes, and most importantly, how to fix it.

Understanding the wdiwifi.sys BSoD Error

The wdiwifi.sys file is a system driver associated with the Wi-Fi Direct feature in Windows operating systems. It allows users to establish direct wireless connections between devices without the need for a traditional Wi-Fi network. However, sometimes this driver can cause issues, leading to the infamous BSoD.

When the wdiwifi.sys file encounters an error or becomes corrupted, it triggers a BSoD, which is a critical system error that forces your computer to restart. The BSoD screen typically displays an error message along with a stop code, such as “SYSTEM_THREAD_EXCEPTION_NOT_HANDLED” or “DRIVER_IRQL_NOT_LESS_OR_EQUAL.”

Causes of wdiwifi.sys BSoD

Several factors can contribute to the occurrence of the wdiwifi.sys BSoD error. Here are some common causes:

  • Outdated or incompatible device drivers: If your Wi-Fi adapter’s driver is outdated or incompatible with your operating system, it can lead to conflicts and trigger the BSoD.
  • Malware or virus infections: Malicious software can corrupt system files, including the wdiwifi.sys driver, causing system instability and crashes.
  • Hardware issues: Faulty hardware components, such as a malfunctioning Wi-Fi adapter or incompatible RAM, can also result in the wdiwifi.sys BSoD error.
  • Software conflicts: Conflicts between different software applications or incompatible software versions can cause system errors, including the wdiwifi.sys BSoD.

Download the updated Wi-Fi driver from the Gigabyte support page below, and all is well:

https://www.gigabyte.com/Motherboard/Z490-AORUS-MASTER-rev-1x/support#support-dl-driver-wlanbt

Summary

The wdiwifi.sys BSoD error can be frustrating and disruptive to your computer usage. In this article, we explored what the wdiwifi.sys BSoD error is, its causes, and how to fix it. Remember to update your device drivers, scan for malware, and check your hardware components for any issues. By following these steps, you can resolve the wdiwifi.sys BSoD error and ensure a stable and reliable computing experience.

greedisgood Greed Is Good | Warcraft III Cheats | Gordon Gekko

Greed, for lack of a better word, is good. Greed is right, greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit. Greed, in all of its forms; greed for life, for money, for love, knowledge has marked the upward surge of mankind. And greed, you mark my words, will not only save Teldar Paper, but that other malfunctioning corporation called the USA. Thank you very much.

Warcraft III cheats

To use a cheat code, press the [enter] key, type in the code, and press enter again. The message “Cheat Enabled!” should appear. These codes only work in single-player missions and custom maps. These codes are NOT case-sensitive.
  • TenthLevelTaurenChieftain – Plays “Power of the Horde” by Tenth Level Tauren Chieftain (L70ETC)
  • WarpTen – Speeds construction of buildings and units
  • IocainePowder – Fast death/decay
  • WhosYourDaddy – Makes you and your units invincible with one-hit kills
  • KeyserSoze [amount] – Gives you X gold
  • LeafItToMe [amount] – Gives you X lumber
  • GreedIsGood [amount] – Gives you X gold and lumber
  • PointBreak – Removes food limit
  • ThereIsNoSpoon – Unlimited mana
  • StrengthAndHonor – Continue playing after defeat in campaign mode
  • Motherland [race] [level] – Level jump
  • SomebodySetUsUpTheBomb – Instant defeat
  • AllYourBaseAreBelongToUs – Instant victory
  • ItVexesMe – Can’t win
  • WhoIsJohnGalt – Enable research
  • SharpAndShiny – Research upgrades
  • ISeeDeadPeople – Remove fog of war
  • Synergy – Disable tech tree requirements
  • RiseAndShine – Set time of day to dawn
  • LightsOut – Set time of day to dusk
  • DaylightSavings [time] – If a time is specified, the time of day is set to that; otherwise the time of day is halted/resumed
  • TheDudeAbides – Resets all cooldowns

Remove Apache2 from Ubuntu Completely

A very simple and straightforward way that has worked for me since 2015 is as follows:

sudo service apache2 stop
sudo apt-get purge apache2* -y
sudo apt-get autoremove -y
sudo rm -rf /usr/sbin/apache2 /usr/lib/apache2 /usr/share/apache2 /usr/share/man/man8/apache*
whereis apache2

Explanation:

  • sudo service apache2 stop
    • Stops the Apache2 service.
  • sudo apt-get purge apache2* -y
    • Uninstalls the Apache2 packages together with their configuration files.
  • sudo apt-get autoremove -y
    • Cleans up packages with apt (Advanced Package Tool) that were installed as dependencies and are no longer needed.
  • sudo rm -rf /usr/sbin/apache2 /usr/lib/apache2 /usr/share/apache2 /usr/share/man/man8/apache*
    • Removes the folders that were created for Apache2.
  • whereis apache2
    • Finds any remaining Apache2 files; if the result is just apache2:, there is no more Apache2 on your Ubuntu server.
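
As a final check, you can confirm that nothing is still listening on the web ports Apache2 used. A small sketch using ss (part of iproute2 on Ubuntu); expect no output once Apache2 is fully removed:

# List TCP listeners on ports 80 and 443.
sudo ss -ltnp | grep -E ':(80|443)\b'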

What Do Caonima and Chaonima Mean?

Caonima means Grass Mud Horse (Hanzi: 草泥马; Pinyin: cǎo ní mǎ), a Chinese Internet meme and kuso parody built on a pun on the Mandarin profanity cào nǐ mā (肏你妈), which literally means “fuck your mother.” Source: Wikipedia.

Source: www.instagram.com/p/CyOB2WkR1oi/

Keep in mind that cào nǐ mā carries a negative meaning: it is considered vulgar, rude, and impolite because it is a swear word.

It is better to learn more polite phrases, such as:

  • 对不起 (duìbùqǐ): Sorry
  • 你好 (nǐ hǎo): Hello
  • 谢谢 (xièxiè): Thank you
  • 大家好 (dàjiā hǎo): Hello, everyone
  • 再见 (zàijiàn): See you again

Some other Chinese Internet slang:

  • 笑死我了 (xiào sǐ wǒle), or XSWL, is the equivalent of the English ROFL (rolling on the floor laughing);
  • 装熟 (zhuāng shú) is the equivalent of the Indonesian slang SKSD: acting like you know someone better than you really do;
  • 鸡婆 (jī pó) means being nosy, always wanting to know other people’s business;
  • 夸张 (kuā zhāng) means over the top, exaggerated;
  • 吃土 (chī tǔ) means being broke or short of money;
  • 吐槽 (tù cáo) means calling out other people’s weaknesses;
  • 小鲜肉 (xiǎo xiān ròu) literally means “little fresh meat”; the slang is used for popular K-pop stars, or for ordinary guys roughly 12 to 25 years old with cute, handsome faces.

Set Up OpenVPN Server In 3 Minutes

# Refresh package lists and upgrade installed packages
sudo apt update
sudo apt upgrade

# Check your network interfaces and the server's IP address
ip a
ip a show eth0

# Find your public IP address (two alternatives)
dig +short myip.opendns.com @resolver1.opendns.com
dig TXT +short o-o.myaddr.l.google.com @ns1.google.com | awk -F'"' '{ print $2}'

# Download the OpenVPN install script and make it executable
wget https://git.io/vpn -O openvpn-install.sh
chmod +x openvpn-install.sh

# Run the installer and answer the interactive prompts
sudo ./openvpn-install.sh

[Screenshot: sample session from AWS/Lightsail, where the cloud server is behind NAT]

[Screenshot: sample session from a Linode/DigitalOcean server with a direct public IP]

To avoid problems, always choose 1.1.1.1 or Google DNS when the installer asks which DNS to use. Those are fast DNS servers reachable from anywhere on the Internet.
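
When the script finishes, it writes a client profile (a .ovpn file) that you can copy to your device to connect. A minimal sketch, assuming the script saved client.ovpn in your home directory (the file name and path depend on the answers you give the installer):

# From your local machine: copy the client profile off the server.
scp user@your-server-ip:~/client.ovpn .

# Connect with the OpenVPN client; press Ctrl+C to disconnect.
sudo openvpn --config client.ovpn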
