Want to scrape data with the Bright Data API? Below are code examples for popular programming languages that show how to use the Bright Data proxy API.
Luminati proxy API
For browser or software
Use the following settings in your bot, crawler, or other software:
- Proxy: zproxy.lum-superproxy.io
- Port: 22225
- User: lum-customer-USERNAME-zone-ZONE
- Password: PASSWORD
Shell
curl --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-USERNAME-zone-ZONE:PASSWORD "http://lumtest.com/myip.json"
Node.js
#!/usr/bin/env node
require('request-promise')({
    url: 'http://lumtest.com/myip.json',
    proxy: 'http://lum-customer-CUSTOMER-zone-YOURZONE:[email protected]:22225',
})
.then(function(data){ console.log(data); },
    function(err){ console.error(err); });
Java
package example;
import org.apache.http.HttpHost;
import org.apache.http.client.fluent.*;
public class Example {
    public static void main(String[] args) throws Exception {
        System.out.println("To enable your free eval account and get "
            +"CUSTOMER, YOURZONE and YOURPASS, please contact "
            +"[email protected]");
        HttpHost proxy = new HttpHost("zproxy.lum-superproxy.io", 22225);
        String res = Executor.newInstance()
            .auth(proxy, "lum-customer-CUSTOMER-zone-YOURZONE", "YOURPASS")
            .execute(Request.Get("http://lumtest.com/myip.json").viaProxy(proxy))
            .returnContent().asString();
        System.out.println(res);
    }
}
C#
using System;
using System.Net;
class Example {
    static void Main() {
        Console.WriteLine("To enable your free eval account and get CUSTOMER, "
            +"YOURZONE and YOURPASS, please contact [email protected]");
        var client = new WebClient();
        client.Proxy = new WebProxy("zproxy.lum-superproxy.io:22225");
        client.Proxy.Credentials = new NetworkCredential(
            "lum-customer-CUSTOMER-zone-YOURZONE", "YOURPASS");
        Console.WriteLine(client.DownloadString("http://lumtest.com/myip.json"));
    }
}
VB
Imports System.Net
Module Module1
    Sub Main()
        Console.WriteLine("To enable your free eval account and get " _
            & "CUSTOMER, YOURZONE and YOURPASS, please contact " _
            & "[email protected]")
        Dim Client As New WebClient
        Client.Proxy = New WebProxy("http://zproxy.lum-superproxy.io:22225")
        Client.Proxy.Credentials = New NetworkCredential( _
            "lum-customer-CUSTOMER-zone-YOURZONE", "YOURPASS")
        Console.WriteLine(Client.DownloadString("http://lumtest.com/myip.json"))
    End Sub
End Module
PHP
<?php
echo 'To enable your free eval account and get CUSTOMER, YOURZONE and '
    .'YOURPASS, please contact [email protected]';
$curl = curl_init('http://lumtest.com/myip.json');
curl_setopt($curl, CURLOPT_PROXY, 'http://zproxy.lum-superproxy.io:22225');
curl_setopt($curl, CURLOPT_PROXYUSERPWD,
    'lum-customer-CUSTOMER-zone-YOURZONE:YOURPASS');
curl_exec($curl);
?>
Python
#!/usr/bin/env python
print('If you get error "ImportError: No module named \'six\'" install six:\n'
    + '$ sudo pip install six')
print('To enable your free eval account and get CUSTOMER, YOURZONE and '
    + 'YOURPASS, please contact [email protected]')
import sys
if sys.version_info[0] == 2:
    import six
    from six.moves.urllib import request
    opener = request.build_opener(
        request.ProxyHandler({
            'http': 'http://lum-customer-CUSTOMER-zone-YOURZONE:[email protected]:22225',
            'https': 'http://lum-customer-CUSTOMER-zone-YOURZONE:[email protected]:22225'}))
    print(opener.open('http://lumtest.com/myip.json').read())
if sys.version_info[0] == 3:
    import urllib.request
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({
            'http': 'http://lum-customer-CUSTOMER-zone-YOURZONE:[email protected]:22225',
            'https': 'http://lum-customer-CUSTOMER-zone-YOURZONE:[email protected]:22225'}))
    print(opener.open('http://lumtest.com/myip.json').read())
Ruby
#!/usr/bin/ruby
require 'uri'
require 'net/http'
require 'net/https'
puts 'To enable your free eval account and get CUSTOMER, YOURZONE and YOURPASS, please contact [email protected]'
uri = URI.parse('http://lumtest.com/myip.json')
proxy = Net::HTTP::Proxy('zproxy.lum-superproxy.io', 22225,
    'lum-customer-CUSTOMER-zone-YOURZONE', 'YOURPASS')
req = Net::HTTP::Get.new(uri.path)
result = proxy.start(uri.host, uri.port,
    :use_ssl => uri.scheme == 'https') do |http|
  http.request(req)
end
puts result.body
Perl
#!/usr/bin/perl
print 'To enable your free eval account and get CUSTOMER, YOURZONE and '
    .'YOURPASS, please contact [email protected]';
use LWP::UserAgent;
my $agent = LWP::UserAgent->new();
$agent->proxy(['http', 'https'],
    "http://lum-customer-CUSTOMER-zone-YOURZONE:YOURPASS\@zproxy.lum-superproxy.io:22225");
print $agent->get('http://lumtest.com/myip.json')->content();
Search Engine Crawler API
Google Search Crawler API
Search
Localization
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&gl=us"
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&hl=en"
Type of search
tbm=isch – images
tbm=shop – shopping
tbm=nws – news
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&tbm=shop"
ibp=htl;jobs – Jobs
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&ibp=htl%3Bjobs"
Pagination
start=0 (default) – first page of results
start=10 – second page of results
start=20 – third page of results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&start=10"
num=10 (default) returns 10 results
num=30 returns 30 results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&num=100"
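The start offset can be derived from the page number. A small sketch of that arithmetic, assuming the default of 10 results per page:

```shell
# Sketch: compute the start= offset for a given page of results.
page_to_start() {
  local page=$1 per_page=${2:-10}
  echo $(( (page - 1) * per_page ))
}

page_to_start 1   # -> 0
page_to_start 2   # -> 10
page_to_start 3   # -> 20
```

With a custom num value, pass it as the second argument, e.g. `page_to_start 5 20` gives 80.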
Geographic Location
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&uule=w+CAIQICINVW5pdGVkK1N0YXRlcw"
Device and output format
Default or lum_mobile=0 provides a random desktop user-agent, while lum_mobile=1 provides a random mobile user-agent.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&lum_mobile=1"
lum_json=1 – return results in JSON
lum_json=html – return JSON with "html" field containing raw HTML
lum_json=hotel – make an additional request to retrieve hotel prices
lum_json=hotel,html – two values can be combined, separated by a comma
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/search?q=pizza&lum_json=1"
Search by image
default – download-and-post if Google isn't able to download the image
download=1 – force download-and-post image
download=0 – regular GET request with image URL
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/searchbyimage?image_url=https%3A%2F%2Flive.staticflickr.com%2F213%2F491726079_4f46636859_w.jpg&download=1"
Multiple requests
multi=[{"keyword":"pizza","num":20},{"keyword":"pizza","num":100}] – the same keyword with different num param
multi=[{"keyword":"pizza"},{"keyword":"burger"}] – different keywords
curl -v --compressed "https://brightdata.com/api/serp/req" -H "Content-Type: application/json" -H "Authorization: Bearer API_TOKEN" -d "{\"country\":\"us\",\"multi\":[{\"keyword\":\"pizza\",\"num\":20},{\"keyword\":\"pizza\",\"num\":100}]}"
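The multi payload is plain JSON, so it can be assembled in the shell before being passed to curl's -d flag. A sketch with illustrative keywords (no real request is made here):

```shell
# Sketch: build a multi payload for two keyword/num pairs.
multi_payload() {
  printf '{"country":"us","multi":[{"keyword":"%s","num":%d},{"keyword":"%s","num":%d}]}' \
    "$1" "$2" "$3" "$4"
}

multi_payload pizza 20 burger 100
```

The resulting string can then be passed directly as the curl -d body shown above.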
Asynchronous requests
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/req?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"q\":\"pizza\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
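In practice the result may not be ready on the first call, so the fetch is usually retried. A hedged sketch of that polling loop: `fetch_result` below is a stand-in for the get_result curl above, passed in as a command so the retry logic can be exercised without network access.

```shell
# Sketch: retry a fetch command until it produces output or tries run out.
poll_result() {
  local fetch_cmd=$1 max_tries=${2:-5} out
  for _ in $(seq "$max_tries"); do
    out=$("$fetch_cmd") && [ -n "$out" ] && { echo "$out"; return 0; }
    # in production, sleep a few seconds between polls
  done
  return 1
}
```

In a real script the fetch command would wrap the curl call, and a sleep between attempts avoids hammering the endpoint.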
You can adjust your zone's async settings on the Zones page.
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
Parsing schema
GET /api/serp/google/parsing_schema
To get schema, perform:
curl --compressed "https://brightdata.com/api/serp/google/parsing_schema" -H "Authorization: Bearer API_TOKEN"
Maps
Localization
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/maps/search/restaurants+new+york/?gl=us"
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/maps/search/restaurants+new+york/?hl=en"
Pagination
start=0 (default) – first page of results
start=20 – second page of results
start=40 – third page of results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/maps/search/restaurants+new+york/?start=20"
num=40 (default) returns 40 results
num=50 returns 50 results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/maps/search/restaurants+new+york/?num=40"
Output format
lum_json=1 – return results in JSON
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/maps/search/restaurants+new+york/?lum_json=1"
Trends
Geo
geo – location of interest, two-letter country code
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://trends.google.com/trends/explore?q=pizza&geo=us"
Localization
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://trends.google.com/trends/explore?q=pizza&hl=de"
Time range
now 1-H – Past hour
now 4-H – Past 4 hours
now 1-d – Past day
now 7-d – Past 7 days
today 1-m – Past 30 days
today 3-m – Past 90 days
today 12-m – Past 12 months
today 5-y – Past 5 years
2020-07-01 2020-12-31 – custom date range
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://trends.google.com/trends/explore?q=pizza&date=now%201-d"
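The time-range values above contain a space, which must be percent-encoded before being placed in the query string (assuming they are passed via the Trends date query parameter). A small sketch of that step:

```shell
# Sketch: percent-encode the space in a Trends time-range value.
encode_date() { printf '%s' "$1" | sed 's/ /%20/g'; }

encode_date "now 1-d"     # -> now%201-d
encode_date "today 12-m"  # -> today%2012-m
```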
Category
cat – category to search within. You can find a list of all categories here.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://trends.google.com/trends/explore?q=pizza&cat=CATEGORY_ID"
Search type
gprop – type of search. Possible values are: images, news, froogle, youtube
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://trends.google.com/trends/explore?q=pizza&gprop=news"
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/trends?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"q\":\"pizza\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&output=json&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
You can adjust your zone's async settings on the Zones page.
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
Reviews
The fid parameter can be found in the knowledge.fid field of a Google search response.
For example: http://www.google.com/search?q=hilton%20new%20york%20midtown
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/reviews?fid=0x808fba02425dad8f%3A0x6c296c66619367e0"
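When the search request is made with lum_json=1, the fid can be pulled out of the parsed response. A sketch against a trimmed-down sample document (real responses contain many more fields):

```shell
# Sketch: extract knowledge.fid from a (truncated) JSON search response.
sample='{"knowledge":{"fid":"0x808fba02425dad8f:0x6c296c66619367e0"}}'
fid=$(printf '%s' "$sample" | sed -n 's/.*"fid":"\([^"]*\)".*/\1/p')
echo "$fid"   # -> 0x808fba02425dad8f:0x6c296c66619367e0
```

Note the colon in the fid must be percent-encoded as %3A when it is placed in the request URL, as in the examples here.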
Localization
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/reviews?hl=de&fid=0x808fba02425dad8f%3A0x6c296c66619367e0"
Sorting and filtering
sort=newestFirst – newest first
sort=ratingHigh – highest rating first
sort=ratingLow – lowest rating first
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/reviews?fid=0x808fba02425dad8f%3A0x6c296c66619367e0&sort=newestFirst"
filter=awesome – search for reviews containing the word 'awesome'
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/reviews?filter=awesome&fid=0x808fba02425dad8f%3A0x6c296c66619367e0"
Pagination
start=0 (default) – first page of results
start=10 – second page of results
start=20 – third page of results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.google.com/reviews?start=10&fid=0x808fba02425dad8f%3A0x6c296c66619367e0"
Asynchronous requests
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/reviews?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"fid\":\"0x808fba02425dad8f:0x6c296c66619367e0\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
You can adjust your zone's async settings on the Zones page.
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
Bing Search Crawler API
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&cc=us"
Geographic Location
The lat and lon parameters must be specified as well.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&location=New+York%2C+New+York%2C+United+States&lat=40.7001958&lon=-74.1087142"
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&lat=40.7001958&lon=-74.1087142"
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&mkt=en-US"
Pagination
first=1 (default) – first page of results
first=11 – second page of results
first=21 – third page of results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&first=11"
count=10 (default) returns 10 results
count=30 returns 30 results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&count=30"
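Unlike Google's zero-based start, Bing's first is one-based, so the offset for a page follows from the page number and the page size. A sketch of that arithmetic, assuming the default count of 10:

```shell
# Sketch: compute Bing's first= offset for a given page of results.
page_to_first() {
  local page=$1 count=${2:-10}
  echo $(( (page - 1) * count + 1 ))
}

page_to_first 1   # -> 1
page_to_first 2   # -> 11
page_to_first 3   # -> 21
```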
Device and output format
Default or lum_mobile=0 provides a random desktop user-agent, while lum_mobile=1 provides a random mobile user-agent.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&lum_mobile=1"
lum_json=1 – return results in JSON
lum_json=html – return JSON with “html” field containing raw HTML
lum_json=hotel – make additional request to retrieve hotel prices
lum_json=hotel,html – two values can be combined while separated by comma
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "http://www.bing.com/search?q=pizza&lum_json=1"
Asynchronous requests
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/bing/search?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"q\":\"pizza\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
Parsing schema
GET /api/serp/bing/parsing_schema
To get schema, perform:
curl --compressed "https://brightdata.com/api/serp/bing/parsing_schema" -H "Authorization: Bearer API_TOKEN"
Yandex Search Crawler API
Geographic location
lr – search region ID:
2 – Saint Petersburg
84 – USA
95 – Canada
134 – China
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&lr=84"
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&lang=en"
Pagination
p=1 (default) – first page of results
p=2 – second page of results
p=4 – fourth page of results, etc.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&p=1"
numdoc takes the values 10, 20, 30, and 50. Other values below 50 are rounded up to the nearest of these; values over 50 are capped at 50.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&numdoc=10"
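The rounding rule above can be made explicit. A sketch that normalizes an arbitrary value to the nearest accepted numdoc under the stated rule:

```shell
# Sketch: round a requested page size up to an accepted numdoc value.
normalize_numdoc() {
  local n=$1
  if   [ "$n" -le 10 ]; then echo 10
  elif [ "$n" -le 20 ]; then echo 20
  elif [ "$n" -le 30 ]; then echo 30
  else                       echo 50
  fi
}

normalize_numdoc 15  # -> 20
normalize_numdoc 45  # -> 50
normalize_numdoc 99  # -> 50
```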
Time range
2 – Past month
3 – Past 3 months
4 – Past 6 months
5 – Past year
6 – Past 2 years
7 – Past day
77 – Past 24 hours
8 – Past 3 days
9 – Past week
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&within=1"
from_date_full=12.12.2020 – date to start from
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&from_date_full=12.12.2020"
to_date_full=01.01.2021 – date to end search
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?text=pizza&to_date_full=01.01.2021"
Device and output format
Default or lum_mobile=0 provides a random desktop user-agent, while lum_mobile=1 provides a random mobile user-agent.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://www.yandex.com/search/?lum_mobile=1&text=pizza"
Asynchronous requests
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/yandex/search?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"text\":\"pizza\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
You can adjust your zone's async settings on the Zones page.
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
DuckDuckGo Search Crawler API
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://duckduckgo.com/?q=pizza&kl=us-en"
Safe search
1 – Turn on safe search
-2 – Turn off safe search
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://duckduckgo.com/?q=pizza&kp=1"
Time range
d – Past day
w – Past week
m – Past month
y – Past year
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://duckduckgo.com/?q=pizza&df=d"
Device and output format
Default or lum_mobile=0 provides a random desktop user-agent, while lum_mobile=1 provides a random mobile user-agent.
curl -v --compressed --proxy zproxy.lum-superproxy.io:22225 --proxy-user lum-customer-username-zone-ZONE:PASSWORD "https://duckduckgo.com/?q=pizza&lum_mobile=1"
Asynchronous requests
To initiate a request, perform:
RESPONSE_ID=`curl -i --silent --compressed "https://brightdata.com/api/serp/duckduckgo/search?customer=username&zone=ZONE" -H "Content-Type: application/json" -H "Authorization: Bearer <API_TOKEN>" -d "{\"query\":{\"q\":\"pizza\"},\"country\":\"us\"}" | sed -En 's/^x-response-id: (.*)/\1/p' | tr -d '\r'`
The `x-response-id` header contains the request ID; use it in the next request to fetch the result.
curl -v --compressed "https://brightdata.com/api/serp/get_result?customer=username&zone=ZONE&response_id=${RESPONSE_ID}" -H "Authorization: Bearer <API_TOKEN>"
You can adjust your zone's async settings on the Zones page.
Optional settings:
Result lifetime (days) – number of days to keep results
Web Hook URL – address to deliver results to
Web Hook Request Method – HTTP method used to deliver the response; GET and POST are supported
Resources
Last Updated on October 10, 2024