

by srakute

Peace be upon you. This is a video about web scraping; I'll leave a link below to a page where you can read more about it. I'll use Amazon as an example and show how we can get data about a product, for example its price, without even going to the website. This can be used for different reasons: maybe you are watching an item and waiting for its price to drop, and you want to write software that checks it for you.

Let's take an example. We'll pick a product, a mouse: the name and the manufacturer of the mouse appear here, and the price appears here too. We could also extract the pictures if we wanted to, but we're not going to do that now. To do all this I'll use Python. Before we begin, we need to know how we are going to

connect our software to this web page. We're going to use the requests library, which is free to install and use in Python. What this library does is make a connection between the program you are writing and the page you want to scrape data from: it fetches the files, the HTML or XML code on that page, and gives everything back for you to use. However, we're also going to use a second library that will enable us to search the text that requests brings back from the website. Beautiful Soup is a library that gives you ways to navigate, search, and modify the parse tree. For example, this page contains the name of the item, so we have to look for the name, find where it sits on the page, and bring it back; likewise we need to search for the price and get it back. We'll do both using Beautiful Soup.

The first thing we have to write is the import of the requests library. Before we continue: if you don't have it, you have to install it by typing pip install requests in the terminal. Mine is already installed, so we don't have to run that now. The second library we have to import, as we said earlier, is Beautiful Soup, and that one is actually a class in a package called bs4, so from bs4 we import BeautifulSoup.
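As a sketch, the setup described above looks like this; note that Beautiful Soup is published on PyPI as beautifulsoup4, even though it is imported as bs4:

```python
# Run once in the terminal (not inside the script):
#   pip install requests beautifulsoup4

import requests                 # fetches pages over HTTP
from bs4 import BeautifulSoup   # parses and searches the returned HTML
```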

So now, since we have the libraries ready, we can go ahead and use them. First of all I'll write the URL, the address we have to connect to, which is the page of the product on Amazon, so I'll copy it into the code. Now that we have the URL we're going to connect to, I'll ask requests to go ahead and make the connection. Because I want to get data from the page, I'll use the get method, and between the brackets I'll put the URL, which is the value above. Since I want to keep that response, I'll store it in a variable called response. Actually, let's try it before we go further. I'll click Run: it says requests is not defined. Of course it's not defined; I misspelled it. Let's fix that and run again: no errors, so it worked. It made a connection and saved the result in a variable called response.
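A minimal sketch of this step, with a placeholder standing in for the real product URL:

```python
import requests

# Placeholder -- substitute the product page address you copied from Amazon.
URL = "https://www.amazon.com/dp/EXAMPLE"

def fetch(url):
    """Connect to the page with a GET request and return the Response object."""
    return requests.get(url)

# Uncomment to try it (requires network access):
# response = fetch(URL)
# print(response)
```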

Let's print it and see what it gave us. It prints the response object, and it shows an error: 503. If we go and search for that code, it just says Service Unavailable, without going into any further details, but we need to know why. So I'll print the text of the response, to see what it actually has inside. Run it again, and this is what I get. All right, we have actually made a connection to the website.
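To illustrate what inspecting such a response looks like without hitting the network, here is a hand-built stand-in for the blocked response; a real one comes back from requests.get, and the body text here is paraphrased rather than Amazon's exact HTML:

```python
import requests

# A hand-built stand-in for the blocked response, for illustration only;
# requests.get() would return a real one. The body is paraphrased.
response = requests.models.Response()
response.status_code = 503
response._content = (b"<html>Our servers are getting hit pretty hard right now. "
                     b"To continue, please type the characters below.</html>")

print(response)              # the repr shows the status code
print(response.status_code)  # 503 means Service Unavailable
print(response.text)         # the HTML body explains why we were blocked
```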

We can see we got a response; look at it, and it's an HTML file. If we go ahead and read it, we find that a connection was indeed made, but the problem is that the Amazon servers are responding with this message: our servers are getting hit pretty hard right now; to continue, please type the characters shown in the image below. So basically this is a defense mechanism, the servers protecting themselves from being attacked by too many requests at the same time. So basically we

have to tell them that this is not an attack, but a browser trying to get information about the product. So we have to pass another parameter here, which is headers. Let's go ahead and search for Python requests headers; I think this result will do, so I'll copy this example. These headers are part of the HTTP request we are sending: they say we are using a browser, one of these, whichever it is, so don't worry about the exact value, we are fine. Here I will write the parameter headers equals this value. Now let's try again and click Run. We still have the print of response.text there, so we can see what we got: a lot of text. This means we got the page, and the final proof is the content we can see here. Now, there is a lot of text, and you are not going to search through it line by line, so we'll use BeautifulSoup to search for the element that contains the name of the product, and then the price. Let's leave the print for now and create a variable called soup; inside it we'll store a soup made from the response we already have, or more precisely from the text of that response.
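As a sketch of these two steps, assuming a made-up browser-style User-Agent string and a placeholder URL (the value from any current browser works for the header):

```python
import requests
from bs4 import BeautifulSoup

# Example browser-style headers; the exact User-Agent string is only an
# illustration copied from a desktop browser.
HEADERS = {
    "User-Agent": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/120.0.0.0 Safari/537.36"),
    "Accept-Language": "en-US,en;q=0.9",
}

def fetch_and_parse(url):
    # headers= makes the request look like it came from a browser.
    response = requests.get(url, headers=HEADERS)
    # Parse the raw HTML text of the response into a searchable tree.
    return BeautifulSoup(response.text, "html.parser")

# soup = fetch_and_parse("https://www.amazon.com/dp/EXAMPLE")  # placeholder URL
```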

Now, inside that soup, I need to find the place that contains the name of the product. I'll go back to the product page: if you're using Chrome, or whatever browser it is, most of them nowadays have a tool called Inspect. If you right-click on the name of the product and select Inspect, it takes you directly to the place in the code that holds the product name. You can see it's a tag called span, and this span has an ID of productTitle. Let's go back to the code and try to find that. I'll create a variable called product, and basically we want to search the soup, so I write soup.find_all, and what I want to find is a span that has that ID. I'll copy it so I don't make typos: it's productTitle. Let's see what it returns: I'll print the product, and here we go, we find a result that actually matches what we saw on the page, a span that holds the name of the product, with the ID productTitle. Now, I don't want the whole span, just the part that contains the text. As you can see, the result is a list, so to enter the list I'll select the first item, at index zero, since counting starts from zero, and then I want the text inside that item. Print and run again, and here we go: we got the name only, without anything else, which is great. You can do one thing more and call strip to remove any spaces around it. There we go: now it's just the text, without any spaces before or after it. So that's the first piece of data, the product name.
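The extraction just described can be demonstrated on a tiny stand-in for the page. The span ID productTitle is the one shown in the video's inspector; the sample product text here is made up:

```python
from bs4 import BeautifulSoup

# Tiny stand-in for the product page; the real soup is built from the full
# response text. "productTitle" is the span ID seen in the inspector.
html = '<span id="productTitle">  Logitech Wireless Mouse  </span>'
soup = BeautifulSoup(html, "html.parser")

matches = soup.find_all("span", id="productTitle")  # find_all returns a list
product_name = matches[0].text                      # index 0 = first match
print(product_name)          # still padded with surrounding spaces
print(product_name.strip())  # strip() removes the spaces around the text
```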

We get the product price the same way. I'll create a variable called price, which again equals a search inside the soup with find_all. And what are we going to find? There's the mouse cursor: you can click for inspection, it's the same tool, and you can see it highlights the matching item in the code wherever you hover. We want the price, so we click on the price, and it takes us to the element that contains the price of the product inside the HTML code. It's basically the same kind of element, a span, but now the ID is different: it's the deal-price ID, since the product is on discount. So we go back to the code and copy the previous line; everything is the same, but instead of productTitle I'll copy the price ID from the page and paste it in its place. Let's print it along with the first variable, labelled as the price, and run the code. There we go: we have the name of the product and the price. And that is how you make a request using Python.
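Putting the whole walkthrough together, here is a sketch of the complete script. The URL is a placeholder, and the deal-price ID used below (priceblock_dealprice) is an assumption based on the discounted-item span the video inspects; check the inspector on your own page, since these IDs change:

```python
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0"}    # minimal browser-style header
URL = "https://www.amazon.com/dp/EXAMPLE"  # placeholder product URL

def scrape_product(url):
    """Fetch a product page and return its (name, price) as strings."""
    response = requests.get(url, headers=HEADERS)
    soup = BeautifulSoup(response.text, "html.parser")
    name = soup.find_all("span", id="productTitle")[0].text.strip()
    # "priceblock_dealprice" is an assumed ID for a discounted item;
    # inspect the page yourself, as other items may use a different ID.
    price = soup.find_all("span", id="priceblock_dealprice")[0].text.strip()
    return name, price

# Uncomment to run against a real page (requires network access):
# name, price = scrape_product(URL)
# print(name, price)
```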

