
Facebook API collect pages in certain country

By : user2949018
Date : November 15 2020, 06:54 AM
There's no endpoint to search for Pages in certain countries. Have a look at
https://developers.facebook.com/docs/graph-api/using-graph-api/v2.2#search
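For reference, the v2.2 search endpoint does let you look up Pages by keyword; it just has no country filter. Below is a minimal sketch using Python requests (the query term and the access token are placeholders you would replace with your own):
code :
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder: supply a valid app or user token

# Search Pages by keyword; note there is no country parameter on this endpoint.
resp = requests.get(
    "https://graph.facebook.com/v2.2/search",
    params={"q": "coffee", "type": "page", "access_token": ACCESS_TOKEN},
)
resp.raise_for_status()

# Each result carries at least an id and a name.
for page in resp.json().get("data", []):
    print(page["id"], page["name"])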
Facebook Like Box does not redirect users to Facebook pages as Facebook Like button does

By : Merthin Fitzgerald
Date : March 29 2020, 07:55 AM
The issue is that the edge.create event is not fired by the Like Box, so you will not be able to get that to work. I suggest you add your voice to the following bug report about this problem:
https://developers.facebook.com/bugs/310763168934515
save email, country, ..., full name from facebook to database with login with facebook

By : mfm2000
Date : March 29 2020, 07:55 AM
You are trying to log the user in even when the user is already connected; this is what creates the infinite loop.
code :
// Here we specify what we do with the response anytime this event occurs.
// The response object is returned with a status field that lets the app know
// the current login status of the person.
if (response.status === 'connected') {
  // Already logged in: query the API directly instead of calling FB.login
  // again, which is what caused the infinite loop.
  FB.api('/me', {fields: 'email'}, function(response) {
    window.location = "http://www.mywebsite.com/checkloginfb.php?email=" + response.email;
  });
} else {
  // Not connected yet: ask the user to log in and grant the email permission.
  FB.login(function(response) {
    // The login response exposes a status field (not "session"); check it here.
    if (response.status === 'connected') {
      FB.api('/me', {fields: 'email'}, function(response) {
        window.location = "http://www.mywebsite.com/checkloginfb.php?email=" + response.email;
      });
    }
  }, {scope: 'email'});
}
Collect data from facebook pages

By : user2848796
Date : March 29 2020, 07:55 AM
My question is: how do I get a list of the IDs of all Facebook Pages?
collect text from web pages of a given site

By : Eilif Johansen
Date : March 29 2020, 07:55 AM
You may want to read the BeautifulSoup documentation first, and learn to use your browser's developer tools to inspect network traffic.
Once done, you will see that you can get the list of houses with a GET request to http://www.apartmenttherapy.com/search?page=1&q=House+Tour&type=all
code :
import random
import time

import requests
from bs4 import BeautifulSoup

# here we build the list of all urls to scrape
url_list = []
max_index = 2

for page_index in range(1, max_index):

    # get the index page
    html = requests.get("http://www.apartmenttherapy.com/search?page=" + str(page_index) + "&q=House+Tour&type=all").content

    # iterate over the teasers
    for teaser in BeautifulSoup(html, "html.parser").findAll('a', {'class': 'SimpleTeaser'}):

        # add the link to the url list
        url_list.append(teaser['href'])

    # sleep a little to avoid overloading the server / to be smart
    time.sleep(random.random() / 2.)  # respect server-side load

    # here I break because it's just an example (scraping every index page is not required)
    break  # comment this break out in production


# here we show the list of urls
print(url_list)


# we iterate over the urls to get the advice
mylist = []
for url in url_list:

    # get the teaser page
    html = requests.get(url).content

    # find the "Best Advice" marker text
    hello = BeautifulSoup(html, "html.parser").find(text='Best Advice: ')

    # print which page the advice comes from
    print("advice for", url, "\n", "=>", end=" ")

    # try to add the text that follows the marker to mylist (the marker may be missing)
    try:
        mylist.append(hello.next_element)
    except AttributeError:
        pass

    # sleep a little to avoid overloading the server / to be smart
    time.sleep(random.random() / 2.)  # respect server-side load

# show the list of advice
print(mylist)
['http://www.apartmenttherapy.com/house-tour-a-charming-comfy-california-cottage-228229', 'http://www.apartmenttherapy.com/christinas-olmay-oh-my-house-tour-house-tour-191725', 'http://www.apartmenttherapy.com/house-tour-a-rustic-refined-ranch-house-227896', 'http://www.apartmenttherapy.com/caseys-grown-up-playhouse-house-tour-215962', 'http://www.apartmenttherapy.com/allison-and-lukes-comfortable-and-eclectic-apartment-house-tour-193440', 'http://www.apartmenttherapy.com/melissas-eclectic-austin-bungalow-house-tour-206846', 'http://www.apartmenttherapy.com/kates-house-tour-house-tour-197080', 'http://www.apartmenttherapy.com/house-tour-a-1940s-art-deco-apartment-in-australia-230294', 'http://www.apartmenttherapy.com/house-tour-an-art-filled-mid-city-new-orleans-house-227667', 'http://www.apartmenttherapy.com/jeremys-light-and-heavy-home-house-tour-201203', 'http://www.apartmenttherapy.com/mikes-cabinet-of-curiosities-house-tour-201878', 'http://www.apartmenttherapy.com/house-tour-a-family-dream-home-in-illinois-227791', 'http://www.apartmenttherapy.com/stephanies-greenwhich-gemhouse-96295', 'http://www.apartmenttherapy.com/masha-and-colins-worldly-abode-house-tour-203518', 'http://www.apartmenttherapy.com/tims-desert-light-box-house-tour-196764']
advice for http://www.apartmenttherapy.com/house-tour-a-charming-comfy-california-cottage-228229 
=> advice for http://www.apartmenttherapy.com/christinas-olmay-oh-my-house-tour-house-tour-191725 
=> advice for http://www.apartmenttherapy.com/house-tour-a-rustic-refined-ranch-house-227896 
=> advice for http://www.apartmenttherapy.com/caseys-grown-up-playhouse-house-tour-215962 
=> advice for http://www.apartmenttherapy.com/allison-and-lukes-comfortable-and-eclectic-apartment-house-tour-193440 
=> advice for http://www.apartmenttherapy.com/melissas-eclectic-austin-bungalow-house-tour-206846 
=> advice for http://www.apartmenttherapy.com/kates-house-tour-house-tour-197080 
=> advice for http://www.apartmenttherapy.com/house-tour-a-1940s-art-deco-apartment-in-australia-230294 
=> advice for http://www.apartmenttherapy.com/house-tour-an-art-filled-mid-city-new-orleans-house-227667 
=> advice for http://www.apartmenttherapy.com/jeremys-light-and-heavy-home-house-tour-201203 
=> advice for http://www.apartmenttherapy.com/mikes-cabinet-of-curiosities-house-tour-201878 
=> advice for http://www.apartmenttherapy.com/house-tour-a-family-dream-home-in-illinois-227791 
=> advice for http://www.apartmenttherapy.com/stephanies-greenwhich-gemhouse-96295 
=> advice for http://www.apartmenttherapy.com/masha-and-colins-worldly-abode-house-tour-203518 
=> advice for http://www.apartmenttherapy.com/tims-desert-light-box-house-tour-196764 
=> [u"If you make a bad design choice or purchase, don't be afraid to change it. Try and try again until you love it.\n\t", u" Sisal rugs. They clean up easily and they're very understated. Start with very light colors and add colors later.\n", u"Bring in what you love, add dimension and texture to your walls. Decorate as an individual and not to please your neighbor or the masses. Trends are fun but I love elements of timeless interiors. Include things from any/every decade as well as mixing styles. I'm convinced it's the hardest way to decorate without looking like you are living in a flea market stall. Scale, color, texture, and contrast are what I focus on. For me it takes some toying around, and I always consider how one item affects the next. Consider space and let things stand out by limiting what surrounds them.", u'You don\u2019t need to invest in \u201cdecor\u201d and nothing needs to match. Just decorate with the special things (books, cards, trinkets, jars, etc.) that you\u2019ve collected over the years, and be organized. I honestly think half the battle of having good home design is keeping a neat house. The other half is just displaying stuff that is special to you. Stuff that has a story and/or reminds you of people, ideas, and places that you love. One more piece of advice - the best place to buy picture frames is Goodwill. Pick a frame in decent condition, and just paint it to complement your palette. One last piece of advice\u2014 decor need not be pricey. I ALWAYS shop consignment and thrift, and then I repaint and customize as I see fit.\n', u'From my sister \u2014 to use the second bedroom as my room, as it is dark and quiet, both of which I need in order to sleep.\n', u'Collect things that you love in your travels throughout life. I tend to purchase ceramics when travelling, sometimes a collection of bowls\u2026 not so easy transporting in the suitcase, but no breakages yet!\n\t', u'Keep things authentic to the character of your home and to the character of your family. Then, you can never go wrong!\n\t', u'Contemporary architecture does not require contemporary furnishings.\n']
selenium and multiprocessing, collects duplicate pages and does not collect certain pages

By : user3404319
Date : March 29 2020, 07:55 AM
I have some issues with the code below: it collects duplicate pages and does not collect certain pages. In the example URL there are 19 pages of pagination; for instance, it collects comments from page 2, collects the same comments again while on page 3, and never collects the comments of page 3.
code :
ThreadPool(10).map(get_comments, [url])
# map calls the function once per element of the iterable, in parallel:
# function(args[0])  # possibly in the first executor
# function(args[1])  # possibly in the second executor
# ...
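To see what that means in practice, here is a minimal, self-contained sketch (the work function is made up for illustration): map schedules one call per element of the iterable, so a single-element list produces a single call no matter how many workers the pool has.
code :
from multiprocessing.pool import ThreadPool
import threading

def work(item):
    # hypothetical task: just report which thread handled the item
    return (item, threading.current_thread().name)

# one element -> one call, even with 10 workers available
print(ThreadPool(10).map(work, ["only-url"]))

# three elements -> up to three calls running in parallel
print(ThreadPool(3).map(work, ["page1", "page2", "page3"]))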
import math
from concurrent.futures import ThreadPoolExecutor

from bs4 import BeautifulSoup as bs
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait as wait


def print_comments(url, pages):
    # each worker drives its own browser over its own slice of page numbers
    browser = webdriver.Firefox(executable_path="geckodriver")
    browser.get(url)
    for page in pages:
        wait(browser, 10).until(EC.element_to_be_clickable((By.XPATH, "//span[text()='" + str(page) + "']"))).click()
        soup = bs(browser.page_source, "html.parser")
        comments = soup.findAll("div", class_="commentaire-card-left")
        for comment in comments:
            print(comment.find("p").text)
            print(comment.find("cite").text)
    browser.quit()


def print_all_comments(url):
    browser = webdriver.Firefox(executable_path="geckodriver")
    browser.get(url)
    soup = bs(browser.page_source, "html.parser")
    browser.quit()

    last_page = int(soup.findAll("span", class_="page")[-1].text)
    step = math.ceil(last_page / 3)
    with ThreadPoolExecutor(max_workers=3) as executor:
        for start_page in range(1, last_page + 1, step):
            # clip the last chunk so no worker asks for a page past the last one
            page_numbers = list(range(start_page, min(start_page + step, last_page + 1)))
            executor.submit(print_comments, url, page_numbers)


url = "https://www.mesopinions.com/petition/politique/stop-massacre-nos-artisans-annulez-redressement/74954/page14?commentaires-list=true"
print_all_comments(url)
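The key change: each worker gets its own browser instance and a disjoint, precomputed slice of page numbers, so no two workers click the same page and no page is skipped.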