
StackOverflow Point

Alex Hales (Teacher)
Asked: June 3, 2022

python – Why does the SQLite function executemany run slowly when updating a large amount of data?


I’m new to SQLite and I ran into a problem while trying to update two columns of an SQLite database as part of a sentiment analysis of tweets using the TextBlob library. I have to update 6.5 million rows and I want to do it as efficiently as possible. I used the following code:

import sqlite3
from textblob import TextBlob

conn = sqlite3.connect('tweets.db')
c = conn.cursor()

english_query = """
    SELECT tweet_id, text
    FROM tweetInfo
    WHERE lang = 'en'
"""
c.execute(english_query)
e_tweets = c.fetchall()

data_list = []
for tweet_id, text in e_tweets:
    # Analyse each tweet once and reuse the result for both columns
    sentiment = TextBlob(text).sentiment
    data_list.append([sentiment.polarity, sentiment.subjectivity, tweet_id])

update_query = '''
    UPDATE tweetInfo
    SET polarity = ?, subjectivity = ?
    WHERE tweet_id = ?
'''
c.executemany(update_query, data_list)
conn.commit()

I used the executemany function because I read online that it is supposed to be fast at handling large amounts of data. The code seems to work fine, but it takes multiple hours to finish, so I’m wondering whether I did anything wrong with the executemany function or with the code in general. Does anyone have a solution for this?
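For reference, two speedups commonly suggested for bulk SQLite updates like this are to make sure the column in the WHERE clause is indexed (otherwise every UPDATE scans the whole table) and to commit in batches rather than holding all 6.5 million updates in one go. A minimal sketch — the table and column names come from the question above, while the function name, index name, and batch size are illustrative assumptions:

```python
import sqlite3

def update_in_batches(conn, rows, batch_size=10000):
    """Apply (polarity, subjectivity, tweet_id) updates in batches.

    `rows` is a list of (polarity, subjectivity, tweet_id) tuples,
    the same shape executemany expects for the UPDATE below.
    """
    c = conn.cursor()
    # Hypothetical index: without an index on tweet_id, each UPDATE
    # does a full table scan, which dominates the runtime at 6.5M rows.
    c.execute("CREATE INDEX IF NOT EXISTS idx_tweet_id ON tweetInfo(tweet_id)")
    update_query = """
        UPDATE tweetInfo
        SET polarity = ?, subjectivity = ?
        WHERE tweet_id = ?
    """
    for start in range(0, len(rows), batch_size):
        c.executemany(update_query, rows[start:start + batch_size])
        conn.commit()  # one commit per batch, not per statement
```

Whether the index or the batching matters more depends on the schema; if tweet_id is already the table's primary key, the lookups are indexed automatically and the batching is the remaining lever.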
