You are loading the model in each call to sentiment_analysis, and loading the model takes a fair amount of time. I'd suggest taking the model load out of the function and passing the model in as an argument:

import en_core_web_lg
from spacytextblob.spacytextblob import SpacyTextBlob  # registers the "spacytextblob" component

def sentiment_analysis(nlp, text):
    doc = nlp(text)
    
    # get the sentiment score
    sentiment_score = doc._.blob.polarity
    # get the sentiment label
    if sentiment_score > 0.4:
        sentiment_label = "Positive"
    elif sentiment_score < 0:
        sentiment_label = "Negative"
    else:
        sentiment_label = "Neutral"
    return sentiment_score, sentiment_label

# load the pipeline once, outside the function, and reuse it for every call
nlp = en_core_web_lg.load()
nlp.add_pipe("spacytextblob")

sample_data['sentiment_score'], sample_data…
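
The last line above is truncated. As a rough sketch of one way to apply the refactored function, assuming sample_data is a pandas DataFrame with a 'text' column (the column name and the sample rows are assumptions for illustration, not from the original post):

import pandas as pd

# hypothetical data; the real sample_data comes from the original question
sample_data = pd.DataFrame({'text': ["I love this!", "This is terrible.", "It's fine."]})

# apply the function once per row, reusing the already-loaded nlp pipeline
sample_data['sentiment_score'], sample_data['sentiment_label'] = zip(
    *sample_data['text'].apply(lambda text: sentiment_analysis(nlp, text))
)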
