Semi-Supervised Learning and Text Analysis
Machine Learning 10-701
November 29, 2005
Tom M. Mitchell
Carnegie Mellon University

Document Classification: Bag of Words Approach

  aardvark  0
  about     2
  all       2
  Africa    1
  apple     0
  anxious   0
  ...
  gas       1
  ...
  oil       1
  ...
  Zaire     0

For code, see www.cs.cmu.edu/~tom/mlbook.html (click on "Software and Data").

Supervised Training for Document Classification
• Common algorithms:
  – Logistic regression, Support Vector Machines, Bayesian classifiers
• Quite successful in practice
  – Email classification (spam, foldering, ...)
  – Web page classification (product description, publication, ...)
  – Intranet document organization
• Research directions:
  – More elaborate, domain-specific classification models (e.g., for email)
  – Using unlabeled data too → semi-supervised methods

EM for Semi-supervised Document Classification

Using Unlabeled Data to Help Train a Naïve Bayes Classifier
Naïve Bayes network: class Y with features X1, X2, X3, X4. Goal: learn P(Y|X) from data in which some examples are unlabeled (Y = ?):

  X4 X3 X2 X1 | Y
   1  0  1  0 | ?
   0  1  1  0 | ?
   0  1  0  0 | 0
   0  0  1  0 | 0
   1  1  0  0 | 1

From [Nigam et al., 2000]

E Step: use the current parameters to estimate class membership for each unlabeled document di:
  P(cj | di) ∝ P(cj) ∏_k P(w_{di,k} | cj)    (normalized over classes cj)

M Step: re-estimate the parameters from all documents, weighting each document by its estimated class membership; wt is the t-th word in the vocabulary, and n(wt, di) is its count in di (smoothed estimates, from [Nigam et al., 2000]):
  P(wt | cj) = (1 + Σ_i n(wt, di) P(cj | di)) / (|V| + Σ_s Σ_i n(ws, di) P(cj | di))
  P(cj) = (1 + Σ_i P(cj | di)) / (#classes + #documents)

Elaboration 1: downweight the influence of unlabeled examples by a factor λ.
New M step: the same as above, except each unlabeled document contributes its counts weighted by λ; λ is chosen by cross-validation.

Using one labeled example per class
[Figures: classification results on 20 Newsgroups]

EM for Semi-Supervised Doc Classification
• If all data is labeled, corresponds to the Naïve Bayes classifier
• If all data is unlabeled, corresponds to mixture-of-multinomial clustering
• If both labeled and unlabeled data are available, it helps if and only if the mixture-of-multinomial modeling assumption is correct
• Of course we could extend this to Bayes net models other than Naïve Bayes (e.g., a TAN tree)

Bags of Words, or Bags of Topics?

LDA: Generative Model for Documents [Blei, Ng, Jordan 2003]
Also extended to the case where the number of topics is not known in advance (hierarchical Dirichlet processes, [Blei et al., 2004]).

Clustering Words into Topics with Hierarchical Topic Models (unknown number of clusters) [Blei, Ng, Jordan 2003]

Probabilistic model for generating document D:
1. Pick a distribution over topics, P(z|θ), by drawing θ according to P(θ|α).
2.
For each word w:
   • Pick topic z from P(z | θ)
   • Pick word w from P(w | z, φ)

Training this model defines the topics (i.e., φ, which defines P(W|Z)).

Example topics induced from a large collection of text [Tenenbaum et al.]:
• STORY, STORIES, TELL, CHARACTER, CHARACTERS, AUTHOR, READ, TOLD, SETTING, TALES, PLOT, TELLING, SHORT, FICTION, ACTION, TRUE, EVENTS, TELLS, TALE, NOVEL
• MIND, WORLD, DREAM, DREAMS, THOUGHT, IMAGINATION, MOMENT, THOUGHTS, OWN, REAL, LIFE, IMAGINE, SENSE, CONSCIOUSNESS, STRANGE, FEELING, WHOLE, BEING, MIGHT, HOPE
• WATER, FISH, SEA, SWIM, SWIMMING, POOL, LIKE, SHELL, SHARK, TANK, SHELLS, SHARKS, DIVING, DOLPHINS, SWAM, LONG, SEAL, DIVE, DOLPHIN, UNDERWATER
• DISEASE, BACTERIA, DISEASES, GERMS, FEVER, CAUSE, CAUSED, SPREAD, VIRUSES, INFECTION, VIRUS, MICROORGANISMS, PERSON, INFECTIOUS, COMMON, CAUSING, SMALLPOX, BODY, INFECTIONS, CERTAIN
• FIELD, MAGNETIC, MAGNET, WIRE, NEEDLE, CURRENT, COIL, POLES, IRON, COMPASS, LINES, CORE, ELECTRIC, DIRECTION, FORCE, MAGNETS, BE, MAGNETISM, POLE, INDUCED
• SCIENCE, STUDY, SCIENTISTS, SCIENTIFIC, KNOWLEDGE, WORK, RESEARCH, CHEMISTRY, TECHNOLOGY, MANY, MATHEMATICS, BIOLOGY, FIELD, PHYSICS, LABORATORY, STUDIES, WORLD, SCIENTIST, STUDYING, SCIENCES
• BALL, GAME, TEAM, FOOTBALL, BASEBALL, PLAYERS, PLAY, FIELD, PLAYER, BASKETBALL, COACH, PLAYED, PLAYING, HIT, TENNIS, TEAMS, GAMES, SPORTS, BAT, TERRY
• JOB, WORK, JOBS, CAREER, EXPERIENCE, EMPLOYMENT, OPPORTUNITIES, WORKING, TRAINING, SKILLS, CAREERS, POSITIONS, FIND, POSITION, FIELD, OCCUPATIONS, REQUIRE, OPPORTUNITY, EARN, ABLE
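The two generative steps above are easy to simulate directly. The sketch below is illustrative only: the vocabulary, the number of topics, the Dirichlet parameter α, and the hand-set topic-word matrix φ are toy values chosen for this example, not anything from the lecture (in practice φ is learned, e.g. by Gibbs sampling or variational inference).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed values for illustration):
vocab = ["ball", "game", "team", "field", "magnet", "wire", "current", "pole"]
n_topics = 2
alpha = np.ones(n_topics)  # symmetric Dirichlet prior over topic proportions

# phi[z] = P(w | z): one multinomial over the vocabulary per topic
phi = np.array([
    [0.22, 0.22, 0.22, 0.22, 0.03, 0.03, 0.03, 0.03],  # a "sports" topic
    [0.03, 0.03, 0.03, 0.03, 0.22, 0.22, 0.22, 0.22],  # a "magnetism" topic
])

def generate_document(n_words):
    """Generate one document via the LDA generative process."""
    theta = rng.dirichlet(alpha)              # 1. pick topic proportions theta ~ P(theta|alpha)
    words = []
    for _ in range(n_words):                  # 2. for each word:
        z = rng.choice(n_topics, p=theta)     #    pick topic z from P(z|theta)
        w = rng.choice(len(vocab), p=phi[z])  #    pick word w from P(w|z, phi)
        words.append(vocab[w])
    return words

print(generate_document(10))
```

Documents drawn this way tend to be dominated by one topic or the other, because each document gets its own θ; fitting the model in reverse is what recovers φ, i.e., the topics.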
Significance:
• Learned topics reveal hidden, implicit semantic categories in the corpus
• In many cases, we can represent documents with ~10^2 topics instead of ~10^5 words
• Especially important for short documents (e.g., emails): topics overlap when words don't!

Can we analyze roles and relationships between people by analyzing email word or topic distributions?

Author-Recipient-Topic Model for Email
• Latent Dirichlet Allocation (LDA) [Blei, Ng, Jordan, 2003]
• Author-Recipient-Topic (ART) [McCallum, Corrada, Wang, 2004]

Enron Email Corpus
• 250k email messages
• 23k people

Date: Wed, 11 Apr 2001 06:56:00 -0700 (PDT)
From: [email protected]
To: [email protected]
Subject: Enron/TransAlta Contract dated Jan 1, 2001

Please see below. Katalin Kiss of TransAlta has requested an electronic copy of our final draft? Are you OK with this?
If so, the only version I have is the original draft without revisions.

DP

Debra Perlingiere
Enron North America Corp.
Legal Department
1400 Smith Street, EB 3885
Houston, Texas
[email protected]

Topics, and Prominent Sender/Receivers Discovered by ART [McCallum et al., 2004]
Top words within each topic; top author-recipient pairs exhibiting that topic.
• Beck = "Chief Operations Officer"
• Dasovich = "Government Relations Executive"
• Shapiro = "Vice President of Regulatory Affairs"
• Steffes = "Vice President of Government Affairs"

Discovering Role Similarity
connection strength(A, B) =
• Traditional SNA: similarity in the recipients they sent email to
• ART: similarity in authored topics, conditioned on recipient

Co-Training for Semi-supervised Document Classification
Idea: take advantage of *redundancy*.

Redundantly Sufficient Features
Example: "Professor Faloutsos" (words on a web page) vs. "my advisor" (words on hyperlinks pointing to that page): either feature set alone can suffice to classify the page.

Co-Training
Classifier1 → Answer1; Classifier2 → Answer2
Key idea: Classifier1 and Classifier2 must:
1. Correctly classify labeled examples
2. Agree on the classification of unlabeled examples

CoTraining Algorithm #1 [Blum & Mitchell, 1998]
Given: labeled data L, unlabeled data
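The co-training idea, two classifiers over redundant feature views that bootstrap each other on unlabeled data, can be sketched in code. Everything concrete below is an assumption made for illustration: the nearest-centroid stand-in classifiers, the synthetic two-view data, and the "pseudo-label the examples where the two views agree" selection rule (a simplification; the full Blum & Mitchell algorithm instead grows the labeled pool with each classifier's most confident predictions).

```python
import numpy as np

rng = np.random.default_rng(1)

class CentroidClassifier:
    """Minimal stand-in classifier: predict the class with the nearest centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        dists = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[dists.argmin(axis=1)]

def co_train(X1, X2, y, labeled_idx, n_rounds=5):
    """Co-training sketch over two redundant views X1, X2.

    Labels in y are read only at labeled_idx; every other example is
    treated as unlabeled (pseudo-label -1) until the two view-classifiers
    agree on it.
    """
    pseudo = np.full(len(y), -1)
    pseudo[labeled_idx] = y[labeled_idx]
    c1, c2 = CentroidClassifier(), CentroidClassifier()
    for _ in range(n_rounds):
        known = np.where(pseudo != -1)[0]
        c1.fit(X1[known], pseudo[known])        # view-1 classifier
        c2.fit(X2[known], pseudo[known])        # view-2 classifier
        unknown = np.where(pseudo == -1)[0]
        if len(unknown) == 0:
            break
        p1, p2 = c1.predict(X1[unknown]), c2.predict(X2[unknown])
        agree = p1 == p2
        pseudo[unknown[agree]] = p1[agree]      # grow the labeled pool where views agree
    return c1, c2, pseudo

# Synthetic 2-class data with two redundant views: each view alone separates
# the classes, mimicking "page words" vs. "hyperlink words".
n = 40
y = np.array([0] * (n // 2) + [1] * (n // 2))
X1 = rng.normal(loc=y[:, None] * 4.0, scale=0.5, size=(n, 2))  # view 1
X2 = rng.normal(loc=y[:, None] * 4.0, scale=0.5, size=(n, 2))  # view 2

# Start from just one labeled example per class, as in the 20 Newsgroups demo.
c1, c2, pseudo = co_train(X1, X2, y, labeled_idx=[0, n - 1])
print("pseudo-labeled:", int((pseudo != -1).sum()), "of", n)
```

The agreement rule is what enforces the slide's second requirement: an unlabeled example enters the training pool only when both views classify it the same way.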