UTD CS 4398 - Application Forensics

Digital Forensics
Dr. Bhavani Thuraisingham
The University of Texas at Dallas
Application Forensics
November 5, 2008

Outline
- Email Forensics
- UTD work on email worm detection - revisited
- Mobile System Forensics
- Note: other application/system forensics (database forensics, network forensics) have already been discussed
- Papers to discuss on November 10, 2008 and November 17, 2008
- Reference: Chapters 12 and 13 of the textbook
- Optional paper to read: http://www.mindswap.org/papers/Trust.pdf

Email Forensics
- Email investigations
- Client/server roles
- Email crimes and violations
- Email servers
- Email forensics tools

Email Investigations
- Types of email investigations
  - Emails carrying worms and viruses - suspicious emails
  - Checking emails as evidence in a crime, e.g., a homicide
- Types of suspicious emails
  - Phishing emails - they are in HTML format and redirect the victim to suspicious web sites
  - Nigerian scam emails
  - Spoofed emails

Client/Server Roles
- Email uses a client-server architecture
- The email server runs the email server program, e.g., Microsoft Exchange Server
- The email client runs the client program, e.g., Outlook
- Identification/authentication is used for the client to access the server
- Intranet/Internet email servers
  - Intranet: local environment
  - Internet: public, e.g., Yahoo, Hotmail

Email Crimes and Violations
- The goal is to determine who is behind the crime, i.e., who actually sent the email
- Steps in email forensics (a header-parsing sketch follows this list)
  - Examine the email message
  - Copy the email message (also forward the email)
  - View and examine the email header; tools are available for Outlook and other email clients
  - Examine additional files such as address books
  - Trace the message using various Internet tools
  - Examine network logs (netflow analysis)
- Note: the UTD netflow tools (SCRUB) are available on SourceForge
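
To make the header-examination and tracing steps concrete, here is a minimal Python sketch (illustrative, not from the slides) that prints the identity-related headers of a message and walks its Received chain. The inline sample message is a hypothetical placeholder for a raw .eml file exported from the client under examination.

    from email import policy
    from email.parser import BytesParser

    # Hypothetical raw message; in practice, read the bytes of a .eml
    # file exported from the email client under examination.
    raw = b"\r\n".join([
        b"Received: from relay.example.org ([203.0.113.7])",
        b"\tby mx.victim.example; Wed, 5 Nov 2008 10:02:11 -0600",
        b"Received: from sender-pc ([198.51.100.23])",
        b"\tby relay.example.org; Wed, 5 Nov 2008 10:01:58 -0600",
        b"From: Alice <alice@example.org>",
        b"Message-ID: <1234@example.org>",
        b"Subject: hello",
        b"Date: Wed, 5 Nov 2008 10:01:55 -0600",
        b"",
        b"body text",
    ])
    msg = BytesParser(policy=policy.default).parsebytes(raw)

    # Headers that help establish origin and identity
    for name in ("From", "Reply-To", "Return-Path", "Message-ID", "Date"):
        print(f"{name}: {msg.get(name)}")

    # Each relaying server prepends a Received header, so reading them
    # bottom-up approximates the path from sender to recipient.
    for i, hop in enumerate(reversed(msg.get_all("Received") or []), 1):
        print(f"Hop {i}: {' '.join(str(hop).split())}")

Note that headers added before the message reached a trusted relay can be forged by the sender, so only the Received entries written by servers you trust are reliable evidence.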

Email Servers
- Work with the network administrator to determine how to retrieve messages from the server
- Understand how the server records and handles the messages
- How are the email logs created and stored?
- How are deleted email messages handled by the server? Are copies of the messages still kept?
- Chapter 12 discusses email servers from UNIX, Microsoft, and Novell

Email Forensics Tools
- Several tools exist for Outlook Express, Eudora, Exchange, and Lotus Notes
- Tools for log analysis and for recovering deleted emails
- Examples:
  - AccessData FTK
  - FINALeMAIL
  - EDBXtract
  - MailRecovery

Worm Detection: Introduction
- What are worms?
  - Self-replicating programs; they exploit a software vulnerability on a victim and remotely infect other victims
- Evil worms
  - Severe effect; the Code Red epidemic cost an estimated $2.6 billion
- Goals of worm detection
  - Real-time detection
- Issues
  - Substantial volume of identical traffic, random probing
- Methods for worm detection (a sketch of the second method follows this list)
  - Count the number of sources/destinations
  - Count the number of failed connection attempts
- Worm types
  - Email worms, instant messaging worms, Internet worms, IRC worms, file-sharing network worms
- Automatic signature generation is possible
  - EarlyBird system (S. Singh, UCSD); Autograph (H.-A. Kim, CMU)
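
As an illustrative sketch of the failed-connection heuristic, the snippet below counts failed connection attempts per source host. The log format ("<time> <src> <dst> <port> <status>") and the threshold are assumptions made up for this example, not part of the slides.

    from collections import Counter

    FAIL_THRESHOLD = 20  # assumed cutoff; tune to the monitored network

    def flag_probable_scanners(log_lines):
        """Return source IPs with an unusually high failure count."""
        failures = Counter()
        for line in log_lines:
            _time, src, _dst, _port, status = line.split()
            if status == "FAILED":
                failures[src] += 1
        # A worm probing random addresses piles up failures, because
        # most probed hosts do not exist or refuse the connection.
        return {ip for ip, n in failures.items() if n >= FAIL_THRESHOLD}

Counting distinct destinations per source works the same way and catches random probing even when the connections succeed.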

Email Worm Detection using Data Mining
- [Pipeline diagram: outgoing emails pass through feature extraction; the training data drives machine learning, which builds the model; the classifier then labels test data as clean or infected]
- Task: given some training instances of both "normal" and "viral" emails, induce a hypothesis to detect "viral" emails
- We used:
  - Naïve Bayes
  - SVM

Assumptions
- Features are based on outgoing emails
- Different users have different "normal" behavior, so the analysis should be done on a per-user basis
- Two groups of features
  - Per email (e.g., number of attachments, HTML in body, text/binary attachments)
  - Per window (e.g., mean words in body, variance of words in subject)
- A total of 24 features were identified
- Goal: identify "normal" and "viral" emails based on these features

Feature Sets
- Per-email features (an extraction sketch follows this slide)
  - Binary-valued features: presence of HTML; script tags/attributes; embedded images; hyperlinks; presence of binary or text attachments; MIME types of file attachments
  - Continuous-valued features: number of attachments; number of words/characters in the subject and body
- Per-window features
  - Number of emails sent; number of unique email recipients; number of unique sender addresses; average number of words/characters per subject and body; average word length; variance in the number of words/characters per subject and body; variance in word length; ratio of emails with attachments

Data Mining Approach
- [Diagram: each test instance is passed to a classifier (SVM or Naïve Bayes), which labels it Clean or Infected]
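
The following Python sketch derives a few of the per-email features from a raw message using the standard email library; it is illustrative, covers only a handful of the 24 features, and is not the authors' code. The per-window features would then be computed over a sliding window of such records.

    from email import policy
    from email.parser import BytesParser

    def per_email_features(raw_bytes):
        """Extract a small subset of the per-email features above."""
        msg = BytesParser(policy=policy.default).parsebytes(raw_bytes)
        attachments = [p for p in msg.walk()
                       if p.get_content_disposition() == "attachment"]
        body = msg.get_body(preferencelist=("html", "plain"))
        text = body.get_content() if body is not None else ""
        return {
            # Binary-valued features
            "has_html": any(p.get_content_type() == "text/html"
                            for p in msg.walk()),
            # Treat application/* attachments as binary
            "has_binary_attachment": any(
                p.get_content_maintype() == "application"
                for p in attachments),
            # Continuous-valued features
            "num_attachments": len(attachments),
            "num_words_subject": len((msg.get("Subject") or "").split()),
            "num_words_body": len(text.split()),
        }

    raw = b"Subject: hi there\r\nContent-Type: text/plain\r\n\r\nhello world\r\n"
    print(per_email_features(raw))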

Data Set
- Collected from UC Berkeley
  - Contains instances of both normal and viral emails
- Six worm types:
  - bagle.f, bubbleboy, mydoom.m, mydoom.u, netsky.d, sobig.f
- Originally six sets of data:
  - Training instances: normal (400) + five worms (5 x 200)
  - Testing instances: normal (1200) + the sixth worm (200)
- Problem: the sets are not balanced, and no cross-validation was reported
- Solution: rearrange the data and apply cross-validation

Our Implementation and Analysis
- Implementation
  - Naïve Bayes: assume a normal distribution for numeric and real-valued data; smoothing applied
  - SVM: one-class SVM with the radial basis function kernel, using gamma = 0.015 and nu = 0.1 (a configuration sketch follows this slide)
- Analysis
  - Naïve Bayes alone performs better than the other techniques
  - SVM alone also performs well if the parameters are set correctly
  - The mydoom.m and VBS.BubbleBoy data sets are not sufficient (very low detection accuracy in all classifiers)
  - The feature-based approach seems to be useful only when we have
    - identified the relevant features,
    - gathered enough training data, and
    - implemented the classifiers with the best parameter settings
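
Below is a hedged sketch of that SVM configuration using scikit-learn; the library choice and the placeholder data are assumptions, since the slides do not name an implementation. A one-class SVM is fit on feature vectors of normal emails only, and emails it scores as outliers are flagged as possibly infected.

    import numpy as np
    from sklearn.svm import OneClassSVM

    # X_normal stands in for the 24-dimensional feature vectors of a
    # user's known-clean outgoing emails (random placeholder data here).
    rng = np.random.default_rng(0)
    X_normal = rng.normal(size=(400, 24))

    clf = OneClassSVM(kernel="rbf", gamma=0.015, nu=0.1)
    clf.fit(X_normal)

    # predict() returns +1 for inliers (clean), -1 for outliers (infected)
    X_test = rng.normal(size=(5, 24))
    print(["clean" if y == 1 else "possibly infected"
           for y in clf.predict(X_test)])

In a one-class SVM, nu upper-bounds the fraction of training points treated as outliers, so nu = 0.1 tolerates roughly 10% anomalous messages in the "normal" training window.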

Mobile Device/System Forensics
- Mobile device forensics overview
- Acquisition procedures
- Summary

Mobile Device Forensics Overview
- What is stored in cell phones
  - Incoming/outgoing/missed calls
  - Text messages
  - Short messages
  - Instant messaging logs
  - Web pages
  - Pictures
  - Calendars
  - Address books
  - Music files
  - Voice records

Mobile Phones
- Multiple generations
  - Analog; digital personal communications; third generation (increased bandwidth and other features)
- Digital networks
  - CDMA, GSM, TDMA, ...
- Proprietary operating systems
- SIM cards (Subscriber Identity Module)