
Data Hiding in Image and Video
Part I: Fundamental Issues and Solutions
ECE 738 Class Presentation
By Tanaphol Thaipanich
[email protected]

Introduction
- Data hiding / digital watermarking: schemes that embed secondary data in digital media.
- Applications: ownership protection, access control, authentication.
- Data hiding is a communication problem: the embedded data is the signal to be transmitted.

Introduction (cont.)
- Embedding capacity vs. robustness.
- Distortion must be imperceptibly small for commercial or artistic reasons.
- Actual noise conditions: overestimating them wastes capacity; underestimating them corrupts the embedded bits.
- Uneven distribution of embedding capacity: the number of embeddable bits varies from location to location.

Data Hiding Framework

Key Elements in a Data Hiding System
- Upper layers build on top of the basic system to obtain additional functionalities.
- Three key elements:
  (1) a mechanism for embedding one bit;
  (2) a perceptual model, for imperceptibility;
  (3) modulation/multiplexing techniques, for conveying multiple bits.

Two Basic Embedding Mechanisms
Type I: Additive Embedding
- Add the secondary data to the host signal: I1 - I0 = f(b).
- The host I0 is the major noise source.
- Knowledge of I0 enhances detection performance.

Type II: Relationship Enforcement Embedding
- Deterministically enforce a relationship: b = g(I1).
- To minimize perceptual distortion, keep I1 close to I0.
- Detection needs no knowledge of I0: the information about b is carried in I1.

Comparison for Type I & II
- Capacity vs. robustness under "blind detection".
- Simplified additive model (Type I): for simplicity, the noise samples Mi are i.i.d. N(0, σM²).
- Optimal detector: minimizes the probability of error.
- Detection statistic: normalized correlation.
- TN is Gaussian with unit variance and mean sqrt(Σ si² / σM²), i.e., the square root of the ratio of total watermark energy to noise power.
- To minimize the probability of error, raise the ratio of total watermark energy to noise power.
- What should we do?

Comparison for Type I & II (cont.)
- Given the same noise power:
  - a watermark with higher power causes more distortion;
  - a longer signal per bit lowers the embedding capacity.
- Type II suffers no interference from the host media and codes one bit in a small number of host components, so its capacity is high.
- Example: odd-even embedding.
- Its robustness comes from the quantization step, or tolerance zone.
- A larger Q gives more tolerance: perturbations within (-Q/2, Q/2) are survived.
- Assuming the host components within ±Q of kQ are uniformly distributed, MSE = Q²/3.
- So a larger Q also means larger distortion.

Comparison for Type I & II (summary)
- Type I: excellent robustness and invisibility when the original host is available.
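The odd-even scheme can be sketched in a few lines. The step size Q, the sample value, and the helper names below are illustrative assumptions, not details from the slides.

```python
Q = 8.0  # quantization step (assumed): larger Q tolerates more noise but distorts more

def embed_bit(x, b, Q=Q):
    """Type II odd-even embedding: map x to an even multiple of Q for b = 0, odd for b = 1."""
    k = round(x / Q)
    if k % 2 != b:
        # wrong parity: move to the neighboring multiple on x's side to limit distortion
        k += 1 if x >= k * Q else -1
    return k * Q

def extract_bit(y, Q=Q):
    """The bit is the parity of the nearest multiple of Q; no knowledge of I0 is needed."""
    return round(y / Q) % 2

y = embed_bit(37.3, 1)     # 37.3 -> 40.0, an odd multiple of Q
b = extract_bit(y + 3.9)   # perturbations inside (-Q/2, Q/2) leave the bit intact
```

Under the slides' uniformity assumption, the squared embedding error averages to Q²/3, which is the MSE figure quoted above: the tolerance zone and the distortion both grow with Q.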
- For blind detection, using a longer watermark to represent one bit is more robust but gives less capacity.
- Type II: suits high-data-rate data hiding applications that do not have to survive noise.

Quantified Capacity Study
- Type I embedding.
- Channel model: CICO.
- Additive noise: host interference and processing noise, modeled as i.i.d. Gaussian.
- Shannon channel capacity: C = W log2(1 + S/N), with
  A² = power of the embedded signal,
  σI² = power of the host signal,
  σ² = power of the processing noise (σI² >> σ²),
  W = ½ (MTF50).
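As a rough illustration of the capacity formula, one can compare blind detection, where the host counts as noise, with detection that knows the host. All powers below are assumed values, not numbers from the slides.

```python
import math

def capacity(W, S, N):
    """Shannon capacity of an additive white Gaussian noise channel: C = W * log2(1 + S/N)."""
    return W * math.log2(1 + S / N)

# illustrative powers (assumptions)
A2 = 10.0        # A^2: embedded-signal power
sigmaI2 = 1e3    # sigma_I^2: host-signal power
sigma2 = 1.0     # sigma^2: processing-noise power (sigma_I^2 >> sigma^2)
W = 0.5          # bandwidth; the slides take W = (1/2) * MTF50

c_blind = capacity(W, A2, sigmaI2 + sigma2)  # host interferes as noise
c_known = capacity(W, A2, sigma2)            # host subtracted before detection
```

The gap between the two values shows why, when σI² >> σ², it is host interference rather than processing noise that limits Type I capacity under blind detection.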


UW-Madison ECE 738 - Data Hiding in Image and Video
