The simplest case of the problem I'm thinking about involves an elliptic differential operator, $Lu = -u'' + qu$, on the interval $(0,1)$, with homogeneous Dirichlet boundary conditions. I want to show that the bilinear form on $H_0^1 \subset H^1$ defined by
$$a(u,v) = \int_0^1 u'v' + quv \, dx$$
is bounded in the $H^1$-norm, i.e., $|a(u,v)| \le M \|u\|_1 \|v\|_1$ for some constant $M > 0$.
My question: can I assume only that the coefficient $q$ is $L^1$, or even $L^2$, and still guarantee boundedness?
I suspect this is possible, but the only books I have at hand that discuss this consider only the case where $q$ is smooth or in $L^\infty$. I've played around with the Cauchy-Schwarz inequality for the term $\int quv$ but am not getting anywhere.
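For concreteness, here is the standard estimate in the $L^\infty$ case that my books do cover, which is what I would like to relax:
```latex
\left| \int_0^1 q\,u\,v \, dx \right|
  \le \|q\|_{L^\infty} \int_0^1 |u|\,|v| \, dx
  \le \|q\|_{L^\infty} \|u\|_{L^2} \|v\|_{L^2}
  \le \|q\|_{L^\infty} \|u\|_1 \|v\|_1 .
```
Combined with $\left| \int_0^1 u'v' \, dx \right| \le \|u'\|_{L^2} \|v'\|_{L^2} \le \|u\|_1 \|v\|_1$, this gives boundedness with $M = 1 + \|q\|_{L^\infty}$. The first step is exactly what breaks down when $q$ is only $L^1$ or $L^2$, since then I can no longer pull $q$ out of the integral in the supremum norm.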