Talk:Hermitian adjoint

Section title

Shouldn't one use inner product notation here rather than bra-ket notation?

I.e. $\langle Ax, y\rangle = \langle x, A^*y\rangle$ rather than $\langle A\phi \mid \psi\rangle = \langle \phi \mid A^*\psi\rangle$.

PJ.de.Bruin 00:49, 5 Jul 2004 (UTC)

At least one concrete example would be nice! Otherwise it seems $A^* = A^{-1}$, as in $A^*A = AA^* = I$... anyone?
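By way of a quick sketch (matrix chosen arbitrarily for illustration): for

\[
A = \begin{pmatrix} 1 & i \\ 0 & 2 \end{pmatrix}, \qquad
A^* = \begin{pmatrix} 1 & 0 \\ -i & 2 \end{pmatrix}, \qquad
A^*A = \begin{pmatrix} 1 & i \\ -i & 5 \end{pmatrix} \neq I,
\]

so $A^* \neq A^{-1}$ in general; the equality $A^* = A^{-1}$ characterizes unitary matrices.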

Hermiticity & Self-Adjointness: Distinction

The distinction is not made clear. $A$ is self-adjoint if:

$A = A^*$ and $\operatorname{dom}(A) = \operatorname{dom}(A^*)$.

Hermiticity does not necessarily guarantee the latter statement.

Dirc 20:52, 14 February 2007 (UTC)

Hermiticity and self-adjointness are synonymous. (The notation A* = A implicitly implies Dom(A) = Dom(A*), usually.) So it doesn't quite make sense to say self-adjointness is not guaranteed by Hermiticity. On the other hand, being symmetric does not imply an operator is self-adjoint/Hermitian in general. Mct mht 23:48, 14 February 2007 (UTC)
See the redux section below.

Proof?

Hmm, the article says:

Moreover, $\|A^*A\| = \|A\|^2$.

I don't really see how this follows from the properties above it. Can someone provide a simple proof for this?

Never mind, I think I found it. How dumb can you be?
$A^*A$ is Hermitian, as one can easily show. Therefore
$\|A^*A\| = \|A\|^2$.
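For the record, here is a worked version of that argument (a sketch, assuming $A$ is bounded on a Hilbert space and using the fact that $\|B\| = \sup_{\|x\|=1} |\langle Bx, x\rangle|$ for Hermitian $B$):

\[
\|A^*A\| = \sup_{\|x\|=1} |\langle A^*Ax, x\rangle| = \sup_{\|x\|=1} \langle Ax, Ax\rangle = \sup_{\|x\|=1} \|Ax\|^2 = \|A\|^2 .
\]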

Uses?

This page explains what adjoints are, but just reading this page it isn't clear why it might be useful to define "adjoint". —Ben FrantzDale 01:18, 9 April 2007 (UTC)
Sometimes a solution to a problem involving an adjoint converts into a solution to the original problem, as for example in the case of some second-order differential equations.

I think the intro already gives some useful info:

Adjoints of operators generalize conjugate transposes of square matrices to (possibly) infinite-dimensional situations. If one thinks of operators on a Hilbert space as "generalized complex numbers", then the adjoint of an operator plays the role of the complex conjugate of a complex number.

If this is still too vague, perhaps we could add a line about switching linear operators over from vector spaces to their duals (like turning kets into bras), and about how one would like to write down an inner product between, say, v' and w, where v = A x for some linear operator A? --CompuChip 14:04, 9 April 2007 (UTC)
How about adding a simple concrete nontrivial example, or two? 140.109.169.94 14:28, 23 October 2007 (UTC)
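One standard candidate (a sketch of the textbook shift-operator example, which would also address the $A^* = A^{-1}$ confusion above): on $\ell^2$, let $S(x_1, x_2, \ldots) = (0, x_1, x_2, \ldots)$ be the right shift. From $\langle Sx, y\rangle = \langle x, S^*y\rangle$ one computes $S^*(y_1, y_2, \ldots) = (y_2, y_3, \ldots)$, the left shift. Here $S^*S = I$ but $SS^* \neq I$, so the adjoint is only a one-sided inverse.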

Adjoint of a bounded linear operator between normed spaces

I could not easily find a link from this article to the adjoint of a general bounded linear operator between normed spaces, nor is it covered in this article. —Preceding unsigned comment added by 131.111.8.103 (talk) 16:46, 14 February 2008 (UTC)

Proof of existence of adjoint operator

The article states that one can prove the existence of the adjoint operator using the Riesz representation theorem for the dual of Hilbert spaces. While this is certainly true, it glosses over a nice, short, instructive proof which I'd like to add to the article. If there are no objections I'll add a section that contains a proof of the existence of adjoint operators using sesquilinear forms, as in the book by Kreyszig. Compsonheir (talk) 20:35, 27 April 2009 (UTC)

Do so, I'm interested in that proof. Also I will change the first line: we are implicitly talking about bounded operators. In the case of an unbounded one, one has to say something about the domain. — Preceding unsigned comment added by Noix07 (talk · contribs) 15:14, 6 December 2013 (UTC)
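For reference, the skeleton of that existence argument (a sketch along the lines of Kreyszig's treatment of bounded sesquilinear forms; assumes $A$ is bounded): the form $h(x, y) = \langle Ax, y\rangle$ is sesquilinear with $|h(x,y)| \le \|A\| \|x\| \|y\|$. For each fixed $y$, the functional $x \mapsto h(x, y)$ is bounded and linear, so by the Riesz representation theorem there is a unique $z_y$ with $h(x, y) = \langle x, z_y\rangle$ for all $x$. Setting $A^*y := z_y$, uniqueness yields linearity of $A^*$, and $\|A^*y\| = \sup_{\|x\| = 1} |\langle x, A^*y\rangle| \le \|A\| \|y\|$ yields boundedness.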

Changed A and A* to T and T* as functions rather than matrices

Notation was A, A* in matrix notation. I changed them to linear maps, so the notation is T*(x), T(x), etc. —Preceding unsigned comment added by Negi(afk) (talk · contribs) 06:29, 28 April 2009 (UTC)

I think you are wrong. Notation A is standard in Hilbert space operators (although T is perfectly admissible) and I believe your change is useless. I believe I saw more often (say 60% to 40%) the sentence that $A^*A$ is a positive operator on the Hilbert space, compared to $T^*T$. Also it is quite standard to write $Ax$ without parentheses for the action of an operator on a vector $x$. And the rest of the article goes on with A. It would be easier to revert your change rather than changing all other As. --Bdmy (talk) 09:28, 28 April 2009 (UTC)

Merge

This article is the same as Conjugate transpose. :) Ladsgroup بحث 11:57, 30 January 2011 (UTC)


No, this article is not the same as Conjugate transpose. A matrix is a matrix, and an operator is an operator. Of course there is a relation between those objects (and indeed an isomorphism between the spaces containing them), but both articles should definitely remain separate in my opinion. --Vilietha (talk) 16:32, 11 March 2011 (UTC)

I agree. I encounter discussions of operators in quantum mechanics that go back and forth between mention of operators, as such, and matrix representations. BUT operators can be considered quite apart from the matrix representations, and I take for granted manipulations of other representations of the operators, and extensions that cannot be handled so easily by matrices. And I am opposed to the appalling proliferation of articles on interrelated topics and am in favour of many mergers elsewhere. Michael P. Barnett (talk) 03:54, 1 May 2011 (UTC)

No merge. In order to do this, don't we need to "broaden" the definition of a matrix to include infinite-dimensional matrices? Even then this only allows us to consider spaces isomorphic to L2, and the matrix representation only converges for bounded operators (right? Maybe my memory is faulty). So I don't agree with the merge. In fact, I don't recall ever reading anything that referred to the adjoint of an operator on a Hilbert space as the conjugate transpose. Anyway, keeping the articles separate helps to keep a distinction between linear algebra and operator theory. There is enough confusion between the two among undergraduate math students as it is. --129.69.206.184 (talk) 12:56, 22 December 2011 (UTC)

Hermiticity != self-adjointness redux

Please could someone provide an accessible reference to clarify the statement posted above that "Hermiticity and self-adjointness are synonymous. (The notation A* = A implicitly implies Dom(A) = Dom(A*), usually.)" In particular, could someone please explain the usage of "usually" in this context. Clear specification of the nature of objects represented by symbols, and of the circumstances needed for a statement containing these to be true in mathematical discourse, is not an unreasonable request. Michael P. Barnett (talk) 04:05, 1 May 2011 (UTC)

Sometimes when people write A=A* they only mean that the equality holds on dom(A). Note that for a Hermitian operator, dom(A) is always a subset of dom(A*), but inclusion in the other direction is not necessary. In my understanding of the subject, this implies that the terms are not synonymous, BUT: under relatively tame requirements, a continuation of a Hermitian operator exists which is self-adjoint; the domain of this continuation is contained in the domain of the adjoint, and most importantly it is equal to A* everywhere it is defined. This essentially reduces the question, most of the time, to a question of what the appropriate domain is. With the common convention of considering continuations of operators to be the same operator (although this is formally not the case), we have roughly the synonymy in the statement. I don't have a reference, but a necessary and sufficient condition for the self-adjoint continuation to exist is that the defect indices of the quasiregular points of the operator must be equal to each other. This is clearly not always the case (which is what the word "usually" probably means in this context). --129.69.21.121 (talk) 12:18, 22 December 2011 (UTC)
See extensions of symmetric operators and http://www.encyclopediaofmath.org/index.php/Symmetric_operator for appropriate definitions and clarifying statements. 178.38.97.101 (talk) 22:42, 17 May 2015 (UTC)
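A standard concrete illustration of the symmetric-but-not-self-adjoint phenomenon, sketched here for definiteness (this is the textbook momentum-operator example, not something currently in the article): take $A = i\,\frac{d}{dx}$ on $L^2(0,1)$ with

\[
\operatorname{dom}(A) = \{ f \in H^1(0,1) : f(0) = f(1) = 0 \}.
\]

Integration by parts gives $\langle Af, g\rangle = \langle f, Ag\rangle$ on this domain, so $A$ is symmetric; but $A^*$ acts by the same formula on all of $H^1(0,1)$, with no boundary conditions, so $\operatorname{dom}(A) \subsetneq \operatorname{dom}(A^*)$ and $A$ is not self-adjoint. Its defect indices are $(1,1)$, and its self-adjoint extensions $A_\theta$ are obtained by imposing the boundary condition $f(1) = e^{i\theta} f(0)$.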

Orthogonal complement

Can someone please either change the $\perp$ symbol or else define it? With the prevalence of ambiguous notation concerning conjugates and adjoints etc. ($A$, $A^*$, $\bar A$, ...), it is rather unclear what is meant by it, to the casual reader. In particular, is it related to the raised version $^\perp$? 93.96.22.178 (talk) 17:14, 15 July 2013 (UTC) Erikpan

Done. Incnis Mrsi (talk) 18:12, 15 July 2013 (UTC)

Real or complex?

Is the terminology "self-adjoint" only defined for complex Hilbert spaces? The article is written this way. But I think it is equally valid for real Hilbert spaces.

178.38.97.101 (talk) 22:35, 17 May 2015 (UTC)

Operator from H to H or from H1 to H2???

Another very strange thing about this article: isn't the adjoint also defined when the two spaces are distinct? That is, A* can be defined for a linear operator A : H1 → H2.

This holds in both finite and infinite dimensions, and for both bounded and unbounded operators. In the case of unbounded operators, A must be densely defined for A* to exist. See unbounded operator.

This seems like a serious omission. How can I flag an article for having a too-narrow scope? 178.38.97.101 (talk) 23:17, 17 May 2015 (UTC)
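For concreteness, the two-space definition at issue (a sketch following the usual convention; notation mine): for a bounded operator $A : H_1 \to H_2$, the adjoint is the unique bounded operator $A^* : H_2 \to H_1$ satisfying

\[
\langle Ax, y\rangle_{H_2} = \langle x, A^*y\rangle_{H_1} \qquad \text{for all } x \in H_1,\ y \in H_2,
\]

whose existence follows, as in the one-space case, from the Riesz representation theorem applied to $x \mapsto \langle Ax, y\rangle_{H_2}$.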

I will try to draft a generalization for operators between Banach spaces and distinct Hilbert spaces tomorrow. Valiantrider (talk) 21:11, 26 December 2015 (UTC)

Hermitian vs self-adjoint vs symmetric

It is somewhat different for operators (linear transformations) than for matrices and bilinear/sesquilinear forms. That is perhaps why it is confusing.

For operators:

Symmetric operator: <.,.> is bilinear or sesquilinear and <Ax,y> = <x,Ay> for x, y in dom(A).
Self-adjoint operator: A is symmetric, A is densely defined, and dom(A) = dom(A^*). Equivalently: A is densely defined, A = A^* on dom(A), and dom(A) = dom(A^*). Equivalently, this is also written simply as A = A^*, and this is taken to include the statement that A is densely defined (so that A^* exists) and that the domains are equal.
Note that if A is symmetric and A is densely defined, then dom(A) ⊆ dom(A^*) always.
So for operators (in contrast to matrices!), the distinction between symmetric and self-adjoint has to do with the domains of the operators, not the use of the conjugate. It is an analytic difference, not an algebraic one (see the summary below).
In particular, the concepts of "symmetric operator" and "self-adjoint operator" are equally valid, and have formally the same definition, both for real and complex vector spaces.
Hermitian operator: I am not sure if this is a symmetric operator or a self-adjoint operator. (This was the question asked above.) But there is one difference: as soon as you say Hermitian, you automatically imply that the Hilbert space is complex.
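In symbols, the above can be compressed as follows (a sketch of the standard inclusion chain, stated here for convenience): for a densely defined symmetric operator $A$,

\[
A \subseteq A^{**} \subseteq A^*,
\]

where $A^{**}$ is the closure of $A$, and $A$ is self-adjoint precisely when the chain collapses, i.e. $A = A^*$ with $\operatorname{dom}(A) = \operatorname{dom}(A^*)$.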


By way of comparison, the use of symmetric is different for matrices and for bilinear/sesquilinear forms than it is for operators.

The adjoint of a matrix is its conjugate transpose.
A matrix is symmetric if it is equal to its own transpose. It is self-adjoint, or Hermitian, if it is equal to its own adjoint.
So for real matrices, self-adjoint and symmetric are the same. For complex matrices, they are different (see the sketch below).
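To make the complex case tangible (matrices chosen ad hoc for illustration):

\[
A = \begin{pmatrix} 1 & i \\ i & 0 \end{pmatrix}
\]

is symmetric ($A = A^T$) but not Hermitian, since $A^* = \begin{pmatrix} 1 & -i \\ -i & 0 \end{pmatrix} \neq A$; while

\[
B = \begin{pmatrix} 1 & i \\ -i & 0 \end{pmatrix}
\]

is Hermitian ($B = B^*$) but not symmetric, since $B^T \neq B$.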

For bilinear/sesquilinear forms:

Symmetric bilinear form: B is bilinear and B(x,y) = B(y,x). Defined for both real and complex vector spaces, and given by a symmetric matrix.
Hermitian-symmetric form (leading to a Hermitian inner product): B is sesquilinear and $B(x,y) = \overline{B(y,x)}$. Defined for complex vector spaces, and given by a self-adjoint (or Hermitian) matrix.

178.38.97.101 (talk) 23:56, 17 May 2015 (UTC)

New content: adjoint for normed spaces

I included the definition of adjoint operators for operators between normed spaces, from Brezis' book on functional analysis. Please check, I'm new to this. I will probably include the case where both spaces are Hilbert spaces as an example tomorrow. Also there is a need for some rewriting of the intro (to change the scope from Hilbert to normed spaces) and maybe of some section titles. Valiantrider (talk) 21:58, 26 December 2015 (UTC)

I included an informal definition to ease into the notion of adjoint operators, and mentioned the "mixed" case (operator from Hilbert to Banach), which is especially interesting when one considers, for example, A as the inclusion operator from some Hilbert space which is a proper subset of a Banach space (as with Cameron-Martin spaces). I got most of the input from Brezis' book. Will probably include some examples soon. Valiantrider (talk) 21:46, 27 December 2015 (UTC)
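For readers checking the new section, the normed-space definition in question reads roughly as follows (a sketch in the spirit of Brezis; notation mine): for a bounded operator $A : X \to Y$ between normed spaces, the adjoint $A^* : Y^* \to X^*$ acts between the dual spaces by

\[
(A^*f)(x) = f(Ax) \qquad \text{for } f \in Y^*,\ x \in X,
\]

and satisfies $\|A^*\| = \|A\|$. When $X$ and $Y$ are Hilbert spaces, identifying each dual with the space itself via the Riesz map recovers the Hilbert space adjoint.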

Vector example

Since this is all about inner products, wouldn't it be possible to start the "informal definition" section with a simple example involving finite-dimensional vectors and matrices? — Preceding unsigned comment added by 62.80.108.37 (talk) 16:58, 6 February 2019 (UTC)
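Such an opening example might look like this (a sketch, not currently in the article): for a matrix $A \in \mathbb{C}^{m \times n}$ with the standard inner products, requiring $\langle Ax, y\rangle = \langle x, A^*y\rangle$ for all $x, y$ forces

\[
A^* = \bar{A}^{\mathsf{T}},
\]

i.e. in finite dimensions the adjoint is exactly the conjugate transpose, which would tie the informal definition directly to the matrix case.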