[Figure 2 caption fragment: shown for neural vectors in (Turian et al., 2010).]

4.4 Model Analysis: Vector Length and Context Size

In Fig. 2, we show the results of experiments that vary vector length and context window. A context window that extends to the left and right of a target word will be called symmetric, and one which extends only to the left will be called asymmetric. In (a), we observe diminishing returns for vectors larger than about 200 dimensions. In (b) and (c), we examine the effect of varying the window size for symmetric and asymmetric context windows. Performance is better on the syntactic subtask for small and asymmetric context windows, which aligns with the intuition that syntactic information is mostly drawn from the immediate context and can depend strongly on word order. Semantic information, on the other hand, is more frequently non-local, and more of it is captured with larger window sizes.
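The symmetric/asymmetric distinction above can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the function name, window size, and toy sentence are all assumptions for clarity.

```python
def context_words(tokens, i, window, symmetric=True):
    """Return the context words for the target tokens[i].

    A symmetric window takes `window` words on each side of the target;
    an asymmetric window takes `window` words to the left only.
    """
    left = tokens[max(0, i - window):i]
    right = tokens[i + 1:i + 1 + window] if symmetric else []
    return left + right

tokens = "the quick brown fox jumps over the lazy dog".split()
# Target word is tokens[4] == "jumps", window size 2.
print(context_words(tokens, 4, 2, symmetric=True))   # ['brown', 'fox', 'over', 'the']
print(context_words(tokens, 4, 2, symmetric=False))  # ['brown', 'fox']
```

The asymmetric (left-only) window preserves the word-order cue that the syntactic subtask benefits from, while the symmetric window gathers more of the surrounding topical context.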