Mirror of https://github.com/rust-lang/rust.git, synced 2026-04-27 18:57:42 +03:00.
Auto merge of #64946 - Centril:rollup-66mj5o0, r=Centril
Rollup of 10 pull requests

Successful merges:

- #63674 (syntax: Support modern attribute syntax in the `meta` matcher)
- #63931 (Stabilize macros in some more positions)
- #64887 (syntax: recover trailing `|` in or-patterns)
- #64895 (async/await: improve not-send errors)
- #64896 (Remove legacy grammar)
- #64907 (A small amount of tidying-up factored out from PR #64648)
- #64928 (Add tests for some issues)
- #64930 (Silence unreachable code lint from await desugaring)
- #64935 (Improve code clarity)
- #64937 (Deduplicate closure type errors)

Failed merges:

r? @ghost
% Grammar

# Introduction

The Rust grammar may now be found in the [reference]. Additionally, the
[grammar working group] is working on producing a testable grammar.

This document is the primary reference for the Rust programming language
grammar. It provides only one kind of material:

- Chapters that formally define the language grammar.

This document does not serve as an introduction to the language. Background
familiarity with the language is assumed. A separate [guide] is available to
help acquire such background.

This document also does not serve as a reference to the [standard] library
included in the language distribution. Those libraries are documented
separately by extracting documentation attributes from their source code. Many
of the features that one might expect to be language features are library
features in Rust, so what you're looking for may be there, not here.

[guide]: guide.html
[standard]: std/index.html

# Notation

Rust's grammar is defined over Unicode codepoints, each conventionally denoted
`U+XXXX`, for 4 or more hexadecimal digits `X`. _Most_ of Rust's grammar is
confined to the ASCII range of Unicode, and is described in this document by a
dialect of Extended Backus-Naur Form (EBNF), specifically a dialect of EBNF
supported by common automated LL(k) parsing tools such as `llgen`, rather than
the dialect given in ISO 14977. The dialect can be defined self-referentially
as follows:

```antlr
grammar : rule + ;
rule : nonterminal ':' productionrule ';' ;
productionrule : production [ '|' production ] * ;
production : term * ;
term : element repeats ;
element : LITERAL | IDENTIFIER | '[' productionrule ']' ;
repeats : [ '*' | '+' ] NUMBER ? | NUMBER ? | '?' ;
```

Where:

- Whitespace in the grammar is ignored.
- Square brackets are used to group rules.
- `LITERAL` is a single printable ASCII character, or an escaped hexadecimal
  ASCII code of the form `\xQQ`, in single quotes, denoting the corresponding
  Unicode codepoint `U+00QQ`.
- `IDENTIFIER` is a nonempty string of ASCII letters and underscores.
- The `repeats` forms apply to the adjacent `element`, and are as follows:
  - `?` means zero or one repetition
  - `*` means zero or more repetitions
  - `+` means one or more repetitions
  - `NUMBER` trailing a repeat symbol gives a maximum repetition count
  - `NUMBER` on its own gives an exact repetition count

This EBNF dialect should be familiar to many readers.

## Unicode productions

A few productions in Rust's grammar permit Unicode codepoints outside the ASCII
range. We define these productions in terms of character properties specified
in the Unicode standard, rather than in terms of ASCII-range codepoints. The
section [Special Unicode Productions](#special-unicode-productions) lists these
productions.

## String table productions

Some rules in the grammar — notably [unary
operators](#unary-operator-expressions), [binary
operators](#binary-operator-expressions), and [keywords](#keywords) — are
given in a simplified form: as a listing of a table of unquoted, printable
whitespace-separated strings. These cases form a subset of the rules regarding
the [token](#tokens) rule, and are assumed to be the result of a
lexical-analysis phase feeding the parser, driven by a DFA, operating over the
disjunction of all such string table entries.

When such a string enclosed in double-quotes (`"`) occurs inside the grammar,
it is an implicit reference to a single member of such a string table
production. See [tokens](#tokens) for more information.

# Lexical structure

## Input format

Rust input is interpreted as a sequence of Unicode codepoints encoded in UTF-8.
Most Rust grammar rules are defined in terms of printable ASCII-range
codepoints, but a small number are defined in terms of Unicode properties or
explicit codepoint lists. [^inputformat]

[^inputformat]: Substitute definitions for the special Unicode productions are
provided to the grammar verifier, restricted to ASCII range, when verifying the
grammar in this document.

## Special Unicode Productions

The following productions in the Rust grammar are defined in terms of Unicode
properties: `ident`, `non_null`, `non_eol`, `non_single_quote`, and
`non_double_quote`.

### Identifiers

The `ident` production is any nonempty Unicode string of the following form:

- The first character is in one of the following ranges: `U+0041` to `U+005A`
  ("A" to "Z"), `U+0061` to `U+007A` ("a" to "z"), or `U+005F` ("\_").
- The remaining characters are in the range `U+0030` to `U+0039` ("0" to "9"),
  or any of the prior valid initial characters,

as long as the identifier does _not_ occur in the set of [keywords](#keywords).
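
As an illustration, the ASCII portion of the `ident` rule can be sketched as a
small checker. This is not part of the original grammar document; `is_ident` is
a hypothetical helper name, and the keyword exclusion is omitted for brevity.

```rust
// Sketch of the ASCII part of the `ident` production (keyword check omitted).
fn is_ident(s: &str) -> bool {
    let mut chars = s.chars();
    // First character: A-Z, a-z, or underscore.
    match chars.next() {
        Some(c) if c.is_ascii_alphabetic() || c == '_' => {}
        _ => return false, // empty string or invalid initial character
    }
    // Remaining characters may also be digits 0-9.
    chars.all(|c| c.is_ascii_alphanumeric() || c == '_')
}

fn main() {
    assert!(is_ident("foo_bar"));
    assert!(is_ident("_x1"));
    assert!(!is_ident("1abc")); // a digit cannot start an identifier
    assert!(!is_ident(""));     // must be nonempty
}
```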

### Delimiter-restricted productions

Some productions are defined by exclusion of particular Unicode characters:

- `non_null` is any single Unicode character aside from `U+0000` (null)
- `non_eol` is any single Unicode character aside from `U+000A` (`'\n'`)
- `non_single_quote` is any single Unicode character aside from `U+0027` (`'`)
- `non_double_quote` is any single Unicode character aside from `U+0022` (`"`)

## Comments

```antlr
comment : block_comment | line_comment ;
block_comment : "/*" block_comment_body * "*/" ;
block_comment_body : [block_comment | character] * ;
line_comment : "//" non_eol * ;
```

**FIXME:** add doc grammar?
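
Note that `block_comment_body` allows a `block_comment` inside itself, so block
comments nest. A quick sketch checked against current Rust (the `commented`
function is a hypothetical example name):

```rust
fn commented() -> i32 {
    /* block comments /* nest */, per the block_comment_body rule */
    // line comments run to the end of the line (non_eol *)
    /* an inline comment */ 42
}

fn main() {
    assert_eq!(commented(), 42);
}
```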

## Whitespace

```antlr
whitespace_char : '\x20' | '\x09' | '\x0a' | '\x0d' ;
whitespace : [ whitespace_char | comment ] + ;
```

## Tokens

```antlr
simple_token : keyword | unop | binop ;
token : simple_token | ident | literal | symbol | whitespace token ;
```

### Keywords

<p id="keyword-table-marker"></p>

|          |          |          |          |          |
|----------|----------|----------|----------|----------|
| _        | abstract | alignof  | as       | become   |
| box      | break    | const    | continue | crate    |
| do       | else     | enum     | extern   | false    |
| final    | fn       | for      | if       | impl     |
| in       | let      | loop     | macro    | match    |
| mod      | move     | mut      | offsetof | override |
| priv     | proc     | pub      | pure     | ref      |
| return   | Self     | self     | sizeof   | static   |
| struct   | super    | trait    | true     | type     |
| typeof   | unsafe   | unsized  | use      | virtual  |
| where    | while    | yield    |          |          |

Each of these keywords has special meaning in its grammar, and all of them are
excluded from the `ident` rule.

Not all of these keywords are used by the language. Some of them were used
before Rust 1.0, and were left reserved once their implementations were
removed. Some of them were reserved before 1.0 to make space for possible
future features.

### Literals

```antlr
lit_suffix : ident ;
literal : [ string_lit | char_lit | byte_string_lit | byte_lit | num_lit | bool_lit ] lit_suffix ? ;
```

The optional `lit_suffix` production is only used for certain numeric literals,
but is reserved for future extension. That is, the above gives the lexical
grammar, but a Rust parser will reject everything but the 12 special cases
mentioned in [Number literals](reference/tokens.html#number-literals) in the
reference.

#### Character and string literals

```antlr
char_lit : '\x27' char_body '\x27' ;
string_lit : '"' string_body * '"' | 'r' raw_string ;

char_body : non_single_quote
          | '\x5c' [ '\x27' | common_escape | unicode_escape ] ;

string_body : non_double_quote
            | '\x5c' [ '\x22' | common_escape | unicode_escape ] ;
raw_string : '"' raw_string_body '"' | '#' raw_string '#' ;

common_escape : '\x5c'
              | 'n' | 'r' | 't' | '0'
              | 'x' hex_digit 2 ;
unicode_escape : 'u' '{' hex_digit + 6 '}' ;

hex_digit : 'a' | 'b' | 'c' | 'd' | 'e' | 'f'
          | 'A' | 'B' | 'C' | 'D' | 'E' | 'F'
          | dec_digit ;
oct_digit : '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' ;
dec_digit : '0' | nonzero_dec ;
nonzero_dec : '1' | '2' | '3' | '4'
            | '5' | '6' | '7' | '8' | '9' ;
```
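
A few literals exercising these productions, checked against current Rust (a
sketch, not part of the original document):

```rust
fn main() {
    assert_eq!('\x41', 'A');                  // common_escape: 'x' plus two hex digits
    assert_eq!('\u{1F600}', '😀');            // unicode_escape: up to six hex digits
    assert_eq!("a\tb", "a\x09b");             // '\t' is U+0009
    assert_eq!(r"C:\dir", "C:\\dir");         // raw_string: no escape processing
    assert_eq!(r#"say "hi""#, "say \"hi\"");  // '#' pairs let quotes appear inside
}
```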

#### Byte and byte string literals

```antlr
byte_lit : "b\x27" byte_body '\x27' ;
byte_string_lit : "b\x22" byte_string_body * '\x22' | "br" raw_byte_string ;

byte_body : ascii_non_single_quote
          | '\x5c' [ '\x27' | common_escape ] ;

byte_string_body : ascii_non_double_quote
                 | '\x5c' [ '\x22' | common_escape ] ;
raw_byte_string : '"' raw_byte_string_body '"' | '#' raw_byte_string '#' ;
```
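
For illustration, byte and byte-string forms as current Rust accepts them (a
sketch, not part of the original document):

```rust
fn main() {
    assert_eq!(b'A', 65u8);               // byte_lit: a single ASCII byte
    assert_eq!(b'\x7f', 127u8);           // common_escape works in byte literals
    assert_eq!(b"abc", &[97u8, 98, 99]);  // byte_string_lit yields &[u8; N]
    assert_eq!(br"a\b", b"a\\b");         // raw byte string: backslash is literal
}
```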

#### Number literals

```antlr
num_lit : nonzero_dec [ dec_digit | '_' ] * float_suffix ?
        | '0' [ [ dec_digit | '_' ] * float_suffix ?
              | 'b' [ '1' | '0' | '_' ] +
              | 'o' [ oct_digit | '_' ] +
              | 'x' [ hex_digit | '_' ] + ] ;

float_suffix : [ exponent | '.' dec_lit exponent ? ] ? ;

exponent : [ 'E' | 'e' ] [ '-' | '+' ] ? dec_lit ;
dec_lit : [ dec_digit | '_' ] + ;
```
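
The base prefixes, digit separators, exponents, and suffixes above correspond
to the following literals in current Rust (a sketch, not from the original
document):

```rust
fn main() {
    assert_eq!(0b1010, 10);          // 'b' binary form
    assert_eq!(0o17, 15);            // 'o' octal form
    assert_eq!(0xff, 255);           // 'x' hexadecimal form
    assert_eq!(1_000_000, 1000000);  // '_' is ignored between digits
    assert_eq!(2e3, 2000.0);         // an exponent alone makes a float
    assert_eq!(10u8 as i32, 10);     // `u8` here is a lit_suffix
}
```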

#### Boolean literals

```antlr
bool_lit : [ "true" | "false" ] ;
```

The two values of the boolean type are written `true` and `false`.

### Symbols

```antlr
symbol : "::" | "->"
       | '#' | '[' | ']' | '(' | ')' | '{' | '}'
       | ',' | ';' ;
```

Symbols are a general class of printable [tokens](#tokens) that play structural
roles in a variety of grammar productions. They are cataloged here for
completeness as the set of remaining miscellaneous printable tokens that do not
otherwise appear as [unary operators](#unary-operator-expressions), [binary
operators](#binary-operator-expressions), or [keywords](#keywords).

## Paths

```antlr
expr_path : [ "::" ] ident [ "::" expr_path_tail ] + ;
expr_path_tail : '<' type_expr [ ',' type_expr ] + '>'
               | expr_path ;

type_path : ident [ type_path_tail ] + ;
type_path_tail : '<' type_expr [ ',' type_expr ] + '>'
               | "::" type_path ;
```
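
In expression position, a `'<'` type-argument tail must follow a `"::"`, which
is why the so-called turbofish form is required. A sketch in current Rust (not
part of the original document):

```rust
fn main() {
    // expr_path_tail with type arguments: `::` then `<type_expr, ...>`
    let v = Vec::<i32>::new();
    assert!(v.is_empty());

    // a leading `::` roots the path at the crate root / extern prelude
    let s = ::std::string::String::from("hi");
    assert_eq!(s.len(), 2);
}
```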

# Syntax extensions

## Macros

```antlr
expr_macro_rules : "macro_rules" '!' ident '(' macro_rule * ')' ';'
                 | "macro_rules" '!' ident '{' macro_rule * '}' ;
macro_rule : '(' matcher * ')' "=>" '(' transcriber * ')' ';' ;
matcher : '(' matcher * ')' | '[' matcher * ']'
        | '{' matcher * '}' | '$' ident ':' ident
        | '$' '(' matcher * ')' sep_token ? [ '*' | '+' ]
        | non_special_token ;
transcriber : '(' transcriber * ')' | '[' transcriber * ']'
            | '{' transcriber * '}' | '$' ident
            | '$' '(' transcriber * ')' sep_token ? [ '*' | '+' ]
            | non_special_token ;
```
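
A macro whose matcher uses the `'$' ident ':' ident` fragment form and the
`'$' '(' ... ')' sep_token ? '*'` repetition form from the production above (a
sketch; `sum` is a hypothetical macro name):

```rust
// Matcher: `$( $x:expr ),*` — zero or more expressions separated by commas.
// Transcriber: repeats `+ $x` once per matched fragment.
macro_rules! sum {
    ( $( $x:expr ),* ) => {
        0 $( + $x )*
    };
}

fn main() {
    assert_eq!(sum!(1, 2, 3), 6);
    assert_eq!(sum!(), 0);  // the repetition may match zero times
}
```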

# Crates and source files

**FIXME:** grammar? What production covers `#![crate_id = "foo"]`?

# Items and attributes

**FIXME:** grammar?

## Items

```antlr
item : vis ? mod_item | fn_item | type_item | struct_item | enum_item
     | const_item | static_item | trait_item | impl_item | extern_block_item ;
```

### Type Parameters

**FIXME:** grammar?

### Modules

```antlr
mod_item : "mod" ident [ ';' | '{' mod '}' ] ;
mod : [ view_item | item ] * ;
```

#### View items

```antlr
view_item : extern_crate_decl | use_decl ';' ;
```

##### Extern crate declarations

```antlr
extern_crate_decl : "extern" "crate" crate_name ;
crate_name : ident | [ ident "as" ident ] ;
```

##### Use declarations

```antlr
use_decl : vis ? "use" [ path "as" ident
                       | path_glob ] ;

path_glob : ident [ "::" [ path_glob
                         | '*' ] ] ?
          | '{' path_item [ ',' path_item ] * '}' ;

path_item : ident | "self" ;
```

### Functions

**FIXME:** grammar?

#### Generic functions

**FIXME:** grammar?

#### Unsafety

**FIXME:** grammar?

##### Unsafe functions

**FIXME:** grammar?

##### Unsafe blocks

**FIXME:** grammar?

#### Diverging functions

**FIXME:** grammar?

### Type definitions

**FIXME:** grammar?

### Structures

**FIXME:** grammar?

### Enumerations

**FIXME:** grammar?

### Constant items

```antlr
const_item : "const" ident ':' type '=' expr ';' ;
```

### Static items

```antlr
static_item : "static" ident ':' type '=' expr ';' ;
```

#### Mutable statics

**FIXME:** grammar?

### Traits

**FIXME:** grammar?

### Implementations

**FIXME:** grammar?

### External blocks

```antlr
extern_block_item : "extern" '{' extern_block '}' ;
extern_block : [ foreign_fn ] * ;
```

## Visibility and Privacy

```antlr
vis : "pub" ;
```

### Re-exporting and Visibility

See [Use declarations](#use-declarations).

## Attributes

```antlr
attribute : '#' '!' ? '[' meta_item ']' ;
meta_item : ident [ '=' literal
                  | '(' meta_seq ')' ] ? ;
meta_seq : meta_item [ ',' meta_seq ] ? ;
```

# Statements and expressions

## Statements

```antlr
stmt : decl_stmt | expr_stmt | ';' ;
```

### Declaration statements

```antlr
decl_stmt : item | let_decl ;
```

#### Item declarations

See [Items](#items).

#### Variable declarations

```antlr
let_decl : "let" pat [ ':' type ] ? [ init ] ? ';' ;
init : [ '=' ] expr ;
```

### Expression statements

```antlr
expr_stmt : expr ';' ;
```

## Expressions

```antlr
expr : literal | path | tuple_expr | unit_expr | struct_expr
     | block_expr | method_call_expr | field_expr | array_expr
     | idx_expr | range_expr | unop_expr | binop_expr
     | paren_expr | call_expr | lambda_expr | while_expr
     | loop_expr | break_expr | continue_expr | for_expr
     | if_expr | match_expr | if_let_expr | while_let_expr
     | return_expr ;
```

#### Lvalues, rvalues and temporaries

**FIXME:** grammar?

#### Moved and copied types

**FIXME:** Do we want to capture this in the grammar as different productions?

### Literal expressions

See [Literals](#literals).

### Path expressions

See [Paths](#paths).

### Tuple expressions

```antlr
tuple_expr : '(' [ expr [ ',' expr ] * | expr ',' ] ? ')' ;
```

### Unit expressions

```antlr
unit_expr : "()" ;
```

### Structure expressions

```antlr
struct_expr_field_init : ident | ident ':' expr ;
struct_expr : expr_path '{' struct_expr_field_init
              [ ',' struct_expr_field_init ] *
              [ ".." expr ] '}'
            | expr_path '(' expr
              [ ',' expr ] * ')'
            | expr_path ;
```
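
The bare-`ident` form of `struct_expr_field_init` is field shorthand, and the
`".." expr` tail is the functional-update form. A sketch in current Rust (the
`Point` type is a hypothetical example):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
struct Point { x: i32, y: i32 }

fn main() {
    let base = Point { x: 1, y: 2 };
    let x = 10;
    // `x` alone is field shorthand; `..base` fills the remaining fields.
    let p = Point { x, ..base };
    assert_eq!(p, Point { x: 10, y: 2 });
}
```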

### Block expressions

```antlr
block_expr : '{' [ stmt | item ] *
             [ expr ] '}' ;
```

### Method-call expressions

```antlr
method_call_expr : expr '.' ident paren_expr_list ;
```

### Field expressions

```antlr
field_expr : expr '.' ident ;
```

### Array expressions

```antlr
array_expr : '[' "mut" ? array_elems ? ']' ;

array_elems : [ expr [ ',' expr ] * ] | [ expr ';' expr ] ;
```
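
The two `array_elems` alternatives are an explicit element list and the
repeat form `expr ';' expr`; a sketch in current Rust (not part of the original
document):

```rust
fn main() {
    let a = [1, 2, 3];  // comma-separated element list
    let b = [0; 3];     // `expr ';' expr`: three copies of 0
    assert_eq!(a.len(), 3);
    assert_eq!(b, [0, 0, 0]);
    assert_eq!(a[2], 3);
}
```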

### Index expressions

```antlr
idx_expr : expr '[' expr ']' ;
```

### Range expressions

```antlr
range_expr : expr ".." expr
           | expr ".."
           | ".." expr
           | ".." ;
```
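
Three of the four `range_expr` forms, used as slice indices and as an iterator
(a sketch checked against current Rust):

```rust
fn main() {
    let v = [10, 20, 30, 40];
    assert_eq!(&v[1..3], &[20, 30]);     // expr ".." expr
    assert_eq!(&v[..2], &[10, 20]);      // ".." expr
    assert_eq!(&v[2..], &[30, 40]);      // expr ".."
    assert_eq!((1..4).sum::<i32>(), 6);  // half-open ranges are iterators
}
```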

### Unary operator expressions

```antlr
unop_expr : unop expr ;
unop : '-' | '*' | '!' ;
```

### Binary operator expressions

```antlr
binop_expr : expr binop expr | type_cast_expr
           | assignment_expr | compound_assignment_expr ;
binop : arith_op | bitwise_op | lazy_bool_op | comp_op ;
```

#### Arithmetic operators

```antlr
arith_op : '+' | '-' | '*' | '/' | '%' ;
```

#### Bitwise operators

```antlr
bitwise_op : '&' | '|' | '^' | "<<" | ">>" ;
```

#### Lazy boolean operators

```antlr
lazy_bool_op : "&&" | "||" ;
```

#### Comparison operators

```antlr
comp_op : "==" | "!=" | '<' | '>' | "<=" | ">=" ;
```

#### Type cast expressions

```antlr
type_cast_expr : value "as" type ;
```

#### Assignment expressions

```antlr
assignment_expr : expr '=' expr ;
```

#### Compound assignment expressions

```antlr
compound_assignment_expr : expr [ arith_op | bitwise_op ] '=' expr ;
```

### Grouped expressions

```antlr
paren_expr : '(' expr ')' ;
```

### Call expressions

```antlr
expr_list : [ expr [ ',' expr ] * ] ? ;
paren_expr_list : '(' expr_list ')' ;
call_expr : expr paren_expr_list ;
```

### Lambda expressions

```antlr
ident_list : [ ident [ ',' ident ] * ] ? ;
lambda_expr : '|' ident_list '|' expr ;
```

### While loops

```antlr
while_expr : [ lifetime ':' ] ? "while" no_struct_literal_expr '{' block '}' ;
```

### Infinite loops

```antlr
loop_expr : [ lifetime ':' ] ? "loop" '{' block '}' ;
```

### Break expressions

```antlr
break_expr : "break" [ lifetime ] ? ;
```

### Continue expressions

```antlr
continue_expr : "continue" [ lifetime ] ? ;
```

### For expressions

```antlr
for_expr : [ lifetime ':' ] ? "for" pat "in" no_struct_literal_expr '{' block '}' ;
```

### If expressions

```antlr
if_expr : "if" no_struct_literal_expr '{' block '}'
          else_tail ? ;

else_tail : "else" [ if_expr | if_let_expr
                   | '{' block '}' ] ;
```

### Match expressions

```antlr
match_expr : "match" no_struct_literal_expr '{' match_arm * '}' ;

match_arm : attribute * match_pat "=>" [ expr "," | '{' block '}' ] ;

match_pat : pat [ '|' pat ] * [ "if" expr ] ? ;
```
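
A `match` exercising the `match_pat` production: `'|'`-separated alternatives
and an optional `"if"` guard (a sketch checked against current Rust):

```rust
fn main() {
    let n = 5;
    let kind = match n {
        0 => "zero",
        1 | 2 | 3 => "small",      // pat [ '|' pat ] *
        x if x < 10 => "medium",   // [ "if" expr ] ? guard
        _ => "large",
    };
    assert_eq!(kind, "medium");
}
```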

### If let expressions

```antlr
if_let_expr : "if" "let" pat '=' expr '{' block '}'
              else_tail ? ;
```

### While let loops

```antlr
while_let_expr : [ lifetime ':' ] ? "while" "let" pat '=' expr '{' block '}' ;
```
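
Both forms refutably match a pattern against an expression; `while let` repeats
as long as the match succeeds. A sketch in current Rust (not from the original
document):

```rust
fn main() {
    let mut stack = vec![1, 2, 3];
    let mut total = 0;
    // while_let_expr: loop while the pattern `Some(top)` matches
    while let Some(top) = stack.pop() {
        total += top;
    }
    assert_eq!(total, 6);

    // if_let_expr with an else tail
    let n = if let Some(n) = Some(7) { n } else { 0 };
    assert_eq!(n, 7);
}
```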

### Return expressions

```antlr
return_expr : "return" expr ? ;
```

# Type system

**FIXME:** is this entire chapter relevant here? Or should it all have been
covered by some production already?

## Types

### Primitive types

**FIXME:** grammar?

#### Machine types

**FIXME:** grammar?

#### Machine-dependent integer types

**FIXME:** grammar?

### Textual types

**FIXME:** grammar?

### Tuple types

**FIXME:** grammar?

### Array and Slice types

**FIXME:** grammar?

### Structure types

**FIXME:** grammar?

### Enumerated types

**FIXME:** grammar?

### Pointer types

**FIXME:** grammar?

### Function types

**FIXME:** grammar?

### Closure types

```antlr
closure_type := [ 'unsafe' ] [ '<' lifetime-list '>' ] '|' arg-list '|'
                [ ':' bound-list ] [ '->' type ]
lifetime-list := lifetime | lifetime ',' lifetime-list
arg-list := ident ':' type | ident ':' type ',' arg-list
```

### Never type

An empty type:

```antlr
never_type : "!" ;
```

### Object types

**FIXME:** grammar?

### Type parameters

**FIXME:** grammar?

### Type parameter bounds

```antlr
bound-list := bound | bound '+' bound-list '+' ?
bound := ty_bound | lt_bound
lt_bound := lifetime
ty_bound := ty_bound_noparen | (ty_bound_noparen)
ty_bound_noparen := [?] [ for<lt_param_defs> ] simple_path
```

### Self types

**FIXME:** grammar?

## Type kinds

**FIXME:** this is probably not relevant to the grammar...

# Memory and concurrency models

**FIXME:** is this entire chapter relevant here? Or should it all have been
covered by some production already?

## Memory model

### Memory allocation and lifetime

### Memory ownership

### Variables

### Boxes

## Threads

### Communication between threads

### Thread lifecycle

[reference]: https://doc.rust-lang.org/reference/
[grammar working group]: https://github.com/rust-lang/wg-grammar

```
*.class
*.java
*.tokens
```

```lex
%{
#include <stdio.h>
#include <ctype.h>

static int num_hashes;
static int end_hashes;
static int saw_non_hash;

%}

%option stack
%option yylineno

%x str
%x rawstr
%x rawstr_esc_begin
%x rawstr_esc_body
%x rawstr_esc_end
%x byte
%x bytestr
%x rawbytestr
%x rawbytestr_nohash
%x pound
%x shebang_or_attr
%x ltorchar
%x linecomment
%x doc_line
%x blockcomment
%x doc_block
%x suffix

ident [a-zA-Z\x80-\xff_][a-zA-Z0-9\x80-\xff_]*

%%

<suffix>{ident}            { BEGIN(INITIAL); }
<suffix>(.|\n)             { yyless(0); BEGIN(INITIAL); }

[ \n\t\r]                  { }

\xef\xbb\xbf {
  // UTF-8 byte order mark (BOM), ignore if in line 1, error otherwise
  if (yyget_lineno() != 1) {
    return -1;
  }
}

\/\/(\/|\!)                { BEGIN(doc_line); yymore(); }
<doc_line>\n               { BEGIN(INITIAL);
                             yyleng--;
                             yytext[yyleng] = 0;
                             return ((yytext[2] == '!') ? INNER_DOC_COMMENT : OUTER_DOC_COMMENT);
                           }
<doc_line>[^\n]*           { yymore(); }

\/\/|\/\/\/\/              { BEGIN(linecomment); }
<linecomment>\n            { BEGIN(INITIAL); }
<linecomment>[^\n]*        { }

\/\*(\*|\!)[^*]            { yy_push_state(INITIAL); yy_push_state(doc_block); yymore(); }
<doc_block>\/\*            { yy_push_state(doc_block); yymore(); }
<doc_block>\*\/ {
  yy_pop_state();
  if (yy_top_state() == doc_block) {
    yymore();
  } else {
    return ((yytext[2] == '!') ? INNER_DOC_COMMENT : OUTER_DOC_COMMENT);
  }
}
<doc_block>(.|\n)          { yymore(); }

\/\*                       { yy_push_state(blockcomment); }
<blockcomment>\/\*         { yy_push_state(blockcomment); }
<blockcomment>\*\/         { yy_pop_state(); }
<blockcomment>(.|\n)       { }

_          { return UNDERSCORE; }
abstract   { return ABSTRACT; }
alignof    { return ALIGNOF; }
as         { return AS; }
become     { return BECOME; }
box        { return BOX; }
break      { return BREAK; }
catch      { return CATCH; }
const      { return CONST; }
continue   { return CONTINUE; }
crate      { return CRATE; }
default    { return DEFAULT; }
do         { return DO; }
else       { return ELSE; }
enum       { return ENUM; }
extern     { return EXTERN; }
false      { return FALSE; }
final      { return FINAL; }
fn         { return FN; }
for        { return FOR; }
if         { return IF; }
impl       { return IMPL; }
in         { return IN; }
let        { return LET; }
loop       { return LOOP; }
macro      { return MACRO; }
match      { return MATCH; }
mod        { return MOD; }
move       { return MOVE; }
mut        { return MUT; }
offsetof   { return OFFSETOF; }
override   { return OVERRIDE; }
priv       { return PRIV; }
proc       { return PROC; }
pure       { return PURE; }
pub        { return PUB; }
ref        { return REF; }
return     { return RETURN; }
self       { return SELF; }
sizeof     { return SIZEOF; }
static     { return STATIC; }
struct     { return STRUCT; }
super      { return SUPER; }
trait      { return TRAIT; }
true       { return TRUE; }
type       { return TYPE; }
typeof     { return TYPEOF; }
union      { return UNION; }
unsafe     { return UNSAFE; }
unsized    { return UNSIZED; }
use        { return USE; }
virtual    { return VIRTUAL; }
where      { return WHERE; }
while      { return WHILE; }
yield      { return YIELD; }

{ident}    { return IDENT; }

0x[0-9a-fA-F_]+                            { BEGIN(suffix); return LIT_INTEGER; }
0o[0-7_]+                                  { BEGIN(suffix); return LIT_INTEGER; }
0b[01_]+                                   { BEGIN(suffix); return LIT_INTEGER; }
[0-9][0-9_]*                               { BEGIN(suffix); return LIT_INTEGER; }
[0-9][0-9_]*\.(\.|[a-zA-Z])                { yyless(yyleng - 2); BEGIN(suffix); return LIT_INTEGER; }

[0-9][0-9_]*\.[0-9_]*([eE][-\+]?[0-9_]+)?  { BEGIN(suffix); return LIT_FLOAT; }
[0-9][0-9_]*(\.[0-9_]*)?[eE][-\+]?[0-9_]+  { BEGIN(suffix); return LIT_FLOAT; }

;      { return ';'; }
,      { return ','; }
\.\.\. { return DOTDOTDOT; }
\.\.   { return DOTDOT; }
\.     { return '.'; }
\(     { return '('; }
\)     { return ')'; }
\{     { return '{'; }
\}     { return '}'; }
\[     { return '['; }
\]     { return ']'; }
@      { return '@'; }
#      { BEGIN(pound); yymore(); }
<pound>\! { BEGIN(shebang_or_attr); yymore(); }
<shebang_or_attr>\[ {
  BEGIN(INITIAL);
  yyless(2);
  return SHEBANG;
}
<shebang_or_attr>[^\[\n]*\n {
  // Since the \n was eaten as part of the token, yylineno will have
  // been incremented to the value 2 if the shebang was on the first
  // line. This yyless undoes that, setting yylineno back to 1.
  yyless(yyleng - 1);
  if (yyget_lineno() == 1) {
    BEGIN(INITIAL);
    return SHEBANG_LINE;
  } else {
    BEGIN(INITIAL);
    yyless(2);
    return SHEBANG;
  }
}
<pound>. { BEGIN(INITIAL); yyless(1); return '#'; }

\~     { return '~'; }
::     { return MOD_SEP; }
:      { return ':'; }
\$     { return '$'; }
\?     { return '?'; }

==     { return EQEQ; }
=>     { return FAT_ARROW; }
=      { return '='; }
\!=    { return NE; }
\!     { return '!'; }
\<=    { return LE; }
\<\<   { return SHL; }
\<\<=  { return SHLEQ; }
\<     { return '<'; }
\>=    { return GE; }
\>\>   { return SHR; }
\>\>=  { return SHREQ; }
\>     { return '>'; }

\x27                                      { BEGIN(ltorchar); yymore(); }
<ltorchar>static                          { BEGIN(INITIAL); return STATIC_LIFETIME; }
<ltorchar>{ident}                         { BEGIN(INITIAL); return LIFETIME; }
<ltorchar>\\[nrt\\\x27\x220]\x27          { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>\\x[0-9a-fA-F]{2}\x27           { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>\\u\{([0-9a-fA-F]_*){1,6}\}\x27 { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>.\x27                           { BEGIN(suffix); return LIT_CHAR; }
<ltorchar>[\x80-\xff]{2,4}\x27            { BEGIN(suffix); return LIT_CHAR; }
<ltorchar><<EOF>>                         { BEGIN(INITIAL); return -1; }

b\x22          { BEGIN(bytestr); yymore(); }
<bytestr>\x22  { BEGIN(suffix); return LIT_BYTE_STR; }

<bytestr><<EOF>>                     { return -1; }
<bytestr>\\[n\nrt\\\x27\x220]        { yymore(); }
<bytestr>\\x[0-9a-fA-F]{2}           { yymore(); }
<bytestr>\\u\{([0-9a-fA-F]_*){1,6}\} { yymore(); }
<bytestr>\\[^n\nrt\\\x27\x220]       { return -1; }
<bytestr>(.|\n)                      { yymore(); }

br\x22                     { BEGIN(rawbytestr_nohash); yymore(); }
<rawbytestr_nohash>\x22    { BEGIN(suffix); return LIT_BYTE_STR_RAW; }
<rawbytestr_nohash>(.|\n)  { yymore(); }
<rawbytestr_nohash><<EOF>> { return -1; }

br/# {
  BEGIN(rawbytestr);
  yymore();
  num_hashes = 0;
  saw_non_hash = 0;
  end_hashes = 0;
}
<rawbytestr># {
  if (!saw_non_hash) {
    num_hashes++;
  } else if (end_hashes != 0) {
    end_hashes++;
    if (end_hashes == num_hashes) {
      BEGIN(INITIAL);
      return LIT_BYTE_STR_RAW;
    }
  }
  yymore();
}
<rawbytestr>\x22# {
  end_hashes = 1;
  if (end_hashes == num_hashes) {
    BEGIN(INITIAL);
    return LIT_BYTE_STR_RAW;
  }
  yymore();
}
<rawbytestr>(.|\n) {
  if (!saw_non_hash) {
    saw_non_hash = 1;
  }
  if (end_hashes != 0) {
    end_hashes = 0;
  }
  yymore();
}
<rawbytestr><<EOF>> { return -1; }

b\x27                            { BEGIN(byte); yymore(); }
<byte>\\[nrt\\\x27\x220]\x27     { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\x[0-9a-fA-F]{2}\x27      { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\u([0-9a-fA-F]_*){4}\x27  { BEGIN(INITIAL); return LIT_BYTE; }
<byte>\\U([0-9a-fA-F]_*){8}\x27  { BEGIN(INITIAL); return LIT_BYTE; }
<byte>.\x27                      { BEGIN(INITIAL); return LIT_BYTE; }
<byte><<EOF>>                    { BEGIN(INITIAL); return -1; }

r\x22           { BEGIN(rawstr); yymore(); }
<rawstr>\x22    { BEGIN(suffix); return LIT_STR_RAW; }
<rawstr>(.|\n)  { yymore(); }
<rawstr><<EOF>> { return -1; }

r/# {
  BEGIN(rawstr_esc_begin);
  yymore();
  num_hashes = 0;
  saw_non_hash = 0;
  end_hashes = 0;
}

<rawstr_esc_begin># {
  num_hashes++;
  yymore();
}
<rawstr_esc_begin>\x22 {
  BEGIN(rawstr_esc_body);
  yymore();
}
<rawstr_esc_begin>(.|\n) { return -1; }

<rawstr_esc_body>\x22/# {
  BEGIN(rawstr_esc_end);
  yymore();
}
<rawstr_esc_body>(.|\n) {
  yymore();
}

<rawstr_esc_end># {
  end_hashes++;
  if (end_hashes == num_hashes) {
    BEGIN(INITIAL);
    return LIT_STR_RAW;
  }
  yymore();
}
<rawstr_esc_end>[^#] {
  end_hashes = 0;
  BEGIN(rawstr_esc_body);
  yymore();
}

<rawstr_esc_begin,rawstr_esc_body,rawstr_esc_end><<EOF>> { return -1; }

\x22      { BEGIN(str); yymore(); }
<str>\x22 { BEGIN(suffix); return LIT_STR; }

<str><<EOF>>                     { return -1; }
<str>\\[n\nr\rt\\\x27\x220]      { yymore(); }
<str>\\x[0-9a-fA-F]{2}           { yymore(); }
<str>\\u\{([0-9a-fA-F]_*){1,6}\} { yymore(); }
<str>\\[^n\nrt\\\x27\x220]       { return -1; }
<str>(.|\n)                      { yymore(); }
```
|
||||
\<- { return LARROW; }
|
||||
-\> { return RARROW; }
|
||||
- { return '-'; }
|
||||
-= { return MINUSEQ; }
|
||||
&& { return ANDAND; }
|
||||
& { return '&'; }
|
||||
&= { return ANDEQ; }
|
||||
\|\| { return OROR; }
|
||||
\| { return '|'; }
|
||||
\|= { return OREQ; }
|
||||
\+ { return '+'; }
|
||||
\+= { return PLUSEQ; }
|
||||
\* { return '*'; }
|
||||
\*= { return STAREQ; }
|
||||
\/ { return '/'; }
|
||||
\/= { return SLASHEQ; }
|
||||
\^ { return '^'; }
|
||||
\^= { return CARETEQ; }
|
||||
% { return '%'; }
|
||||
%= { return PERCENTEQ; }
|
||||
|
||||
<<EOF>> { return 0; }
|
||||
|
||||
%%
|
||||
@@ -1,193 +0,0 @@
#include <stdio.h>
#include <stdarg.h>
#include <stdlib.h>
#include <string.h>

extern int yylex();
extern int rsparse();

#define PUSHBACK_LEN 4

static char pushback[PUSHBACK_LEN];
static int verbose;

void print(const char* format, ...) {
  va_list args;
  va_start(args, format);
  if (verbose) {
    vprintf(format, args);
  }
  va_end(args);
}

// If there is a non-null char at the head of the pushback queue,
// dequeue it and shift the rest of the queue forwards. Otherwise,
// return the token from calling yylex.
int rslex() {
  if (pushback[0] == '\0') {
    return yylex();
  } else {
    char c = pushback[0];
    memmove(pushback, pushback + 1, PUSHBACK_LEN - 1);
    pushback[PUSHBACK_LEN - 1] = '\0';
    return c;
  }
}

// Note: this does nothing if the pushback queue is full. As long as
// there aren't more than PUSHBACK_LEN consecutive calls to push_back
// in an action, this shouldn't be a problem.
void push_back(char c) {
  for (int i = 0; i < PUSHBACK_LEN; ++i) {
    if (pushback[i] == '\0') {
      pushback[i] = c;
      break;
    }
  }
}

extern int rsdebug;

struct node {
  struct node *next;
  struct node *prev;
  int own_string;
  char const *name;
  int n_elems;
  struct node *elems[];
};

struct node *nodes = NULL;
int n_nodes;

struct node *mk_node(char const *name, int n, ...) {
  va_list ap;
  int i = 0;
  unsigned sz = sizeof(struct node) + (n * sizeof(struct node *));
  struct node *nn, *nd = (struct node *)malloc(sz);

  print("# New %d-ary node: %s = %p\n", n, name, nd);

  nd->own_string = 0;
  nd->prev = NULL;
  nd->next = nodes;
  if (nodes) {
    nodes->prev = nd;
  }
  nodes = nd;

  nd->name = name;
  nd->n_elems = n;

  va_start(ap, n);
  while (i < n) {
    nn = va_arg(ap, struct node *);
    print("# arg[%d]: %p\n", i, nn);
    print("# (%s ...)\n", nn->name);
    nd->elems[i++] = nn;
  }
  va_end(ap);
  n_nodes++;
  return nd;
}

struct node *mk_atom(char *name) {
  struct node *nd = mk_node((char const *)strdup(name), 0);
  nd->own_string = 1;
  return nd;
}

struct node *mk_none() {
  return mk_atom("<none>");
}

struct node *ext_node(struct node *nd, int n, ...) {
  va_list ap;
  int i = 0, c = nd->n_elems + n;
  unsigned sz = sizeof(struct node) + (c * sizeof(struct node *));
  struct node *nn;

  print("# Extending %d-ary node by %d nodes: %s = %p",
        nd->n_elems, c, nd->name, nd);

  if (nd->next) {
    nd->next->prev = nd->prev;
  }
  if (nd->prev) {
    nd->prev->next = nd->next;
  }
  nd = realloc(nd, sz);
  nd->prev = NULL;
  nd->next = nodes;
  nodes->prev = nd;
  nodes = nd;

  print(" ==> %p\n", nd);

  va_start(ap, n);
  while (i < n) {
    nn = va_arg(ap, struct node *);
    print("# arg[%d]: %p\n", i, nn);
    print("# (%s ...)\n", nn->name);
    nd->elems[nd->n_elems++] = nn;
    ++i;
  }
  va_end(ap);
  return nd;
}

int const indent_step = 4;

void print_indent(int depth) {
  while (depth) {
    if (depth-- % indent_step == 0) {
      print("|");
    } else {
      print(" ");
    }
  }
}

void print_node(struct node *n, int depth) {
  int i = 0;
  print_indent(depth);
  if (n->n_elems == 0) {
    print("%s\n", n->name);
  } else {
    print("(%s\n", n->name);
    for (i = 0; i < n->n_elems; ++i) {
      print_node(n->elems[i], depth + indent_step);
    }
    print_indent(depth);
    print(")\n");
  }
}

int main(int argc, char **argv) {
  if (argc == 2 && strcmp(argv[1], "-v") == 0) {
    verbose = 1;
  } else {
    verbose = 0;
  }
  int ret = 0;
  struct node *tmp;
  memset(pushback, '\0', PUSHBACK_LEN);
  ret = rsparse();
  print("--- PARSE COMPLETE: ret:%d, n_nodes:%d ---\n", ret, n_nodes);
  if (nodes) {
    print_node(nodes, 0);
  }
  while (nodes) {
    tmp = nodes;
    nodes = tmp->next;
    if (tmp->own_string) {
      free((void*)tmp->name);
    }
    free(tmp);
  }
  return ret;
}

void rserror(char const *s) {
  fprintf(stderr, "%s\n", s);
}
@@ -1,1982 +0,0 @@
%{
#define YYERROR_VERBOSE
#define YYSTYPE struct node *
struct node;
extern int yylex();
extern void yyerror(char const *s);
extern struct node *mk_node(char const *name, int n, ...);
extern struct node *mk_atom(char *text);
extern struct node *mk_none();
extern struct node *ext_node(struct node *nd, int n, ...);
extern void push_back(char c);
extern char *yytext;
%}
%debug

%token SHL
%token SHR
%token LE
%token EQEQ
%token NE
%token GE
%token ANDAND
%token OROR
%token SHLEQ
%token SHREQ
%token MINUSEQ
%token ANDEQ
%token OREQ
%token PLUSEQ
%token STAREQ
%token SLASHEQ
%token CARETEQ
%token PERCENTEQ
%token DOTDOT
%token DOTDOTDOT
%token MOD_SEP
%token RARROW
%token LARROW
%token FAT_ARROW
%token LIT_BYTE
%token LIT_CHAR
%token LIT_INTEGER
%token LIT_FLOAT
%token LIT_STR
%token LIT_STR_RAW
%token LIT_BYTE_STR
%token LIT_BYTE_STR_RAW
%token IDENT
%token UNDERSCORE
%token LIFETIME

// keywords
%token SELF
%token STATIC
%token ABSTRACT
%token ALIGNOF
%token AS
%token BECOME
%token BREAK
%token CATCH
%token CRATE
%token DO
%token ELSE
%token ENUM
%token EXTERN
%token FALSE
%token FINAL
%token FN
%token FOR
%token IF
%token IMPL
%token IN
%token LET
%token LOOP
%token MACRO
%token MATCH
%token MOD
%token MOVE
%token MUT
%token OFFSETOF
%token OVERRIDE
%token PRIV
%token PUB
%token PURE
%token REF
%token RETURN
%token SIZEOF
%token STRUCT
%token SUPER
%token UNION
%token UNSIZED
%token TRUE
%token TRAIT
%token TYPE
%token UNSAFE
%token VIRTUAL
%token YIELD
%token DEFAULT
%token USE
%token WHILE
%token CONTINUE
%token PROC
%token BOX
%token CONST
%token WHERE
%token TYPEOF
%token INNER_DOC_COMMENT
%token OUTER_DOC_COMMENT

%token SHEBANG
%token SHEBANG_LINE
%token STATIC_LIFETIME

/*
  Quoting from the Bison manual:

  "Finally, the resolution of conflicts works by comparing the precedence
  of the rule being considered with that of the lookahead token. If the
  token's precedence is higher, the choice is to shift. If the rule's
  precedence is higher, the choice is to reduce. If they have equal
  precedence, the choice is made based on the associativity of that
  precedence level. The verbose output file made by ‘-v’ (see Invoking
  Bison) says how each conflict was resolved"
*/

// We expect no shift/reduce or reduce/reduce conflicts in this grammar;
// all potential ambiguities are scrutinized and eliminated manually.
%expect 0

// fake-precedence symbol to cause '|' bars in lambda context to parse
// at low precedence, permit things like |x| foo = bar, where '=' is
// otherwise lower-precedence than '|'. Also used for proc() to cause
// things like proc() a + b to parse as proc() { a + b }.
%precedence LAMBDA

%precedence SELF

// MUT should be lower precedence than IDENT so that in the pat rule,
// "& MUT pat" has higher precedence than "binding_mode ident [@ pat]"
%precedence MUT

// IDENT needs to be lower than '{' so that 'foo {' is shifted when
// trying to decide if we've got a struct-construction expr (esp. in
// contexts like 'if foo { .')
//
// IDENT also needs to be lower precedence than '<' so that '<' in
// 'foo:bar . <' is shifted (in a trait reference occurring in a
// bounds list), parsing as foo:(bar<baz>) rather than (foo:bar)<baz>.
%precedence IDENT
// Put the weak keywords that can be used as idents here as well
%precedence CATCH
%precedence DEFAULT
%precedence UNION

// A couple fake-precedence symbols to use in rules associated with +
// and < in trailing type contexts. These come up when you have a type
// in the RHS of operator-AS, such as "foo as bar<baz>". The "<" there
// has to be shifted so the parser keeps trying to parse a type, even
// though it might well consider reducing the type "bar" and then
// going on to "<" as a subsequent binop. The "+" case is with
// trailing type-bounds ("foo as bar:A+B"), for the same reason.
%precedence SHIFTPLUS

%precedence MOD_SEP
%precedence RARROW ':'

// In where clauses, "for" should have greater precedence when used as
// a higher ranked constraint than when used as the beginning of a
// for_in_type (which is a ty)
%precedence FORTYPE
%precedence FOR

// Binops & unops, and their precedences
%precedence '?'
%precedence BOX
%nonassoc DOTDOT

// RETURN needs to be lower-precedence than tokens that start
// prefix_exprs
%precedence RETURN YIELD

%right '=' SHLEQ SHREQ MINUSEQ ANDEQ OREQ PLUSEQ STAREQ SLASHEQ CARETEQ PERCENTEQ
%right LARROW
%left OROR
%left ANDAND
%left EQEQ NE
%left '<' '>' LE GE
%left '|'
%left '^'
%left '&'
%left SHL SHR
%left '+' '-'
%precedence AS
%left '*' '/' '%'
%precedence '!'

%precedence '{' '[' '(' '.'

%precedence RANGE

%start crate

%%
////////////////////////////////////////////////////////////////////////
// Part 1: Items and attributes
////////////////////////////////////////////////////////////////////////

crate
: maybe_shebang inner_attrs maybe_mod_items { mk_node("crate", 2, $2, $3); }
| maybe_shebang maybe_mod_items { mk_node("crate", 1, $2); }
;

maybe_shebang
: SHEBANG_LINE
| %empty
;

maybe_inner_attrs
: inner_attrs
| %empty { $$ = mk_none(); }
;

inner_attrs
: inner_attr { $$ = mk_node("InnerAttrs", 1, $1); }
| inner_attrs inner_attr { $$ = ext_node($1, 1, $2); }
;

inner_attr
: SHEBANG '[' meta_item ']' { $$ = mk_node("InnerAttr", 1, $3); }
| INNER_DOC_COMMENT { $$ = mk_node("InnerAttr", 1, mk_node("doc-comment", 1, mk_atom(yytext))); }
;

maybe_outer_attrs
: outer_attrs
| %empty { $$ = mk_none(); }
;

outer_attrs
: outer_attr { $$ = mk_node("OuterAttrs", 1, $1); }
| outer_attrs outer_attr { $$ = ext_node($1, 1, $2); }
;

outer_attr
: '#' '[' meta_item ']' { $$ = $3; }
| OUTER_DOC_COMMENT { $$ = mk_node("doc-comment", 1, mk_atom(yytext)); }
;

meta_item
: ident { $$ = mk_node("MetaWord", 1, $1); }
| ident '=' lit { $$ = mk_node("MetaNameValue", 2, $1, $3); }
| ident '(' meta_seq ')' { $$ = mk_node("MetaList", 2, $1, $3); }
| ident '(' meta_seq ',' ')' { $$ = mk_node("MetaList", 2, $1, $3); }
;

meta_seq
: %empty { $$ = mk_none(); }
| meta_item { $$ = mk_node("MetaItems", 1, $1); }
| meta_seq ',' meta_item { $$ = ext_node($1, 1, $3); }
;

maybe_mod_items
: mod_items
| %empty { $$ = mk_none(); }
;

mod_items
: mod_item { $$ = mk_node("Items", 1, $1); }
| mod_items mod_item { $$ = ext_node($1, 1, $2); }
;

attrs_and_vis
: maybe_outer_attrs visibility { $$ = mk_node("AttrsAndVis", 2, $1, $2); }
;

mod_item
: attrs_and_vis item { $$ = mk_node("Item", 2, $1, $2); }
;

// items that can appear outside of a fn block
item
: stmt_item
| item_macro
;

// items that can appear in "stmts"
stmt_item
: item_static
| item_const
| item_type
| block_item
| view_item
;

item_static
: STATIC ident ':' ty '=' expr ';' { $$ = mk_node("ItemStatic", 3, $2, $4, $6); }
| STATIC MUT ident ':' ty '=' expr ';' { $$ = mk_node("ItemStatic", 3, $3, $5, $7); }
;

item_const
: CONST ident ':' ty '=' expr ';' { $$ = mk_node("ItemConst", 3, $2, $4, $6); }
;

item_macro
: path_expr '!' maybe_ident parens_delimited_token_trees ';' { $$ = mk_node("ItemMacro", 3, $1, $3, $4); }
| path_expr '!' maybe_ident braces_delimited_token_trees { $$ = mk_node("ItemMacro", 3, $1, $3, $4); }
| path_expr '!' maybe_ident brackets_delimited_token_trees ';'{ $$ = mk_node("ItemMacro", 3, $1, $3, $4); }
;

view_item
: use_item
| extern_fn_item
| EXTERN CRATE ident ';' { $$ = mk_node("ViewItemExternCrate", 1, $3); }
| EXTERN CRATE ident AS ident ';' { $$ = mk_node("ViewItemExternCrate", 2, $3, $5); }
;

extern_fn_item
: EXTERN maybe_abi item_fn { $$ = mk_node("ViewItemExternFn", 2, $2, $3); }
;

use_item
: USE view_path ';' { $$ = mk_node("ViewItemUse", 1, $2); }
;

view_path
: path_no_types_allowed { $$ = mk_node("ViewPathSimple", 1, $1); }
| path_no_types_allowed MOD_SEP '{' '}' { $$ = mk_node("ViewPathList", 2, $1, mk_atom("ViewPathListEmpty")); }
| MOD_SEP '{' '}' { $$ = mk_node("ViewPathList", 1, mk_atom("ViewPathListEmpty")); }
| path_no_types_allowed MOD_SEP '{' idents_or_self '}' { $$ = mk_node("ViewPathList", 2, $1, $4); }
| MOD_SEP '{' idents_or_self '}' { $$ = mk_node("ViewPathList", 1, $3); }
| path_no_types_allowed MOD_SEP '{' idents_or_self ',' '}' { $$ = mk_node("ViewPathList", 2, $1, $4); }
| MOD_SEP '{' idents_or_self ',' '}' { $$ = mk_node("ViewPathList", 1, $3); }
| path_no_types_allowed MOD_SEP '*' { $$ = mk_node("ViewPathGlob", 1, $1); }
| MOD_SEP '*' { $$ = mk_atom("ViewPathGlob"); }
| '*' { $$ = mk_atom("ViewPathGlob"); }
| '{' '}' { $$ = mk_atom("ViewPathListEmpty"); }
| '{' idents_or_self '}' { $$ = mk_node("ViewPathList", 1, $2); }
| '{' idents_or_self ',' '}' { $$ = mk_node("ViewPathList", 1, $2); }
| path_no_types_allowed AS ident { $$ = mk_node("ViewPathSimple", 2, $1, $3); }
;

block_item
: item_fn
| item_unsafe_fn
| item_mod
| item_foreign_mod { $$ = mk_node("ItemForeignMod", 1, $1); }
| item_struct
| item_enum
| item_union
| item_trait
| item_impl
;

maybe_ty_ascription
: ':' ty_sum { $$ = $2; }
| %empty { $$ = mk_none(); }
;

maybe_init_expr
: '=' expr { $$ = $2; }
| %empty { $$ = mk_none(); }
;

// structs
item_struct
: STRUCT ident generic_params maybe_where_clause struct_decl_args
{
  $$ = mk_node("ItemStruct", 4, $2, $3, $4, $5);
}
| STRUCT ident generic_params struct_tuple_args maybe_where_clause ';'
{
  $$ = mk_node("ItemStruct", 4, $2, $3, $4, $5);
}
| STRUCT ident generic_params maybe_where_clause ';'
{
  $$ = mk_node("ItemStruct", 3, $2, $3, $4);
}
;

struct_decl_args
: '{' struct_decl_fields '}' { $$ = $2; }
| '{' struct_decl_fields ',' '}' { $$ = $2; }
;

struct_tuple_args
: '(' struct_tuple_fields ')' { $$ = $2; }
| '(' struct_tuple_fields ',' ')' { $$ = $2; }
;

struct_decl_fields
: struct_decl_field { $$ = mk_node("StructFields", 1, $1); }
| struct_decl_fields ',' struct_decl_field { $$ = ext_node($1, 1, $3); }
| %empty { $$ = mk_none(); }
;

struct_decl_field
: attrs_and_vis ident ':' ty_sum { $$ = mk_node("StructField", 3, $1, $2, $4); }
;

struct_tuple_fields
: struct_tuple_field { $$ = mk_node("StructFields", 1, $1); }
| struct_tuple_fields ',' struct_tuple_field { $$ = ext_node($1, 1, $3); }
| %empty { $$ = mk_none(); }
;

struct_tuple_field
: attrs_and_vis ty_sum { $$ = mk_node("StructField", 2, $1, $2); }
;

// enums
item_enum
: ENUM ident generic_params maybe_where_clause '{' enum_defs '}' { $$ = mk_node("ItemEnum", 0); }
| ENUM ident generic_params maybe_where_clause '{' enum_defs ',' '}' { $$ = mk_node("ItemEnum", 0); }
;

enum_defs
: enum_def { $$ = mk_node("EnumDefs", 1, $1); }
| enum_defs ',' enum_def { $$ = ext_node($1, 1, $3); }
| %empty { $$ = mk_none(); }
;

enum_def
: attrs_and_vis ident enum_args { $$ = mk_node("EnumDef", 3, $1, $2, $3); }
;

enum_args
: '{' struct_decl_fields '}' { $$ = mk_node("EnumArgs", 1, $2); }
| '{' struct_decl_fields ',' '}' { $$ = mk_node("EnumArgs", 1, $2); }
| '(' maybe_ty_sums ')' { $$ = mk_node("EnumArgs", 1, $2); }
| '=' expr { $$ = mk_node("EnumArgs", 1, $2); }
| %empty { $$ = mk_none(); }
;
// unions
|
||||
item_union
|
||||
: UNION ident generic_params maybe_where_clause '{' struct_decl_fields '}' { $$ = mk_node("ItemUnion", 0); }
|
||||
| UNION ident generic_params maybe_where_clause '{' struct_decl_fields ',' '}' { $$ = mk_node("ItemUnion", 0); }
|
||||
|
||||
item_mod
|
||||
: MOD ident ';' { $$ = mk_node("ItemMod", 1, $2); }
|
||||
| MOD ident '{' maybe_mod_items '}' { $$ = mk_node("ItemMod", 2, $2, $4); }
|
||||
| MOD ident '{' inner_attrs maybe_mod_items '}' { $$ = mk_node("ItemMod", 3, $2, $4, $5); }
|
||||
;
|
||||
|
||||
item_foreign_mod
|
||||
: EXTERN maybe_abi '{' maybe_foreign_items '}' { $$ = mk_node("ItemForeignMod", 1, $4); }
|
||||
| EXTERN maybe_abi '{' inner_attrs maybe_foreign_items '}' { $$ = mk_node("ItemForeignMod", 2, $4, $5); }
|
||||
;
|
||||
|
||||
maybe_abi
|
||||
: str
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
maybe_foreign_items
|
||||
: foreign_items
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
foreign_items
|
||||
: foreign_item { $$ = mk_node("ForeignItems", 1, $1); }
|
||||
| foreign_items foreign_item { $$ = ext_node($1, 1, $2); }
|
||||
;
|
||||
|
||||
foreign_item
|
||||
: attrs_and_vis STATIC item_foreign_static { $$ = mk_node("ForeignItem", 2, $1, $3); }
|
||||
| attrs_and_vis item_foreign_fn { $$ = mk_node("ForeignItem", 2, $1, $2); }
|
||||
| attrs_and_vis UNSAFE item_foreign_fn { $$ = mk_node("ForeignItem", 2, $1, $3); }
|
||||
;
|
||||
|
||||
item_foreign_static
|
||||
: maybe_mut ident ':' ty ';' { $$ = mk_node("StaticItem", 3, $1, $2, $4); }
|
||||
;
|
||||
|
||||
item_foreign_fn
|
||||
: FN ident generic_params fn_decl_allow_variadic maybe_where_clause ';' { $$ = mk_node("ForeignFn", 4, $2, $3, $4, $5); }
|
||||
;
|
||||
|
||||
fn_decl_allow_variadic
|
||||
: fn_params_allow_variadic ret_ty { $$ = mk_node("FnDecl", 2, $1, $2); }
|
||||
;
|
||||
|
||||
fn_params_allow_variadic
|
||||
: '(' ')' { $$ = mk_none(); }
|
||||
| '(' params ')' { $$ = $2; }
|
||||
| '(' params ',' ')' { $$ = $2; }
|
||||
| '(' params ',' DOTDOTDOT ')' { $$ = $2; }
|
||||
;
|
||||
|
||||
visibility
|
||||
: PUB { $$ = mk_atom("Public"); }
|
||||
| %empty { $$ = mk_atom("Inherited"); }
|
||||
;
|
||||
|
||||
idents_or_self
|
||||
: ident_or_self { $$ = mk_node("IdentsOrSelf", 1, $1); }
|
||||
| idents_or_self AS ident { $$ = mk_node("IdentsOrSelf", 2, $1, $3); }
|
||||
| idents_or_self ',' ident_or_self { $$ = ext_node($1, 1, $3); }
|
||||
;
|
||||
|
||||
ident_or_self
|
||||
: ident
|
||||
| SELF { $$ = mk_atom(yytext); }
|
||||
;
|
||||
|
||||
item_type
|
||||
: TYPE ident generic_params maybe_where_clause '=' ty_sum ';' { $$ = mk_node("ItemTy", 4, $2, $3, $4, $6); }
|
||||
;
|
||||
|
||||
for_sized
|
||||
: FOR '?' ident { $$ = mk_node("ForSized", 1, $3); }
|
||||
| FOR ident '?' { $$ = mk_node("ForSized", 1, $2); }
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
item_trait
|
||||
: maybe_unsafe TRAIT ident generic_params for_sized maybe_ty_param_bounds maybe_where_clause '{' maybe_trait_items '}'
|
||||
{
|
||||
$$ = mk_node("ItemTrait", 7, $1, $3, $4, $5, $6, $7, $9);
|
||||
}
|
||||
;
|
||||
|
||||
maybe_trait_items
|
||||
: trait_items
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
trait_items
|
||||
: trait_item { $$ = mk_node("TraitItems", 1, $1); }
|
||||
| trait_items trait_item { $$ = ext_node($1, 1, $2); }
|
||||
;
|
||||
|
||||
trait_item
|
||||
: trait_const
|
||||
| trait_type
|
||||
| trait_method
|
||||
| maybe_outer_attrs item_macro { $$ = mk_node("TraitMacroItem", 2, $1, $2); }
|
||||
;
|
||||
|
||||
trait_const
|
||||
: maybe_outer_attrs CONST ident maybe_ty_ascription maybe_const_default ';' { $$ = mk_node("ConstTraitItem", 4, $1, $3, $4, $5); }
|
||||
;
|
||||
|
||||
maybe_const_default
|
||||
: '=' expr { $$ = mk_node("ConstDefault", 1, $2); }
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
trait_type
|
||||
: maybe_outer_attrs TYPE ty_param ';' { $$ = mk_node("TypeTraitItem", 2, $1, $3); }
|
||||
;
|
||||
|
||||
maybe_unsafe
|
||||
: UNSAFE { $$ = mk_atom("Unsafe"); }
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
maybe_default_maybe_unsafe
|
||||
: DEFAULT UNSAFE { $$ = mk_atom("DefaultUnsafe"); }
|
||||
| DEFAULT { $$ = mk_atom("Default"); }
|
||||
| UNSAFE { $$ = mk_atom("Unsafe"); }
|
||||
| %empty { $$ = mk_none(); }
|
||||
|
||||
trait_method
|
||||
: type_method { $$ = mk_node("Required", 1, $1); }
|
||||
| method { $$ = mk_node("Provided", 1, $1); }
|
||||
;
|
||||
|
||||
type_method
|
||||
: maybe_outer_attrs maybe_unsafe FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause ';'
|
||||
{
|
||||
$$ = mk_node("TypeMethod", 6, $1, $2, $4, $5, $6, $7);
|
||||
}
|
||||
| maybe_outer_attrs CONST maybe_unsafe FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause ';'
|
||||
{
|
||||
$$ = mk_node("TypeMethod", 6, $1, $3, $5, $6, $7, $8);
|
||||
}
|
||||
| maybe_outer_attrs maybe_unsafe EXTERN maybe_abi FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause ';'
|
||||
{
|
||||
$$ = mk_node("TypeMethod", 7, $1, $2, $4, $6, $7, $8, $9);
|
||||
}
|
||||
;
|
||||
|
||||
method
|
||||
: maybe_outer_attrs maybe_unsafe FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 7, $1, $2, $4, $5, $6, $7, $8);
|
||||
}
|
||||
| maybe_outer_attrs CONST maybe_unsafe FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 7, $1, $3, $5, $6, $7, $8, $9);
|
||||
}
|
||||
| maybe_outer_attrs maybe_unsafe EXTERN maybe_abi FN ident generic_params fn_decl_with_self_allow_anon_params maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 8, $1, $2, $4, $6, $7, $8, $9, $10);
|
||||
}
|
||||
;
|
||||
|
||||
impl_method
|
||||
: attrs_and_vis maybe_default maybe_unsafe FN ident generic_params fn_decl_with_self maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 8, $1, $2, $3, $5, $6, $7, $8, $9);
|
||||
}
|
||||
| attrs_and_vis maybe_default CONST maybe_unsafe FN ident generic_params fn_decl_with_self maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 8, $1, $2, $4, $6, $7, $8, $9, $10);
|
||||
}
|
||||
| attrs_and_vis maybe_default maybe_unsafe EXTERN maybe_abi FN ident generic_params fn_decl_with_self maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("Method", 9, $1, $2, $3, $5, $7, $8, $9, $10, $11);
|
||||
}
|
||||
;
|
||||
|
||||
// There are two forms of impl:
|
||||
//
|
||||
// impl (<...>)? TY { ... }
|
||||
// impl (<...>)? TRAIT for TY { ... }
|
||||
//
|
||||
// Unfortunately since TY can begin with '<' itself -- as part of a
|
||||
// TyQualifiedPath type -- there's an s/r conflict when we see '<' after IMPL:
|
||||
// should we reduce one of the early rules of TY (such as maybe_once)
|
||||
// or shall we continue shifting into the generic_params list for the
|
||||
// impl?
|
||||
//
|
||||
// The production parser disambiguates a different case here by
|
||||
// permitting / requiring the user to provide parens around types when
|
||||
// they are ambiguous with traits. We do the same here, regrettably,
|
||||
// by splitting ty into ty and ty_prim.
|
||||
item_impl
|
||||
: maybe_default_maybe_unsafe IMPL generic_params ty_prim_sum maybe_where_clause '{' maybe_inner_attrs maybe_impl_items '}'
|
||||
{
|
||||
$$ = mk_node("ItemImpl", 6, $1, $3, $4, $5, $7, $8);
|
||||
}
|
||||
| maybe_default_maybe_unsafe IMPL generic_params '(' ty ')' maybe_where_clause '{' maybe_inner_attrs maybe_impl_items '}'
|
||||
{
|
||||
$$ = mk_node("ItemImpl", 6, $1, $3, 5, $6, $9, $10);
|
||||
}
|
||||
| maybe_default_maybe_unsafe IMPL generic_params trait_ref FOR ty_sum maybe_where_clause '{' maybe_inner_attrs maybe_impl_items '}'
|
||||
{
|
||||
$$ = mk_node("ItemImpl", 6, $3, $4, $6, $7, $9, $10);
|
||||
}
|
||||
| maybe_default_maybe_unsafe IMPL generic_params '!' trait_ref FOR ty_sum maybe_where_clause '{' maybe_inner_attrs maybe_impl_items '}'
|
||||
{
|
||||
$$ = mk_node("ItemImplNeg", 7, $1, $3, $5, $7, $8, $10, $11);
|
||||
}
|
||||
| maybe_default_maybe_unsafe IMPL generic_params trait_ref FOR DOTDOT '{' '}'
|
||||
{
|
||||
$$ = mk_node("ItemImplDefault", 3, $1, $3, $4);
|
||||
}
|
||||
| maybe_default_maybe_unsafe IMPL generic_params '!' trait_ref FOR DOTDOT '{' '}'
|
||||
{
|
||||
$$ = mk_node("ItemImplDefaultNeg", 3, $1, $3, $4);
|
||||
}
|
||||
;
|
||||
|
||||
maybe_impl_items
|
||||
: impl_items
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
impl_items
|
||||
: impl_item { $$ = mk_node("ImplItems", 1, $1); }
|
||||
| impl_item impl_items { $$ = ext_node($1, 1, $2); }
|
||||
;
|
||||
|
||||
impl_item
|
||||
: impl_method
|
||||
| attrs_and_vis item_macro { $$ = mk_node("ImplMacroItem", 2, $1, $2); }
|
||||
| impl_const
|
||||
| impl_type
|
||||
;
|
||||
|
||||
maybe_default
|
||||
: DEFAULT { $$ = mk_atom("Default"); }
|
||||
| %empty { $$ = mk_none(); }
|
||||
;
|
||||
|
||||
impl_const
|
||||
: attrs_and_vis maybe_default item_const { $$ = mk_node("ImplConst", 3, $1, $2, $3); }
|
||||
;
|
||||
|
||||
impl_type
|
||||
: attrs_and_vis maybe_default TYPE ident generic_params '=' ty_sum ';' { $$ = mk_node("ImplType", 5, $1, $2, $4, $5, $7); }
|
||||
;
|
||||
|
||||
item_fn
|
||||
: FN ident generic_params fn_decl maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("ItemFn", 5, $2, $3, $4, $5, $6);
|
||||
}
|
||||
| CONST FN ident generic_params fn_decl maybe_where_clause inner_attrs_and_block
|
||||
{
|
||||
$$ = mk_node("ItemFn", 5, $3, $4, $5, $6, $7);
|
||||
}
|
||||
;
item_unsafe_fn
: UNSAFE FN ident generic_params fn_decl maybe_where_clause inner_attrs_and_block
{
  $$ = mk_node("ItemUnsafeFn", 5, $3, $4, $5, $6, $7);
}
| CONST UNSAFE FN ident generic_params fn_decl maybe_where_clause inner_attrs_and_block
{
  $$ = mk_node("ItemUnsafeFn", 5, $4, $5, $6, $7, $8);
}
| UNSAFE EXTERN maybe_abi FN ident generic_params fn_decl maybe_where_clause inner_attrs_and_block
{
  $$ = mk_node("ItemUnsafeFn", 6, $3, $5, $6, $7, $8, $9);
}
;

fn_decl
: fn_params ret_ty { $$ = mk_node("FnDecl", 2, $1, $2); }
;

fn_decl_with_self
: fn_params_with_self ret_ty { $$ = mk_node("FnDecl", 2, $1, $2); }
;

fn_decl_with_self_allow_anon_params
: fn_anon_params_with_self ret_ty { $$ = mk_node("FnDecl", 2, $1, $2); }
;

fn_params
: '(' maybe_params ')' { $$ = $2; }
;

fn_anon_params
: '(' anon_param anon_params_allow_variadic_tail ')' { $$ = ext_node($2, 1, $3); }
| '(' ')' { $$ = mk_none(); }
;

fn_params_with_self
: '(' maybe_mut SELF maybe_ty_ascription maybe_comma_params ')' { $$ = mk_node("SelfLower", 3, $2, $4, $5); }
| '(' '&' maybe_mut SELF maybe_ty_ascription maybe_comma_params ')' { $$ = mk_node("SelfRegion", 3, $3, $5, $6); }
| '(' '&' lifetime maybe_mut SELF maybe_ty_ascription maybe_comma_params ')' { $$ = mk_node("SelfRegion", 4, $3, $4, $6, $7); }
| '(' maybe_params ')' { $$ = mk_node("SelfStatic", 1, $2); }
;
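The `fn_params_with_self` alternatives correspond to the receiver forms in surface Rust: `maybe_mut SELF` is a by-value receiver, `'&' maybe_mut SELF` a borrowed one. A small sketch:

```rust
struct Counter {
    n: u32,
}

impl Counter {
    fn get(&self) -> u32 {       // '&' SELF (SelfRegion)
        self.n
    }
    fn bump(&mut self) {         // '&' MUT SELF (SelfRegion, mutable)
        self.n += 1;
    }
    fn into_inner(self) -> u32 { // SELF by value (SelfLower)
        self.n
    }
}

fn main() {
    let mut c = Counter { n: 0 };
    c.bump();
    assert_eq!(c.get(), 1);
    assert_eq!(c.into_inner(), 1);
}
```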
fn_anon_params_with_self
: '(' maybe_mut SELF maybe_ty_ascription maybe_comma_anon_params ')' { $$ = mk_node("SelfLower", 3, $2, $4, $5); }
| '(' '&' maybe_mut SELF maybe_ty_ascription maybe_comma_anon_params ')' { $$ = mk_node("SelfRegion", 3, $3, $5, $6); }
| '(' '&' lifetime maybe_mut SELF maybe_ty_ascription maybe_comma_anon_params ')' { $$ = mk_node("SelfRegion", 4, $3, $4, $6, $7); }
| '(' maybe_anon_params ')' { $$ = mk_node("SelfStatic", 1, $2); }
;

maybe_params
: params
| params ','
| %empty { $$ = mk_none(); }
;

params
: param { $$ = mk_node("Args", 1, $1); }
| params ',' param { $$ = ext_node($1, 1, $3); }
;

param
: pat ':' ty_sum { $$ = mk_node("Arg", 2, $1, $3); }
;

inferrable_params
: inferrable_param { $$ = mk_node("InferrableParams", 1, $1); }
| inferrable_params ',' inferrable_param { $$ = ext_node($1, 1, $3); }
;

inferrable_param
: pat maybe_ty_ascription { $$ = mk_node("InferrableParam", 2, $1, $2); }
;

maybe_comma_params
: ',' { $$ = mk_none(); }
| ',' params { $$ = $2; }
| ',' params ',' { $$ = $2; }
| %empty { $$ = mk_none(); }
;

maybe_comma_anon_params
: ',' { $$ = mk_none(); }
| ',' anon_params { $$ = $2; }
| ',' anon_params ',' { $$ = $2; }
| %empty { $$ = mk_none(); }
;

maybe_anon_params
: anon_params
| anon_params ','
| %empty { $$ = mk_none(); }
;

anon_params
: anon_param { $$ = mk_node("Args", 1, $1); }
| anon_params ',' anon_param { $$ = ext_node($1, 1, $3); }
;

// anon means it's allowed to be anonymous (type-only), but it can
// still have a name
anon_param
: named_arg ':' ty { $$ = mk_node("Arg", 2, $1, $3); }
| ty
;

anon_params_allow_variadic_tail
: ',' DOTDOTDOT { $$ = mk_none(); }
| ',' anon_param anon_params_allow_variadic_tail { $$ = mk_node("Args", 2, $2, $3); }
| %empty { $$ = mk_none(); }
;

named_arg
: ident
| UNDERSCORE { $$ = mk_atom("PatWild"); }
| '&' ident { $$ = $2; }
| '&' UNDERSCORE { $$ = mk_atom("PatWild"); }
| ANDAND ident { $$ = $2; }
| ANDAND UNDERSCORE { $$ = mk_atom("PatWild"); }
| MUT ident { $$ = $2; }
;

ret_ty
: RARROW '!' { $$ = mk_none(); }
| RARROW ty { $$ = mk_node("ret-ty", 1, $2); }
| %prec IDENT %empty { $$ = mk_none(); }
;
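The `RARROW '!'` alternative of `ret_ty` is the diverging return type: a function that never returns normally. A sketch of the surface syntax:

```rust
// `-> !`: this function always panics, so it never returns.
fn bail(msg: &str) -> ! {
    panic!("{}", msg);
}

fn checked_div(a: i32, b: i32) -> i32 {
    if b == 0 {
        bail("division by zero"); // `!` coerces to any type, including i32
    }
    a / b
}

fn main() {
    assert_eq!(checked_div(10, 2), 5);
}
```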
generic_params
: '<' '>' { $$ = mk_node("Generics", 2, mk_none(), mk_none()); }
| '<' lifetimes '>' { $$ = mk_node("Generics", 2, $2, mk_none()); }
| '<' lifetimes ',' '>' { $$ = mk_node("Generics", 2, $2, mk_none()); }
| '<' lifetimes SHR { push_back('>'); $$ = mk_node("Generics", 2, $2, mk_none()); }
| '<' lifetimes ',' SHR { push_back('>'); $$ = mk_node("Generics", 2, $2, mk_none()); }
| '<' lifetimes ',' ty_params '>' { $$ = mk_node("Generics", 2, $2, $4); }
| '<' lifetimes ',' ty_params ',' '>' { $$ = mk_node("Generics", 2, $2, $4); }
| '<' lifetimes ',' ty_params SHR { push_back('>'); $$ = mk_node("Generics", 2, $2, $4); }
| '<' lifetimes ',' ty_params ',' SHR { push_back('>'); $$ = mk_node("Generics", 2, $2, $4); }
| '<' ty_params '>' { $$ = mk_node("Generics", 2, mk_none(), $2); }
| '<' ty_params ',' '>' { $$ = mk_node("Generics", 2, mk_none(), $2); }
| '<' ty_params SHR { push_back('>'); $$ = mk_node("Generics", 2, mk_none(), $2); }
| '<' ty_params ',' SHR { push_back('>'); $$ = mk_node("Generics", 2, mk_none(), $2); }
| %empty { $$ = mk_none(); }
;

maybe_where_clause
: %empty { $$ = mk_none(); }
| where_clause
;

where_clause
: WHERE where_predicates { $$ = mk_node("WhereClause", 1, $2); }
| WHERE where_predicates ',' { $$ = mk_node("WhereClause", 1, $2); }
;

where_predicates
: where_predicate { $$ = mk_node("WherePredicates", 1, $1); }
| where_predicates ',' where_predicate { $$ = ext_node($1, 1, $3); }
;

where_predicate
: maybe_for_lifetimes lifetime ':' bounds { $$ = mk_node("WherePredicate", 3, $1, $2, $4); }
| maybe_for_lifetimes ty ':' ty_param_bounds { $$ = mk_node("WherePredicate", 3, $1, $2, $4); }
;
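The `maybe_for_lifetimes` prefix in `where_predicate` is the `for<'a>` binder of a higher-ranked trait bound. A small sketch of the syntax it accepts:

```rust
// `where for<'a> F: Fn(&'a i32) -> i32`: F must accept a reference of
// *any* lifetime, not one fixed caller-chosen lifetime.
fn apply_to_local<F>(f: F) -> i32
where
    for<'a> F: Fn(&'a i32) -> i32,
{
    let x = 41; // local: its lifetime is not nameable by the caller
    f(&x) + 1
}

fn main() {
    assert_eq!(apply_to_local(|r| *r), 42);
}
```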
maybe_for_lifetimes
: FOR '<' lifetimes '>' { $$ = mk_none(); }
| %prec FORTYPE %empty { $$ = mk_none(); }
;

ty_params
: ty_param { $$ = mk_node("TyParams", 1, $1); }
| ty_params ',' ty_param { $$ = ext_node($1, 1, $3); }
;

// A path with no type parameters; e.g. `foo::bar::Baz`
//
// These show up in 'use' view-items, because these are processed
// without respect to types.
path_no_types_allowed
: ident { $$ = mk_node("ViewPath", 1, $1); }
| MOD_SEP ident { $$ = mk_node("ViewPath", 1, $2); }
| SELF { $$ = mk_node("ViewPath", 1, mk_atom("Self")); }
| MOD_SEP SELF { $$ = mk_node("ViewPath", 1, mk_atom("Self")); }
| SUPER { $$ = mk_node("ViewPath", 1, mk_atom("Super")); }
| MOD_SEP SUPER { $$ = mk_node("ViewPath", 1, mk_atom("Super")); }
| path_no_types_allowed MOD_SEP ident { $$ = ext_node($1, 1, $3); }
;

// A path with a lifetime and type parameters, with no double colons
// before the type parameters; e.g. `foo::bar<'a>::Baz<T>`
//
// These show up in "trait references", the components of
// type-parameter bounds lists, as well as in the prefix of the
// path_generic_args_and_bounds rule, which is the full form of a
// named typed expression.
//
// They do not have (nor need) an extra '::' before '<' because
// unlike in expr context, there are no "less-than" type exprs to
// be ambiguous with.
path_generic_args_without_colons
: %prec IDENT
  ident { $$ = mk_node("components", 1, $1); }
| %prec IDENT
  ident generic_args { $$ = mk_node("components", 2, $1, $2); }
| %prec IDENT
  ident '(' maybe_ty_sums ')' ret_ty { $$ = mk_node("components", 2, $1, $3); }
| %prec IDENT
  path_generic_args_without_colons MOD_SEP ident { $$ = ext_node($1, 1, $3); }
| %prec IDENT
  path_generic_args_without_colons MOD_SEP ident generic_args { $$ = ext_node($1, 2, $3, $4); }
| %prec IDENT
  path_generic_args_without_colons MOD_SEP ident '(' maybe_ty_sums ')' ret_ty { $$ = ext_node($1, 2, $3, $5); }
;

generic_args
: '<' generic_values '>' { $$ = $2; }
| '<' generic_values SHR { push_back('>'); $$ = $2; }
| '<' generic_values GE { push_back('='); $$ = $2; }
| '<' generic_values SHREQ { push_back('>'); push_back('='); $$ = $2; }
// If generic_args starts with "<<", the first arg must be a
// TyQualifiedPath because that's the only type that can start with a
// '<'. This rule parses that as the first ty_sum and then continues
// with the rest of generic_values.
| SHL ty_qualified_path_and_generic_values '>' { $$ = $2; }
| SHL ty_qualified_path_and_generic_values SHR { push_back('>'); $$ = $2; }
| SHL ty_qualified_path_and_generic_values GE { push_back('='); $$ = $2; }
| SHL ty_qualified_path_and_generic_values SHREQ { push_back('>'); push_back('='); $$ = $2; }
;
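The `SHR`/`SHL` alternatives above exist because the lexer tokenizes `>>` and `<<` as shift operators; the parser splits them back into angle brackets (the `push_back('>')` actions) when they close nested generics or open a qualified-path argument. Both cases in surface Rust, as a sketch:

```rust
// Leading `SHL` case: the first generic argument is a qualified path
// type, so the token stream begins with `<<`.
fn boxed_first(v: Vec<i32>) -> Box<<Vec<i32> as IntoIterator>::Item> {
    Box::new(v.into_iter().next().unwrap_or(0))
}

fn main() {
    // Closing `SHR` case: `Vec<Vec<i32>>` ends in a `>>` token that the
    // parser treats as two '>' tokens.
    let v: Vec<Vec<i32>> = vec![vec![1, 2], vec![3]];
    let flat: Vec<i32> = v.into_iter().flatten().collect();
    assert_eq!(*boxed_first(flat), 1);
}
```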
generic_values
: maybe_ty_sums_and_or_bindings { $$ = mk_node("GenericValues", 1, $1); }
;

maybe_ty_sums_and_or_bindings
: ty_sums
| ty_sums ','
| ty_sums ',' bindings { $$ = mk_node("TySumsAndBindings", 2, $1, $3); }
| bindings
| bindings ','
| %empty { $$ = mk_none(); }
;

maybe_bindings
: ',' bindings { $$ = $2; }
| %empty { $$ = mk_none(); }
;

////////////////////////////////////////////////////////////////////////
// Part 2: Patterns
////////////////////////////////////////////////////////////////////////

pat
: UNDERSCORE { $$ = mk_atom("PatWild"); }
| '&' pat { $$ = mk_node("PatRegion", 1, $2); }
| '&' MUT pat { $$ = mk_node("PatRegion", 1, $3); }
| ANDAND pat { $$ = mk_node("PatRegion", 1, mk_node("PatRegion", 1, $2)); }
| '(' ')' { $$ = mk_atom("PatUnit"); }
| '(' pat_tup ')' { $$ = mk_node("PatTup", 1, $2); }
| '[' pat_vec ']' { $$ = mk_node("PatVec", 1, $2); }
| lit_or_path
| lit_or_path DOTDOTDOT lit_or_path { $$ = mk_node("PatRange", 2, $1, $3); }
| path_expr '{' pat_struct '}' { $$ = mk_node("PatStruct", 2, $1, $3); }
| path_expr '(' ')' { $$ = mk_node("PatEnum", 2, $1, mk_none()); }
| path_expr '(' pat_tup ')' { $$ = mk_node("PatEnum", 2, $1, $3); }
| path_expr '!' maybe_ident delimited_token_trees { $$ = mk_node("PatMac", 3, $1, $3, $4); }
| binding_mode ident { $$ = mk_node("PatIdent", 2, $1, $2); }
| ident '@' pat { $$ = mk_node("PatIdent", 3, mk_node("BindByValue", 1, mk_atom("MutImmutable")), $1, $3); }
| binding_mode ident '@' pat { $$ = mk_node("PatIdent", 3, $1, $2, $4); }
| BOX pat { $$ = mk_node("PatUniq", 1, $2); }
| '<' ty_sum maybe_as_trait_ref '>' MOD_SEP ident { $$ = mk_node("PatQualifiedPath", 3, $2, $3, $6); }
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident maybe_as_trait_ref '>' MOD_SEP ident
{
  $$ = mk_node("PatQualifiedPath", 3, mk_node("PatQualifiedPath", 3, $2, $3, $6), $7, $10);
}
;
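The `lit_or_path DOTDOTDOT lit_or_path` alternative is the inclusive range pattern; this legacy grammar spells it `...`, while current Rust writes `..=`. A sketch of the equivalent modern syntax:

```rust
// Inclusive range pattern (`lit ... lit` in this grammar, `..=` today).
fn is_ascii_lower(c: char) -> bool {
    matches!(c, 'a'..='z')
}

fn main() {
    assert!(is_ascii_lower('q'));
    assert!(!is_ascii_lower('Q'));
}
```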
pats_or
: pat { $$ = mk_node("Pats", 1, $1); }
| pats_or '|' pat { $$ = ext_node($1, 1, $3); }
;
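`pats_or` is the `|`-separated list of pattern alternatives in a single match arm. A minimal sketch:

```rust
// One arm, several `|`-separated patterns (the `pats_or` production).
fn is_vowel(c: char) -> bool {
    match c {
        'a' | 'e' | 'i' | 'o' | 'u' => true,
        _ => false,
    }
}

fn main() {
    assert!(is_vowel('e'));
    assert!(!is_vowel('x'));
}
```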
binding_mode
: REF { $$ = mk_node("BindByRef", 1, mk_atom("MutImmutable")); }
| REF MUT { $$ = mk_node("BindByRef", 1, mk_atom("MutMutable")); }
| MUT { $$ = mk_node("BindByValue", 1, mk_atom("MutMutable")); }
;

lit_or_path
: path_expr { $$ = mk_node("PatLit", 1, $1); }
| lit { $$ = mk_node("PatLit", 1, $1); }
| '-' lit { $$ = mk_node("PatLit", 1, $2); }
;

pat_field
: ident { $$ = mk_node("PatField", 1, $1); }
| binding_mode ident { $$ = mk_node("PatField", 2, $1, $2); }
| BOX ident { $$ = mk_node("PatField", 2, mk_atom("box"), $2); }
| BOX binding_mode ident { $$ = mk_node("PatField", 3, mk_atom("box"), $2, $3); }
| ident ':' pat { $$ = mk_node("PatField", 2, $1, $3); }
| binding_mode ident ':' pat { $$ = mk_node("PatField", 3, $1, $2, $4); }
| LIT_INTEGER ':' pat { $$ = mk_node("PatField", 2, mk_atom(yytext), $3); }
;

pat_fields
: pat_field { $$ = mk_node("PatFields", 1, $1); }
| pat_fields ',' pat_field { $$ = ext_node($1, 1, $3); }
;

pat_struct
: pat_fields { $$ = mk_node("PatStruct", 2, $1, mk_atom("false")); }
| pat_fields ',' { $$ = mk_node("PatStruct", 2, $1, mk_atom("false")); }
| pat_fields ',' DOTDOT { $$ = mk_node("PatStruct", 2, $1, mk_atom("true")); }
| DOTDOT { $$ = mk_node("PatStruct", 1, mk_atom("true")); }
| %empty { $$ = mk_node("PatStruct", 1, mk_none()); }
;
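In `pat_struct`, the trailing `DOTDOT` (the `"true"` atom in the actions) is the `..` that ignores any remaining fields. A sketch:

```rust
struct Point {
    x: i32,
    y: i32,
    z: i32,
}

// `pat_fields ',' DOTDOT`: bind `x`, ignore `y` and `z`.
fn x_of(p: &Point) -> i32 {
    let Point { x, .. } = p;
    *x
}

fn main() {
    assert_eq!(x_of(&Point { x: 1, y: 2, z: 3 }), 1);
}
```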
pat_tup
: pat_tup_elts { $$ = mk_node("PatTup", 2, $1, mk_none()); }
| pat_tup_elts ',' { $$ = mk_node("PatTup", 2, $1, mk_none()); }
| pat_tup_elts DOTDOT { $$ = mk_node("PatTup", 2, $1, mk_none()); }
| pat_tup_elts ',' DOTDOT { $$ = mk_node("PatTup", 2, $1, mk_none()); }
| pat_tup_elts DOTDOT ',' pat_tup_elts { $$ = mk_node("PatTup", 2, $1, $4); }
| pat_tup_elts DOTDOT ',' pat_tup_elts ',' { $$ = mk_node("PatTup", 2, $1, $4); }
| pat_tup_elts ',' DOTDOT ',' pat_tup_elts { $$ = mk_node("PatTup", 2, $1, $5); }
| pat_tup_elts ',' DOTDOT ',' pat_tup_elts ',' { $$ = mk_node("PatTup", 2, $1, $5); }
| DOTDOT ',' pat_tup_elts { $$ = mk_node("PatTup", 2, mk_none(), $3); }
| DOTDOT ',' pat_tup_elts ',' { $$ = mk_node("PatTup", 2, mk_none(), $3); }
| DOTDOT { $$ = mk_node("PatTup", 2, mk_none(), mk_none()); }
;

pat_tup_elts
: pat { $$ = mk_node("PatTupElts", 1, $1); }
| pat_tup_elts ',' pat { $$ = ext_node($1, 1, $3); }
;

pat_vec
: pat_vec_elts { $$ = mk_node("PatVec", 2, $1, mk_none()); }
| pat_vec_elts ',' { $$ = mk_node("PatVec", 2, $1, mk_none()); }
| pat_vec_elts DOTDOT { $$ = mk_node("PatVec", 2, $1, mk_none()); }
| pat_vec_elts ',' DOTDOT { $$ = mk_node("PatVec", 2, $1, mk_none()); }
| pat_vec_elts DOTDOT ',' pat_vec_elts { $$ = mk_node("PatVec", 2, $1, $4); }
| pat_vec_elts DOTDOT ',' pat_vec_elts ',' { $$ = mk_node("PatVec", 2, $1, $4); }
| pat_vec_elts ',' DOTDOT ',' pat_vec_elts { $$ = mk_node("PatVec", 2, $1, $5); }
| pat_vec_elts ',' DOTDOT ',' pat_vec_elts ',' { $$ = mk_node("PatVec", 2, $1, $5); }
| DOTDOT ',' pat_vec_elts { $$ = mk_node("PatVec", 2, mk_none(), $3); }
| DOTDOT ',' pat_vec_elts ',' { $$ = mk_node("PatVec", 2, mk_none(), $3); }
| DOTDOT { $$ = mk_node("PatVec", 2, mk_none(), mk_none()); }
| %empty { $$ = mk_node("PatVec", 2, mk_none(), mk_none()); }
;

pat_vec_elts
: pat { $$ = mk_node("PatVecElts", 1, $1); }
| pat_vec_elts ',' pat { $$ = ext_node($1, 1, $3); }
;

////////////////////////////////////////////////////////////////////////
// Part 3: Types
////////////////////////////////////////////////////////////////////////

ty
: ty_prim
| ty_closure
| '<' ty_sum maybe_as_trait_ref '>' MOD_SEP ident { $$ = mk_node("TyQualifiedPath", 3, $2, $3, $6); }
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident maybe_as_trait_ref '>' MOD_SEP ident { $$ = mk_node("TyQualifiedPath", 3, mk_node("TyQualifiedPath", 3, $2, $3, $6), $7, $10); }
| '(' ty_sums ')' { $$ = mk_node("TyTup", 1, $2); }
| '(' ty_sums ',' ')' { $$ = mk_node("TyTup", 1, $2); }
| '(' ')' { $$ = mk_atom("TyNil"); }
;

ty_prim
: %prec IDENT path_generic_args_without_colons { $$ = mk_node("TyPath", 2, mk_node("global", 1, mk_atom("false")), $1); }
| %prec IDENT MOD_SEP path_generic_args_without_colons { $$ = mk_node("TyPath", 2, mk_node("global", 1, mk_atom("true")), $2); }
| %prec IDENT SELF MOD_SEP path_generic_args_without_colons { $$ = mk_node("TyPath", 2, mk_node("self", 1, mk_atom("true")), $3); }
| %prec IDENT path_generic_args_without_colons '!' maybe_ident delimited_token_trees { $$ = mk_node("TyMacro", 3, $1, $3, $4); }
| %prec IDENT MOD_SEP path_generic_args_without_colons '!' maybe_ident delimited_token_trees { $$ = mk_node("TyMacro", 3, $2, $4, $5); }
| BOX ty { $$ = mk_node("TyBox", 1, $2); }
| '*' maybe_mut_or_const ty { $$ = mk_node("TyPtr", 2, $2, $3); }
| '&' ty { $$ = mk_node("TyRptr", 2, mk_atom("MutImmutable"), $2); }
| '&' MUT ty { $$ = mk_node("TyRptr", 2, mk_atom("MutMutable"), $3); }
| ANDAND ty { $$ = mk_node("TyRptr", 1, mk_node("TyRptr", 2, mk_atom("MutImmutable"), $2)); }
| ANDAND MUT ty { $$ = mk_node("TyRptr", 1, mk_node("TyRptr", 2, mk_atom("MutMutable"), $3)); }
| '&' lifetime maybe_mut ty { $$ = mk_node("TyRptr", 3, $2, $3, $4); }
| ANDAND lifetime maybe_mut ty { $$ = mk_node("TyRptr", 1, mk_node("TyRptr", 3, $2, $3, $4)); }
| '[' ty ']' { $$ = mk_node("TyVec", 1, $2); }
| '[' ty ',' DOTDOT expr ']' { $$ = mk_node("TyFixedLengthVec", 2, $2, $5); }
| '[' ty ';' expr ']' { $$ = mk_node("TyFixedLengthVec", 2, $2, $4); }
| TYPEOF '(' expr ')' { $$ = mk_node("TyTypeof", 1, $3); }
| UNDERSCORE { $$ = mk_atom("TyInfer"); }
| ty_bare_fn
| for_in_type
;
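Two `ty_prim` alternatives in surface syntax: `'[' ty ']'` is the slice type and `'[' ty ';' expr ']'` the fixed-length array type (the `'[' ty ',' DOTDOT expr ']'` form is the older `[T, ..N]` spelling of the same thing). A sketch:

```rust
// `&[u8]` uses the slice type `[ty]`; `[u8; 4]` is the array type `[ty; expr]`.
fn head(s: &[u8]) -> u8 {
    s[0]
}

fn main() {
    let a: [u8; 4] = [9, 8, 7, 6]; // fixed-length array
    let s: &[u8] = &a;             // borrowed slice of it
    assert_eq!(head(s), 9);
    assert_eq!(s.len(), 4);
}
```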
ty_bare_fn
: FN ty_fn_decl { $$ = $2; }
| UNSAFE FN ty_fn_decl { $$ = $3; }
| EXTERN maybe_abi FN ty_fn_decl { $$ = $4; }
| UNSAFE EXTERN maybe_abi FN ty_fn_decl { $$ = $5; }
;

ty_fn_decl
: generic_params fn_anon_params ret_ty { $$ = mk_node("TyFnDecl", 3, $1, $2, $3); }
;

ty_closure
: UNSAFE '|' anon_params '|' maybe_bounds ret_ty { $$ = mk_node("TyClosure", 3, $3, $5, $6); }
| '|' anon_params '|' maybe_bounds ret_ty { $$ = mk_node("TyClosure", 3, $2, $4, $5); }
| UNSAFE OROR maybe_bounds ret_ty { $$ = mk_node("TyClosure", 2, $3, $4); }
| OROR maybe_bounds ret_ty { $$ = mk_node("TyClosure", 2, $2, $3); }
;

for_in_type
: FOR '<' maybe_lifetimes '>' for_in_type_suffix { $$ = mk_node("ForInType", 2, $3, $5); }
;

for_in_type_suffix
: ty_bare_fn
| trait_ref
| ty_closure
;

maybe_mut
: MUT { $$ = mk_atom("MutMutable"); }
| %prec MUT %empty { $$ = mk_atom("MutImmutable"); }
;

maybe_mut_or_const
: MUT { $$ = mk_atom("MutMutable"); }
| CONST { $$ = mk_atom("MutImmutable"); }
| %empty { $$ = mk_atom("MutImmutable"); }
;

ty_qualified_path_and_generic_values
: ty_qualified_path maybe_bindings
{
  $$ = mk_node("GenericValues", 3, mk_none(), mk_node("TySums", 1, mk_node("TySum", 1, $1)), $2);
}
| ty_qualified_path ',' ty_sums maybe_bindings
{
  $$ = mk_node("GenericValues", 3, mk_none(), mk_node("TySums", 2, $1, $3), $4);
}
;

ty_qualified_path
: ty_sum AS trait_ref '>' MOD_SEP ident { $$ = mk_node("TyQualifiedPath", 3, $1, $3, $6); }
| ty_sum AS trait_ref '>' MOD_SEP ident '+' ty_param_bounds { $$ = mk_node("TyQualifiedPath", 3, $1, $3, $6); }
;

maybe_ty_sums
: ty_sums
| ty_sums ','
| %empty { $$ = mk_none(); }
;

ty_sums
: ty_sum { $$ = mk_node("TySums", 1, $1); }
| ty_sums ',' ty_sum { $$ = ext_node($1, 1, $3); }
;

ty_sum
: ty_sum_elt { $$ = mk_node("TySum", 1, $1); }
| ty_sum '+' ty_sum_elt { $$ = ext_node($1, 1, $3); }
;

ty_sum_elt
: ty
| lifetime
;

ty_prim_sum
: ty_prim_sum_elt { $$ = mk_node("TySum", 1, $1); }
| ty_prim_sum '+' ty_prim_sum_elt { $$ = ext_node($1, 1, $3); }
;

ty_prim_sum_elt
: ty_prim
| lifetime
;

maybe_ty_param_bounds
: ':' ty_param_bounds { $$ = $2; }
| %empty { $$ = mk_none(); }
;

ty_param_bounds
: boundseq
| %empty { $$ = mk_none(); }
;

boundseq
: polybound
| boundseq '+' polybound { $$ = ext_node($1, 1, $3); }
;

polybound
: FOR '<' maybe_lifetimes '>' bound { $$ = mk_node("PolyBound", 2, $3, $5); }
| bound
| '?' FOR '<' maybe_lifetimes '>' bound { $$ = mk_node("PolyBound", 2, $4, $6); }
| '?' bound { $$ = $2; }
;
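The `'?' bound` alternative of `polybound` relaxes a default bound; the common surface form is `?Sized`, which lets a type parameter stand for a dynamically sized type. A sketch:

```rust
// `T: AsRef<[u8]> + ?Sized`: without `?Sized`, `T = str` would be rejected.
fn byte_len<T: AsRef<[u8]> + ?Sized>(t: &T) -> usize {
    t.as_ref().len()
}

fn main() {
    assert_eq!(byte_len("four"), 4);        // T = str (unsized)
    assert_eq!(byte_len(&[1u8, 2][..]), 2); // T = [u8] (unsized)
}
```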
bindings
: binding { $$ = mk_node("Bindings", 1, $1); }
| bindings ',' binding { $$ = ext_node($1, 1, $3); }
;

binding
: ident '=' ty { $$ = mk_node("Binding", 2, $1, $3); }
;

ty_param
: ident maybe_ty_param_bounds maybe_ty_default { $$ = mk_node("TyParam", 3, $1, $2, $3); }
| ident '?' ident maybe_ty_param_bounds maybe_ty_default { $$ = mk_node("TyParam", 4, $1, $3, $4, $5); }
;

maybe_bounds
: %prec SHIFTPLUS
  ':' bounds { $$ = $2; }
| %prec SHIFTPLUS %empty { $$ = mk_none(); }
;

bounds
: bound { $$ = mk_node("bounds", 1, $1); }
| bounds '+' bound { $$ = ext_node($1, 1, $3); }
;

bound
: lifetime
| trait_ref
;

maybe_ltbounds
: %prec SHIFTPLUS
  ':' ltbounds { $$ = $2; }
| %empty { $$ = mk_none(); }
;

ltbounds
: lifetime { $$ = mk_node("ltbounds", 1, $1); }
| ltbounds '+' lifetime { $$ = ext_node($1, 1, $3); }
;

maybe_ty_default
: '=' ty_sum { $$ = mk_node("TyDefault", 1, $2); }
| %empty { $$ = mk_none(); }
;
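`maybe_ty_default` is the `= ty_sum` suffix of a type parameter: a default used when the parameter is omitted at the use site. A sketch:

```rust
// `ty_param` with a default: `T = i32`.
struct Wrapper<T = i32> {
    value: T,
}

// `Wrapper` with no argument means `Wrapper<i32>` via the default.
fn unwrap_default(w: Wrapper) -> i32 {
    w.value
}

fn main() {
    assert_eq!(unwrap_default(Wrapper { value: 7 }), 7);
    let s = Wrapper { value: "hi" }; // default overridden: T = &str
    assert_eq!(s.value.len(), 2);
}
```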
maybe_lifetimes
: lifetimes
| lifetimes ','
| %empty { $$ = mk_none(); }
;

lifetimes
: lifetime_and_bounds { $$ = mk_node("Lifetimes", 1, $1); }
| lifetimes ',' lifetime_and_bounds { $$ = ext_node($1, 1, $3); }
;

lifetime_and_bounds
: LIFETIME maybe_ltbounds { $$ = mk_node("lifetime", 2, mk_atom(yytext), $2); }
| STATIC_LIFETIME { $$ = mk_atom("static_lifetime"); }
;

lifetime
: LIFETIME { $$ = mk_node("lifetime", 1, mk_atom(yytext)); }
| STATIC_LIFETIME { $$ = mk_atom("static_lifetime"); }
;

trait_ref
: %prec IDENT path_generic_args_without_colons
| %prec IDENT MOD_SEP path_generic_args_without_colons { $$ = $2; }
;

////////////////////////////////////////////////////////////////////////
// Part 4: Blocks, statements, and expressions
////////////////////////////////////////////////////////////////////////

inner_attrs_and_block
: '{' maybe_inner_attrs maybe_stmts '}' { $$ = mk_node("ExprBlock", 2, $2, $3); }
;

block
: '{' maybe_stmts '}' { $$ = mk_node("ExprBlock", 1, $2); }
;

maybe_stmts
: stmts
| stmts nonblock_expr { $$ = ext_node($1, 1, $2); }
| nonblock_expr
| %empty { $$ = mk_none(); }
;

// There are two sub-grammars within a "stmts: exprs" derivation
// depending on whether each stmt-expr is a block-expr form; this is to
// handle the "semicolon rule" for stmt sequencing that permits
// writing
//
//     if foo { bar } 10
//
// as a sequence of two stmts (one if-expr stmt, one lit-10-expr
// stmt). Unfortunately by permitting juxtaposition of exprs in
// sequence like that, the non-block expr grammar has to have a
// second limited sub-grammar that excludes the prefix exprs that
// are ambiguous with binops. That is to say:
//
//     {10} - 1
//
// should parse as (progn (progn 10) (- 1)) not (- (progn 10) 1), that
// is to say, two statements rather than one, at least according to
// the mainline rust parser.
//
// So we wind up with a 3-way split in exprs that occur in stmt lists:
// block, nonblock-prefix, and nonblock-nonprefix.
//
// In non-stmts contexts, expr can relax this trichotomy.
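The semicolon rule described in the comment above is still observable in today's rustc; a small sketch (note the type checker additionally requires a block statement's value to be `()`, so the sketch uses an empty block):

```rust
// Parsed as the block statement `{}` followed by the tail expression `-1`,
// not as a subtraction.
fn two_statements() -> i32 {
    {} - 1
}

// Parenthesizing the block makes it a plain operand: 10 - 1.
fn one_expression() -> i32 {
    ({ 10 }) - 1
}

fn main() {
    assert_eq!(two_statements(), -1);
    assert_eq!(one_expression(), 9);
}
```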
stmts
: stmt { $$ = mk_node("stmts", 1, $1); }
| stmts stmt { $$ = ext_node($1, 1, $2); }
;

stmt
: maybe_outer_attrs let { $$ = $2; }
| stmt_item
| PUB stmt_item { $$ = $2; }
| outer_attrs stmt_item { $$ = $2; }
| outer_attrs PUB stmt_item { $$ = $3; }
| full_block_expr
| maybe_outer_attrs block { $$ = $2; }
| nonblock_expr ';'
| outer_attrs nonblock_expr ';' { $$ = $2; }
| ';' { $$ = mk_none(); }
;

maybe_exprs
: exprs
| exprs ','
| %empty { $$ = mk_none(); }
;

maybe_expr
: expr
| %empty { $$ = mk_none(); }
;

exprs
: expr { $$ = mk_node("exprs", 1, $1); }
| exprs ',' expr { $$ = ext_node($1, 1, $3); }
;

path_expr
: path_generic_args_with_colons
| MOD_SEP path_generic_args_with_colons { $$ = $2; }
| SELF MOD_SEP path_generic_args_with_colons { $$ = mk_node("SelfPath", 1, $3); }
;

// A path with a lifetime and type parameters with double colons before
// the type parameters; e.g. `foo::bar::<'a>::Baz::<T>`
//
// These show up in expr context, in order to disambiguate from "less-than"
// expressions.
path_generic_args_with_colons
: ident { $$ = mk_node("components", 1, $1); }
| SUPER { $$ = mk_atom("Super"); }
| path_generic_args_with_colons MOD_SEP ident { $$ = ext_node($1, 1, $3); }
| path_generic_args_with_colons MOD_SEP SUPER { $$ = ext_node($1, 1, mk_atom("Super")); }
| path_generic_args_with_colons MOD_SEP generic_args { $$ = ext_node($1, 1, $3); }
;

// the braces-delimited macro is a block_expr so it doesn't appear here
macro_expr
: path_expr '!' maybe_ident parens_delimited_token_trees { $$ = mk_node("MacroExpr", 3, $1, $3, $4); }
| path_expr '!' maybe_ident brackets_delimited_token_trees { $$ = mk_node("MacroExpr", 3, $1, $3, $4); }
;

nonblock_expr
: lit { $$ = mk_node("ExprLit", 1, $1); }
| %prec IDENT
  path_expr { $$ = mk_node("ExprPath", 1, $1); }
| SELF { $$ = mk_node("ExprPath", 1, mk_node("ident", 1, mk_atom("self"))); }
| macro_expr { $$ = mk_node("ExprMac", 1, $1); }
| path_expr '{' struct_expr_fields '}' { $$ = mk_node("ExprStruct", 2, $1, $3); }
| nonblock_expr '?' { $$ = mk_node("ExprTry", 1, $1); }
| nonblock_expr '.' path_generic_args_with_colons { $$ = mk_node("ExprField", 2, $1, $3); }
| nonblock_expr '.' LIT_INTEGER { $$ = mk_node("ExprTupleIndex", 1, $1); }
| nonblock_expr '[' maybe_expr ']' { $$ = mk_node("ExprIndex", 2, $1, $3); }
| nonblock_expr '(' maybe_exprs ')' { $$ = mk_node("ExprCall", 2, $1, $3); }
| '[' vec_expr ']' { $$ = mk_node("ExprVec", 1, $2); }
| '(' maybe_exprs ')' { $$ = mk_node("ExprParen", 1, $2); }
| CONTINUE { $$ = mk_node("ExprAgain", 0); }
| CONTINUE lifetime { $$ = mk_node("ExprAgain", 1, $2); }
| RETURN { $$ = mk_node("ExprRet", 0); }
| RETURN expr { $$ = mk_node("ExprRet", 1, $2); }
| BREAK { $$ = mk_node("ExprBreak", 0); }
| BREAK lifetime { $$ = mk_node("ExprBreak", 1, $2); }
| YIELD { $$ = mk_node("ExprYield", 0); }
| YIELD expr { $$ = mk_node("ExprYield", 1, $2); }
| nonblock_expr '=' expr { $$ = mk_node("ExprAssign", 2, $1, $3); }
| nonblock_expr SHLEQ expr { $$ = mk_node("ExprAssignShl", 2, $1, $3); }
| nonblock_expr SHREQ expr { $$ = mk_node("ExprAssignShr", 2, $1, $3); }
| nonblock_expr MINUSEQ expr { $$ = mk_node("ExprAssignSub", 2, $1, $3); }
| nonblock_expr ANDEQ expr { $$ = mk_node("ExprAssignBitAnd", 2, $1, $3); }
| nonblock_expr OREQ expr { $$ = mk_node("ExprAssignBitOr", 2, $1, $3); }
| nonblock_expr PLUSEQ expr { $$ = mk_node("ExprAssignAdd", 2, $1, $3); }
| nonblock_expr STAREQ expr { $$ = mk_node("ExprAssignMul", 2, $1, $3); }
| nonblock_expr SLASHEQ expr { $$ = mk_node("ExprAssignDiv", 2, $1, $3); }
| nonblock_expr CARETEQ expr { $$ = mk_node("ExprAssignBitXor", 2, $1, $3); }
| nonblock_expr PERCENTEQ expr { $$ = mk_node("ExprAssignRem", 2, $1, $3); }
| nonblock_expr OROR expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiOr"), $1, $3); }
| nonblock_expr ANDAND expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiAnd"), $1, $3); }
| nonblock_expr EQEQ expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiEq"), $1, $3); }
| nonblock_expr NE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiNe"), $1, $3); }
| nonblock_expr '<' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiLt"), $1, $3); }
| nonblock_expr '>' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiGt"), $1, $3); }
| nonblock_expr LE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiLe"), $1, $3); }
| nonblock_expr GE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiGe"), $1, $3); }
| nonblock_expr '|' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitOr"), $1, $3); }
| nonblock_expr '^' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitXor"), $1, $3); }
| nonblock_expr '&' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitAnd"), $1, $3); }
| nonblock_expr SHL expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiShl"), $1, $3); }
| nonblock_expr SHR expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiShr"), $1, $3); }
| nonblock_expr '+' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiAdd"), $1, $3); }
| nonblock_expr '-' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiSub"), $1, $3); }
| nonblock_expr '*' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiMul"), $1, $3); }
| nonblock_expr '/' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiDiv"), $1, $3); }
| nonblock_expr '%' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiRem"), $1, $3); }
| nonblock_expr DOTDOT { $$ = mk_node("ExprRange", 2, $1, mk_none()); }
| nonblock_expr DOTDOT expr { $$ = mk_node("ExprRange", 2, $1, $3); }
| DOTDOT expr { $$ = mk_node("ExprRange", 2, mk_none(), $2); }
| DOTDOT { $$ = mk_node("ExprRange", 2, mk_none(), mk_none()); }
| nonblock_expr AS ty { $$ = mk_node("ExprCast", 2, $1, $3); }
| nonblock_expr ':' ty { $$ = mk_node("ExprTypeAscr", 2, $1, $3); }
| BOX expr { $$ = mk_node("ExprBox", 1, $2); }
| expr_qualified_path
| nonblock_prefix_expr
;

expr
: lit { $$ = mk_node("ExprLit", 1, $1); }
| %prec IDENT
  path_expr { $$ = mk_node("ExprPath", 1, $1); }
| SELF { $$ = mk_node("ExprPath", 1, mk_node("ident", 1, mk_atom("self"))); }
| macro_expr { $$ = mk_node("ExprMac", 1, $1); }
| path_expr '{' struct_expr_fields '}' { $$ = mk_node("ExprStruct", 2, $1, $3); }
| expr '?' { $$ = mk_node("ExprTry", 1, $1); }
| expr '.' path_generic_args_with_colons { $$ = mk_node("ExprField", 2, $1, $3); }
| expr '.' LIT_INTEGER { $$ = mk_node("ExprTupleIndex", 1, $1); }
| expr '[' maybe_expr ']' { $$ = mk_node("ExprIndex", 2, $1, $3); }
| expr '(' maybe_exprs ')' { $$ = mk_node("ExprCall", 2, $1, $3); }
| '(' maybe_exprs ')' { $$ = mk_node("ExprParen", 1, $2); }
| '[' vec_expr ']' { $$ = mk_node("ExprVec", 1, $2); }
| CONTINUE { $$ = mk_node("ExprAgain", 0); }
| CONTINUE ident { $$ = mk_node("ExprAgain", 1, $2); }
| RETURN { $$ = mk_node("ExprRet", 0); }
| RETURN expr { $$ = mk_node("ExprRet", 1, $2); }
| BREAK { $$ = mk_node("ExprBreak", 0); }
| BREAK ident { $$ = mk_node("ExprBreak", 1, $2); }
| YIELD { $$ = mk_node("ExprYield", 0); }
| YIELD expr { $$ = mk_node("ExprYield", 1, $2); }
| expr '=' expr { $$ = mk_node("ExprAssign", 2, $1, $3); }
| expr SHLEQ expr { $$ = mk_node("ExprAssignShl", 2, $1, $3); }
| expr SHREQ expr { $$ = mk_node("ExprAssignShr", 2, $1, $3); }
| expr MINUSEQ expr { $$ = mk_node("ExprAssignSub", 2, $1, $3); }
| expr ANDEQ expr { $$ = mk_node("ExprAssignBitAnd", 2, $1, $3); }
| expr OREQ expr { $$ = mk_node("ExprAssignBitOr", 2, $1, $3); }
| expr PLUSEQ expr { $$ = mk_node("ExprAssignAdd", 2, $1, $3); }
| expr STAREQ expr { $$ = mk_node("ExprAssignMul", 2, $1, $3); }
| expr SLASHEQ expr { $$ = mk_node("ExprAssignDiv", 2, $1, $3); }
| expr CARETEQ expr { $$ = mk_node("ExprAssignBitXor", 2, $1, $3); }
| expr PERCENTEQ expr { $$ = mk_node("ExprAssignRem", 2, $1, $3); }
| expr OROR expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiOr"), $1, $3); }
| expr ANDAND expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiAnd"), $1, $3); }
| expr EQEQ expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiEq"), $1, $3); }
| expr NE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiNe"), $1, $3); }
| expr '<' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiLt"), $1, $3); }
| expr '>' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiGt"), $1, $3); }
| expr LE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiLe"), $1, $3); }
| expr GE expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiGe"), $1, $3); }
| expr '|' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitOr"), $1, $3); }
| expr '^' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitXor"), $1, $3); }
| expr '&' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitAnd"), $1, $3); }
| expr SHL expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiShl"), $1, $3); }
| expr SHR expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiShr"), $1, $3); }
| expr '+' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiAdd"), $1, $3); }
| expr '-' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiSub"), $1, $3); }
| expr '*' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiMul"), $1, $3); }
| expr '/' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiDiv"), $1, $3); }
|
||||
| expr '%' expr { $$ = mk_node("ExprBinary", 3, mk_atom("BiRem"), $1, $3); }
|
||||
| expr DOTDOT { $$ = mk_node("ExprRange", 2, $1, mk_none()); }
|
||||
| expr DOTDOT expr { $$ = mk_node("ExprRange", 2, $1, $3); }
|
||||
| DOTDOT expr { $$ = mk_node("ExprRange", 2, mk_none(), $2); }
|
||||
| DOTDOT { $$ = mk_node("ExprRange", 2, mk_none(), mk_none()); }
|
||||
| expr AS ty { $$ = mk_node("ExprCast", 2, $1, $3); }
|
||||
| expr ':' ty { $$ = mk_node("ExprTypeAscr", 2, $1, $3); }
|
||||
| BOX expr { $$ = mk_node("ExprBox", 1, $2); }
|
||||
| expr_qualified_path
|
||||
| block_expr
|
||||
| block
|
||||
| nonblock_prefix_expr
|
||||
;
|
||||
|
||||
expr_nostruct
|
||||
: lit { $$ = mk_node("ExprLit", 1, $1); }
|
||||
| %prec IDENT
|
||||
path_expr { $$ = mk_node("ExprPath", 1, $1); }
|
||||
| SELF { $$ = mk_node("ExprPath", 1, mk_node("ident", 1, mk_atom("self"))); }
|
||||
| macro_expr { $$ = mk_node("ExprMac", 1, $1); }
|
||||
| expr_nostruct '?' { $$ = mk_node("ExprTry", 1, $1); }
|
||||
| expr_nostruct '.' path_generic_args_with_colons { $$ = mk_node("ExprField", 2, $1, $3); }
|
||||
| expr_nostruct '.' LIT_INTEGER { $$ = mk_node("ExprTupleIndex", 1, $1); }
|
||||
| expr_nostruct '[' maybe_expr ']' { $$ = mk_node("ExprIndex", 2, $1, $3); }
|
||||
| expr_nostruct '(' maybe_exprs ')' { $$ = mk_node("ExprCall", 2, $1, $3); }
|
||||
| '[' vec_expr ']' { $$ = mk_node("ExprVec", 1, $2); }
|
||||
| '(' maybe_exprs ')' { $$ = mk_node("ExprParen", 1, $2); }
|
||||
| CONTINUE { $$ = mk_node("ExprAgain", 0); }
|
||||
| CONTINUE ident { $$ = mk_node("ExprAgain", 1, $2); }
|
||||
| RETURN { $$ = mk_node("ExprRet", 0); }
|
||||
| RETURN expr { $$ = mk_node("ExprRet", 1, $2); }
|
||||
| BREAK { $$ = mk_node("ExprBreak", 0); }
|
||||
| BREAK ident { $$ = mk_node("ExprBreak", 1, $2); }
|
||||
| YIELD { $$ = mk_node("ExprYield", 0); }
|
||||
| YIELD expr { $$ = mk_node("ExprYield", 1, $2); }
|
||||
| expr_nostruct '=' expr_nostruct { $$ = mk_node("ExprAssign", 2, $1, $3); }
|
||||
| expr_nostruct SHLEQ expr_nostruct { $$ = mk_node("ExprAssignShl", 2, $1, $3); }
|
||||
| expr_nostruct SHREQ expr_nostruct { $$ = mk_node("ExprAssignShr", 2, $1, $3); }
|
||||
| expr_nostruct MINUSEQ expr_nostruct { $$ = mk_node("ExprAssignSub", 2, $1, $3); }
|
||||
| expr_nostruct ANDEQ expr_nostruct { $$ = mk_node("ExprAssignBitAnd", 2, $1, $3); }
|
||||
| expr_nostruct OREQ expr_nostruct { $$ = mk_node("ExprAssignBitOr", 2, $1, $3); }
|
||||
| expr_nostruct PLUSEQ expr_nostruct { $$ = mk_node("ExprAssignAdd", 2, $1, $3); }
|
||||
| expr_nostruct STAREQ expr_nostruct { $$ = mk_node("ExprAssignMul", 2, $1, $3); }
|
||||
| expr_nostruct SLASHEQ expr_nostruct { $$ = mk_node("ExprAssignDiv", 2, $1, $3); }
|
||||
| expr_nostruct CARETEQ expr_nostruct { $$ = mk_node("ExprAssignBitXor", 2, $1, $3); }
|
||||
| expr_nostruct PERCENTEQ expr_nostruct { $$ = mk_node("ExprAssignRem", 2, $1, $3); }
|
||||
| expr_nostruct OROR expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiOr"), $1, $3); }
|
||||
| expr_nostruct ANDAND expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiAnd"), $1, $3); }
|
||||
| expr_nostruct EQEQ expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiEq"), $1, $3); }
|
||||
| expr_nostruct NE expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiNe"), $1, $3); }
|
||||
| expr_nostruct '<' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiLt"), $1, $3); }
|
||||
| expr_nostruct '>' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiGt"), $1, $3); }
|
||||
| expr_nostruct LE expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiLe"), $1, $3); }
|
||||
| expr_nostruct GE expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiGe"), $1, $3); }
|
||||
| expr_nostruct '|' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitOr"), $1, $3); }
|
||||
| expr_nostruct '^' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitXor"), $1, $3); }
|
||||
| expr_nostruct '&' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiBitAnd"), $1, $3); }
|
||||
| expr_nostruct SHL expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiShl"), $1, $3); }
|
||||
| expr_nostruct SHR expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiShr"), $1, $3); }
|
||||
| expr_nostruct '+' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiAdd"), $1, $3); }
|
||||
| expr_nostruct '-' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiSub"), $1, $3); }
|
||||
| expr_nostruct '*' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiMul"), $1, $3); }
|
||||
| expr_nostruct '/' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiDiv"), $1, $3); }
|
||||
| expr_nostruct '%' expr_nostruct { $$ = mk_node("ExprBinary", 3, mk_atom("BiRem"), $1, $3); }
|
||||
| expr_nostruct DOTDOT %prec RANGE { $$ = mk_node("ExprRange", 2, $1, mk_none()); }
|
||||
| expr_nostruct DOTDOT expr_nostruct { $$ = mk_node("ExprRange", 2, $1, $3); }
|
||||
| DOTDOT expr_nostruct { $$ = mk_node("ExprRange", 2, mk_none(), $2); }
|
||||
| DOTDOT { $$ = mk_node("ExprRange", 2, mk_none(), mk_none()); }
|
||||
| expr_nostruct AS ty { $$ = mk_node("ExprCast", 2, $1, $3); }
|
||||
| expr_nostruct ':' ty { $$ = mk_node("ExprTypeAscr", 2, $1, $3); }
|
||||
| BOX expr { $$ = mk_node("ExprBox", 1, $2); }
|
||||
| expr_qualified_path
|
||||
| block_expr
|
||||
| block
|
||||
| nonblock_prefix_expr_nostruct
|
||||
;
|
||||
|
||||
nonblock_prefix_expr_nostruct
|
||||
: '-' expr_nostruct { $$ = mk_node("ExprUnary", 2, mk_atom("UnNeg"), $2); }
|
||||
| '!' expr_nostruct { $$ = mk_node("ExprUnary", 2, mk_atom("UnNot"), $2); }
|
||||
| '*' expr_nostruct { $$ = mk_node("ExprUnary", 2, mk_atom("UnDeref"), $2); }
|
||||
| '&' maybe_mut expr_nostruct { $$ = mk_node("ExprAddrOf", 2, $2, $3); }
|
||||
| ANDAND maybe_mut expr_nostruct { $$ = mk_node("ExprAddrOf", 1, mk_node("ExprAddrOf", 2, $2, $3)); }
|
||||
| lambda_expr_nostruct
|
||||
| MOVE lambda_expr_nostruct { $$ = $2; }
|
||||
;
|
||||
|
||||
nonblock_prefix_expr
|
||||
: '-' expr { $$ = mk_node("ExprUnary", 2, mk_atom("UnNeg"), $2); }
|
||||
| '!' expr { $$ = mk_node("ExprUnary", 2, mk_atom("UnNot"), $2); }
|
||||
| '*' expr { $$ = mk_node("ExprUnary", 2, mk_atom("UnDeref"), $2); }
|
||||
| '&' maybe_mut expr { $$ = mk_node("ExprAddrOf", 2, $2, $3); }
|
||||
| ANDAND maybe_mut expr { $$ = mk_node("ExprAddrOf", 1, mk_node("ExprAddrOf", 2, $2, $3)); }
|
||||
| lambda_expr
|
||||
| MOVE lambda_expr { $$ = $2; }
|
||||
;
|
||||
|
||||
expr_qualified_path
|
||||
: '<' ty_sum maybe_as_trait_ref '>' MOD_SEP ident maybe_qpath_params
|
||||
{
|
||||
$$ = mk_node("ExprQualifiedPath", 4, $2, $3, $6, $7);
|
||||
}
|
||||
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident maybe_as_trait_ref '>' MOD_SEP ident
|
||||
{
|
||||
$$ = mk_node("ExprQualifiedPath", 3, mk_node("ExprQualifiedPath", 3, $2, $3, $6), $7, $10);
|
||||
}
|
||||
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident generic_args maybe_as_trait_ref '>' MOD_SEP ident
|
||||
{
|
||||
$$ = mk_node("ExprQualifiedPath", 3, mk_node("ExprQualifiedPath", 4, $2, $3, $6, $7), $8, $11);
|
||||
}
|
||||
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident maybe_as_trait_ref '>' MOD_SEP ident generic_args
|
||||
{
|
||||
$$ = mk_node("ExprQualifiedPath", 4, mk_node("ExprQualifiedPath", 3, $2, $3, $6), $7, $10, $11);
|
||||
}
|
||||
| SHL ty_sum maybe_as_trait_ref '>' MOD_SEP ident generic_args maybe_as_trait_ref '>' MOD_SEP ident generic_args
|
||||
{
|
||||
$$ = mk_node("ExprQualifiedPath", 4, mk_node("ExprQualifiedPath", 4, $2, $3, $6, $7), $8, $11, $12);
|
||||
}
|
||||
|
||||
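As an aside (this example is not part of the removed grammar file), the `expr_qualified_path` rule above corresponds to qualified path expressions, which are still valid in present-day Rust:

```rust
// A qualified path expression: `<Type as Trait>::item`, the form
// matched by the `expr_qualified_path` rule.
fn main() {
    let v = <u8 as Default>::default();
    assert_eq!(v, 0);
}
```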
maybe_qpath_params
: MOD_SEP generic_args { $$ = $2; }
| %empty { $$ = mk_none(); }
;

maybe_as_trait_ref
: AS trait_ref { $$ = $2; }
| %empty { $$ = mk_none(); }
;

lambda_expr
: %prec LAMBDA
  OROR ret_ty expr { $$ = mk_node("ExprFnBlock", 3, mk_none(), $2, $3); }
| %prec LAMBDA
  '|' '|' ret_ty expr { $$ = mk_node("ExprFnBlock", 3, mk_none(), $3, $4); }
| %prec LAMBDA
  '|' inferrable_params '|' ret_ty expr { $$ = mk_node("ExprFnBlock", 3, $2, $4, $5); }
| %prec LAMBDA
  '|' inferrable_params OROR lambda_expr_no_first_bar { $$ = mk_node("ExprFnBlock", 3, $2, mk_none(), $4); }
;

lambda_expr_no_first_bar
: %prec LAMBDA
  '|' ret_ty expr { $$ = mk_node("ExprFnBlock", 3, mk_none(), $2, $3); }
| %prec LAMBDA
  inferrable_params '|' ret_ty expr { $$ = mk_node("ExprFnBlock", 3, $1, $3, $4); }
| %prec LAMBDA
  inferrable_params OROR lambda_expr_no_first_bar { $$ = mk_node("ExprFnBlock", 3, $1, mk_none(), $3); }
;

lambda_expr_nostruct
: %prec LAMBDA
  OROR expr_nostruct { $$ = mk_node("ExprFnBlock", 2, mk_none(), $2); }
| %prec LAMBDA
  '|' '|' ret_ty expr_nostruct { $$ = mk_node("ExprFnBlock", 3, mk_none(), $3, $4); }
| %prec LAMBDA
  '|' inferrable_params '|' expr_nostruct { $$ = mk_node("ExprFnBlock", 2, $2, $4); }
| %prec LAMBDA
  '|' inferrable_params OROR lambda_expr_nostruct_no_first_bar { $$ = mk_node("ExprFnBlock", 3, $2, mk_none(), $4); }
;

lambda_expr_nostruct_no_first_bar
: %prec LAMBDA
  '|' ret_ty expr_nostruct { $$ = mk_node("ExprFnBlock", 3, mk_none(), $2, $3); }
| %prec LAMBDA
  inferrable_params '|' ret_ty expr_nostruct { $$ = mk_node("ExprFnBlock", 3, $1, $3, $4); }
| %prec LAMBDA
  inferrable_params OROR lambda_expr_nostruct_no_first_bar { $$ = mk_node("ExprFnBlock", 3, $1, mk_none(), $3); }
;
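For orientation (not part of the removed file), the `lambda_expr` rules above cover the closure forms that current Rust still accepts:

```rust
// Closure forms matched by the `lambda_expr` rules:
fn main() {
    let f = || 1;                              // OROR form: `||` lexed as one token
    let g = |x: i32| x + 1;                    // '|' inferrable_params '|' form
    let h = move |x: i32| -> i32 { x * 2 };    // MOVE lambda_expr with a ret_ty
    assert_eq!(f() + g(1) + h(2), 7);
}
```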

vec_expr
: maybe_exprs
| exprs ';' expr { $$ = mk_node("VecRepeat", 2, $1, $3); }
;

struct_expr_fields
: field_inits
| field_inits ','
| maybe_field_inits default_field_init { $$ = ext_node($1, 1, $2); }
| %empty { $$ = mk_none(); }
;

maybe_field_inits
: field_inits
| field_inits ','
| %empty { $$ = mk_none(); }
;

field_inits
: field_init { $$ = mk_node("FieldInits", 1, $1); }
| field_inits ',' field_init { $$ = ext_node($1, 1, $3); }
;

field_init
: ident { $$ = mk_node("FieldInit", 1, $1); }
| ident ':' expr { $$ = mk_node("FieldInit", 2, $1, $3); }
| LIT_INTEGER ':' expr { $$ = mk_node("FieldInit", 2, mk_atom(yytext), $3); }
;

default_field_init
: DOTDOT expr { $$ = mk_node("DefaultFieldInit", 1, $2); }
;

block_expr
: expr_match
| expr_if
| expr_if_let
| expr_while
| expr_while_let
| expr_loop
| expr_for
| UNSAFE block { $$ = mk_node("UnsafeBlock", 1, $2); }
| path_expr '!' maybe_ident braces_delimited_token_trees { $$ = mk_node("Macro", 3, $1, $3, $4); }
;

full_block_expr
: block_expr
| block_expr_dot
;

block_expr_dot
: block_expr '.' path_generic_args_with_colons %prec IDENT { $$ = mk_node("ExprField", 2, $1, $3); }
| block_expr_dot '.' path_generic_args_with_colons %prec IDENT { $$ = mk_node("ExprField", 2, $1, $3); }
| block_expr '.' path_generic_args_with_colons '[' maybe_expr ']' { $$ = mk_node("ExprIndex", 3, $1, $3, $5); }
| block_expr_dot '.' path_generic_args_with_colons '[' maybe_expr ']' { $$ = mk_node("ExprIndex", 3, $1, $3, $5); }
| block_expr '.' path_generic_args_with_colons '(' maybe_exprs ')' { $$ = mk_node("ExprCall", 3, $1, $3, $5); }
| block_expr_dot '.' path_generic_args_with_colons '(' maybe_exprs ')' { $$ = mk_node("ExprCall", 3, $1, $3, $5); }
| block_expr '.' LIT_INTEGER { $$ = mk_node("ExprTupleIndex", 1, $1); }
| block_expr_dot '.' LIT_INTEGER { $$ = mk_node("ExprTupleIndex", 1, $1); }
;

expr_match
: MATCH expr_nostruct '{' '}' { $$ = mk_node("ExprMatch", 1, $2); }
| MATCH expr_nostruct '{' match_clauses '}' { $$ = mk_node("ExprMatch", 2, $2, $4); }
| MATCH expr_nostruct '{' match_clauses nonblock_match_clause '}' { $$ = mk_node("ExprMatch", 2, $2, ext_node($4, 1, $5)); }
| MATCH expr_nostruct '{' nonblock_match_clause '}' { $$ = mk_node("ExprMatch", 2, $2, mk_node("Arms", 1, $4)); }
;

match_clauses
: match_clause { $$ = mk_node("Arms", 1, $1); }
| match_clauses match_clause { $$ = ext_node($1, 1, $2); }
;

match_clause
: nonblock_match_clause ','
| block_match_clause
| block_match_clause ','
;
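As a side note (not from the removed file), the `match_clause` rules above encode the trailing-comma convention Rust still uses: a nonblock arm must be followed by a comma, while a block arm may omit it. A small sketch:

```rust
// Trailing commas in match arms, per the `match_clause` rules:
// nonblock arms require `,`; block arms make it optional.
fn main() {
    let n = 2;
    let s = match n {
        0 => "zero",      // nonblock arm: comma required (unless last)
        1 => { "one" }    // block arm: no comma needed
        _ => "many",
    };
    assert_eq!(s, "many");
}
```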

nonblock_match_clause
: maybe_outer_attrs pats_or maybe_guard FAT_ARROW nonblock_expr { $$ = mk_node("ArmNonblock", 4, $1, $2, $3, $5); }
| maybe_outer_attrs pats_or maybe_guard FAT_ARROW block_expr_dot { $$ = mk_node("ArmNonblock", 4, $1, $2, $3, $5); }
;

block_match_clause
: maybe_outer_attrs pats_or maybe_guard FAT_ARROW block { $$ = mk_node("ArmBlock", 4, $1, $2, $3, $5); }
| maybe_outer_attrs pats_or maybe_guard FAT_ARROW block_expr { $$ = mk_node("ArmBlock", 4, $1, $2, $3, $5); }
;

maybe_guard
: IF expr_nostruct { $$ = $2; }
| %empty { $$ = mk_none(); }
;

expr_if
: IF expr_nostruct block { $$ = mk_node("ExprIf", 2, $2, $3); }
| IF expr_nostruct block ELSE block_or_if { $$ = mk_node("ExprIf", 3, $2, $3, $5); }
;

expr_if_let
: IF LET pat '=' expr_nostruct block { $$ = mk_node("ExprIfLet", 3, $3, $5, $6); }
| IF LET pat '=' expr_nostruct block ELSE block_or_if { $$ = mk_node("ExprIfLet", 4, $3, $5, $6, $8); }
;

block_or_if
: block
| expr_if
| expr_if_let
;

expr_while
: maybe_label WHILE expr_nostruct block { $$ = mk_node("ExprWhile", 3, $1, $3, $4); }
;

expr_while_let
: maybe_label WHILE LET pat '=' expr_nostruct block { $$ = mk_node("ExprWhileLet", 4, $1, $4, $6, $7); }
;

expr_loop
: maybe_label LOOP block { $$ = mk_node("ExprLoop", 2, $1, $3); }
;

expr_for
: maybe_label FOR pat IN expr_nostruct block { $$ = mk_node("ExprForLoop", 4, $1, $3, $5, $6); }
;

maybe_label
: lifetime ':'
| %empty { $$ = mk_none(); }
;

let
: LET pat maybe_ty_ascription maybe_init_expr ';' { $$ = mk_node("DeclLocal", 3, $2, $3, $4); }
;

////////////////////////////////////////////////////////////////////////
// Part 5: Macros and misc. rules
////////////////////////////////////////////////////////////////////////

lit
: LIT_BYTE { $$ = mk_node("LitByte", 1, mk_atom(yytext)); }
| LIT_CHAR { $$ = mk_node("LitChar", 1, mk_atom(yytext)); }
| LIT_INTEGER { $$ = mk_node("LitInteger", 1, mk_atom(yytext)); }
| LIT_FLOAT { $$ = mk_node("LitFloat", 1, mk_atom(yytext)); }
| TRUE { $$ = mk_node("LitBool", 1, mk_atom(yytext)); }
| FALSE { $$ = mk_node("LitBool", 1, mk_atom(yytext)); }
| str
;

str
: LIT_STR { $$ = mk_node("LitStr", 1, mk_atom(yytext), mk_atom("CookedStr")); }
| LIT_STR_RAW { $$ = mk_node("LitStr", 1, mk_atom(yytext), mk_atom("RawStr")); }
| LIT_BYTE_STR { $$ = mk_node("LitByteStr", 1, mk_atom(yytext), mk_atom("ByteStr")); }
| LIT_BYTE_STR_RAW { $$ = mk_node("LitByteStr", 1, mk_atom(yytext), mk_atom("RawByteStr")); }
;

maybe_ident
: %empty { $$ = mk_none(); }
| ident
;

ident
: IDENT { $$ = mk_node("ident", 1, mk_atom(yytext)); }
// Weak keywords that can be used as identifiers
| CATCH { $$ = mk_node("ident", 1, mk_atom(yytext)); }
| DEFAULT { $$ = mk_node("ident", 1, mk_atom(yytext)); }
| UNION { $$ = mk_node("ident", 1, mk_atom(yytext)); }
;

unpaired_token
: SHL { $$ = mk_atom(yytext); }
| SHR { $$ = mk_atom(yytext); }
| LE { $$ = mk_atom(yytext); }
| EQEQ { $$ = mk_atom(yytext); }
| NE { $$ = mk_atom(yytext); }
| GE { $$ = mk_atom(yytext); }
| ANDAND { $$ = mk_atom(yytext); }
| OROR { $$ = mk_atom(yytext); }
| LARROW { $$ = mk_atom(yytext); }
| SHLEQ { $$ = mk_atom(yytext); }
| SHREQ { $$ = mk_atom(yytext); }
| MINUSEQ { $$ = mk_atom(yytext); }
| ANDEQ { $$ = mk_atom(yytext); }
| OREQ { $$ = mk_atom(yytext); }
| PLUSEQ { $$ = mk_atom(yytext); }
| STAREQ { $$ = mk_atom(yytext); }
| SLASHEQ { $$ = mk_atom(yytext); }
| CARETEQ { $$ = mk_atom(yytext); }
| PERCENTEQ { $$ = mk_atom(yytext); }
| DOTDOT { $$ = mk_atom(yytext); }
| DOTDOTDOT { $$ = mk_atom(yytext); }
| MOD_SEP { $$ = mk_atom(yytext); }
| RARROW { $$ = mk_atom(yytext); }
| FAT_ARROW { $$ = mk_atom(yytext); }
| LIT_BYTE { $$ = mk_atom(yytext); }
| LIT_CHAR { $$ = mk_atom(yytext); }
| LIT_INTEGER { $$ = mk_atom(yytext); }
| LIT_FLOAT { $$ = mk_atom(yytext); }
| LIT_STR { $$ = mk_atom(yytext); }
| LIT_STR_RAW { $$ = mk_atom(yytext); }
| LIT_BYTE_STR { $$ = mk_atom(yytext); }
| LIT_BYTE_STR_RAW { $$ = mk_atom(yytext); }
| IDENT { $$ = mk_atom(yytext); }
| UNDERSCORE { $$ = mk_atom(yytext); }
| LIFETIME { $$ = mk_atom(yytext); }
| SELF { $$ = mk_atom(yytext); }
| STATIC { $$ = mk_atom(yytext); }
| ABSTRACT { $$ = mk_atom(yytext); }
| ALIGNOF { $$ = mk_atom(yytext); }
| AS { $$ = mk_atom(yytext); }
| BECOME { $$ = mk_atom(yytext); }
| BREAK { $$ = mk_atom(yytext); }
| CATCH { $$ = mk_atom(yytext); }
| CRATE { $$ = mk_atom(yytext); }
| DEFAULT { $$ = mk_atom(yytext); }
| DO { $$ = mk_atom(yytext); }
| ELSE { $$ = mk_atom(yytext); }
| ENUM { $$ = mk_atom(yytext); }
| EXTERN { $$ = mk_atom(yytext); }
| FALSE { $$ = mk_atom(yytext); }
| FINAL { $$ = mk_atom(yytext); }
| FN { $$ = mk_atom(yytext); }
| FOR { $$ = mk_atom(yytext); }
| IF { $$ = mk_atom(yytext); }
| IMPL { $$ = mk_atom(yytext); }
| IN { $$ = mk_atom(yytext); }
| LET { $$ = mk_atom(yytext); }
| LOOP { $$ = mk_atom(yytext); }
| MACRO { $$ = mk_atom(yytext); }
| MATCH { $$ = mk_atom(yytext); }
| MOD { $$ = mk_atom(yytext); }
| MOVE { $$ = mk_atom(yytext); }
| MUT { $$ = mk_atom(yytext); }
| OFFSETOF { $$ = mk_atom(yytext); }
| OVERRIDE { $$ = mk_atom(yytext); }
| PRIV { $$ = mk_atom(yytext); }
| PUB { $$ = mk_atom(yytext); }
| PURE { $$ = mk_atom(yytext); }
| REF { $$ = mk_atom(yytext); }
| RETURN { $$ = mk_atom(yytext); }
| STRUCT { $$ = mk_atom(yytext); }
| SIZEOF { $$ = mk_atom(yytext); }
| SUPER { $$ = mk_atom(yytext); }
| TRUE { $$ = mk_atom(yytext); }
| TRAIT { $$ = mk_atom(yytext); }
| TYPE { $$ = mk_atom(yytext); }
| UNION { $$ = mk_atom(yytext); }
| UNSAFE { $$ = mk_atom(yytext); }
| UNSIZED { $$ = mk_atom(yytext); }
| USE { $$ = mk_atom(yytext); }
| VIRTUAL { $$ = mk_atom(yytext); }
| WHILE { $$ = mk_atom(yytext); }
| YIELD { $$ = mk_atom(yytext); }
| CONTINUE { $$ = mk_atom(yytext); }
| PROC { $$ = mk_atom(yytext); }
| BOX { $$ = mk_atom(yytext); }
| CONST { $$ = mk_atom(yytext); }
| WHERE { $$ = mk_atom(yytext); }
| TYPEOF { $$ = mk_atom(yytext); }
| INNER_DOC_COMMENT { $$ = mk_atom(yytext); }
| OUTER_DOC_COMMENT { $$ = mk_atom(yytext); }
| SHEBANG { $$ = mk_atom(yytext); }
| STATIC_LIFETIME { $$ = mk_atom(yytext); }
| ';' { $$ = mk_atom(yytext); }
| ',' { $$ = mk_atom(yytext); }
| '.' { $$ = mk_atom(yytext); }
| '@' { $$ = mk_atom(yytext); }
| '#' { $$ = mk_atom(yytext); }
| '~' { $$ = mk_atom(yytext); }
| ':' { $$ = mk_atom(yytext); }
| '$' { $$ = mk_atom(yytext); }
| '=' { $$ = mk_atom(yytext); }
| '?' { $$ = mk_atom(yytext); }
| '!' { $$ = mk_atom(yytext); }
| '<' { $$ = mk_atom(yytext); }
| '>' { $$ = mk_atom(yytext); }
| '-' { $$ = mk_atom(yytext); }
| '&' { $$ = mk_atom(yytext); }
| '|' { $$ = mk_atom(yytext); }
| '+' { $$ = mk_atom(yytext); }
| '*' { $$ = mk_atom(yytext); }
| '/' { $$ = mk_atom(yytext); }
| '^' { $$ = mk_atom(yytext); }
| '%' { $$ = mk_atom(yytext); }
;

token_trees
: %empty { $$ = mk_node("TokenTrees", 0); }
| token_trees token_tree { $$ = ext_node($1, 1, $2); }
;

token_tree
: delimited_token_trees
| unpaired_token { $$ = mk_node("TTTok", 1, $1); }
;

delimited_token_trees
: parens_delimited_token_trees
| braces_delimited_token_trees
| brackets_delimited_token_trees
;

parens_delimited_token_trees
: '(' token_trees ')'
{
  $$ = mk_node("TTDelim", 3,
               mk_node("TTTok", 1, mk_atom("(")),
               $2,
               mk_node("TTTok", 1, mk_atom(")")));
}
;

braces_delimited_token_trees
: '{' token_trees '}'
{
  $$ = mk_node("TTDelim", 3,
               mk_node("TTTok", 1, mk_atom("{")),
               $2,
               mk_node("TTTok", 1, mk_atom("}")));
}
;

brackets_delimited_token_trees
: '[' token_trees ']'
{
  $$ = mk_node("TTDelim", 3,
               mk_node("TTTok", 1, mk_atom("[")),
               $2,
               mk_node("TTTok", 1, mk_atom("]")));
}
;
@@ -1,64 +0,0 @@
Rust's lexical grammar is not context-free. Raw string literals are the source
of the problem. Informally, a raw string literal is an `r`, followed by `N`
hashes (where `N` can be zero), a quote, any characters, then a quote followed
by `N` hashes. Critically, once inside the first pair of quotes, another quote
cannot be followed by `N` consecutive hashes. For example, `r###""###"###` is
invalid.
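To make the context-sensitivity concrete, here is a minimal sketch (not rustc's actual lexer) of lexing a raw string literal; the lexer must carry the hash count `n` as state while scanning the body:

```rust
// Returns the body if `s` is exactly one raw string literal, else None.
// The hash count `n` is the "context" a context-free grammar cannot track.
fn lex_raw_string(s: &str) -> Option<&str> {
    let bytes = s.as_bytes();
    if bytes.first() != Some(&b'r') {
        return None;
    }
    let n = bytes[1..].iter().take_while(|&&b| b == b'#').count();
    if bytes.get(1 + n) != Some(&b'"') {
        return None;
    }
    let body_start = 1 + n + 1;
    // The literal ends at the FIRST `"` followed by `n` hashes.
    for i in body_start..bytes.len() {
        if bytes[i] == b'"'
            && bytes[i + 1..].len() >= n
            && bytes[i + 1..i + 1 + n].iter().all(|&b| b == b'#')
        {
            // For `s` to be a single literal, nothing may follow the closer.
            return if i + 1 + n == bytes.len() {
                Some(&s[body_start..i])
            } else {
                None
            };
        }
    }
    None
}

fn main() {
    assert_eq!(lex_raw_string("r\"hello\""), Some("hello"));
    assert_eq!(lex_raw_string("r##\"\"#\"##"), Some("\"#"));
    // Invalid: inside the body, `"` followed by 3 hashes closes the literal
    // early, leaving trailing input.
    assert_eq!(lex_raw_string("r###\"\"###\"###"), None);
}
```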

The following grammar describes this as well as possible:

    R -> 'r' S
    S -> '"' B '"'
    S -> '#' S '#'
    B -> . B
    B -> ε

Where `.` represents any character, and `ε` the empty string. Consider the
string `r#""#"#`. This string is not a valid raw string literal, but can be
accepted as one by the above grammar, using the derivation:

    R : #""#"#
    S : ""#"
    S : "#
    B : #
    B : ε

(Where `T : U` means the rule `T` is applied, and `U` is the remainder of the
string.) The difficulty arises because the language is fundamentally
context-sensitive: the context needed is the number of hashes.

To prove that Rust's string literals are not context-free, we will use
the fact that context-free languages are closed under intersection with
regular languages, and the
[pumping lemma for context-free languages](https://en.wikipedia.org/wiki/Pumping_lemma_for_context-free_languages).

Consider the regular language `R = r#+""#*"#+`. If Rust's raw string literals are
context-free, then their intersection with `R`, `R'`, should also be context-free.
Therefore, to prove that raw string literals are not context-free,
it is sufficient to prove that `R'` is not context-free.

The language `R'` is `{r#^n""#^m"#^n | m < n}`.

Assume `R'` *is* context-free. Then `R'` has some pumping length `p > 0` for which
the pumping lemma applies. Consider the following string `s` in `R'`:

`r#^p""#^{p-1}"#^p`

e.g. for `p = 2`: `s = r##""#"##`

Then `s = uvwxy` for some choice of `uvwxy` such that `vx` is non-empty,
`|vwx| < p+1`, and `uv^iwx^iy` is in `R'` for all `i >= 0`.

Neither `v` nor `x` can contain a `"` or `r`, as the number of these characters
in any string in `R'` is fixed. So `v` and `x` contain only hashes.
Consequently, of the three sequences of hashes, `v` and `x` combined
can only pump two of them.
If we ever choose the central sequence of hashes, then one of the outer sequences
will not grow when we pump, leading to an imbalance between the outer sequences.
Therefore, we must pump both outer sequences of hashes. However,
there are `p+2` characters between these two sequences of hashes, and `|vwx|` must
be less than `p+1`. Therefore we have a contradiction, and `R'` must not be
context-free.

Since `R'` is not context-free, it follows that Rust's raw string literals
are not context-free.
@@ -1,66 +0,0 @@
#!/usr/bin/env python

# ignore-tidy-linelength

import sys
import os
import subprocess
import argparse

# usage: testparser.py [-h] [-p PARSER [PARSER ...]] -s SOURCE_DIR

# Parsers should read from stdin and return exit status 0 for a
# successful parse, and nonzero for an unsuccessful parse

parser = argparse.ArgumentParser()
parser.add_argument('-p', '--parser', nargs='+')
parser.add_argument('-s', '--source-dir', nargs=1, required=True)
args = parser.parse_args(sys.argv[1:])

total = 0
ok = {}
bad = {}
for parser in args.parser:
    ok[parser] = 0
    bad[parser] = []
devnull = open(os.devnull, 'w')
print("\n")

for base, dirs, files in os.walk(args.source_dir[0]):
    for f in filter(lambda p: p.endswith('.rs'), files):
        p = os.path.join(base, f)
        parse_fail = 'parse-fail' in p
        if sys.version_info.major == 3:
            lines = open(p, encoding='utf-8').readlines()
        else:
            lines = open(p).readlines()
        if any('ignore-test' in line or 'ignore-lexer-test' in line for line in lines):
            continue
        total += 1
        for parser in args.parser:
            if subprocess.call(parser, stdin=open(p), stderr=subprocess.STDOUT, stdout=devnull) == 0:
                if parse_fail:
                    bad[parser].append(p)
                else:
                    ok[parser] += 1
            else:
                if parse_fail:
                    ok[parser] += 1
                else:
                    bad[parser].append(p)
        parser_stats = ', '.join(['{}: {}'.format(parser, ok[parser]) for parser in args.parser])
        sys.stdout.write("\033[K\r total: {}, {}, scanned {}"
                         .format(total, parser_stats, os.path.relpath(p)))

devnull.close()

print("\n")

for parser in args.parser:
    filename = os.path.basename(parser) + '.bad'
    print("writing {} files that did not yield the correct result with {} to {}".format(len(bad[parser]), parser, filename))
    with open(filename, "w") as f:
        for p in bad[parser]:
            f.write(p)
            f.write("\n")
@@ -1,99 +0,0 @@
enum Token {
    SHL = 257, // Parser generators reserve 0-256 for char literals
    SHR,
    LE,
    EQEQ,
    NE,
    GE,
    ANDAND,
    OROR,
    SHLEQ,
    SHREQ,
    MINUSEQ,
    ANDEQ,
    OREQ,
    PLUSEQ,
    STAREQ,
    SLASHEQ,
    CARETEQ,
    PERCENTEQ,
    DOTDOT,
    DOTDOTDOT,
    MOD_SEP,
    LARROW,
    RARROW,
    FAT_ARROW,
    LIT_BYTE,
    LIT_CHAR,
    LIT_INTEGER,
    LIT_FLOAT,
    LIT_STR,
    LIT_STR_RAW,
    LIT_BYTE_STR,
    LIT_BYTE_STR_RAW,
    IDENT,
    UNDERSCORE,
    LIFETIME,

    // keywords
    SELF,
    STATIC,
    ABSTRACT,
    ALIGNOF,
    AS,
    BECOME,
    BREAK,
    CATCH,
    CRATE,
    DEFAULT,
    DO,
    ELSE,
    ENUM,
    EXTERN,
    FALSE,
    FINAL,
    FN,
    FOR,
    IF,
    IMPL,
    IN,
    LET,
    LOOP,
    MACRO,
    MATCH,
    MOD,
    MOVE,
    MUT,
    OFFSETOF,
    OVERRIDE,
    PRIV,
    PUB,
    PURE,
    REF,
    RETURN,
    SIZEOF,
    STRUCT,
    SUPER,
    UNION,
    TRUE,
    TRAIT,
    TYPE,
    UNSAFE,
    UNSIZED,
    USE,
    VIRTUAL,
    WHILE,
    YIELD,
    CONTINUE,
    PROC,
    BOX,
    CONST,
    WHERE,
    TYPEOF,
    INNER_DOC_COMMENT,
    OUTER_DOC_COMMENT,

    SHEBANG,
    SHEBANG_LINE,
    STATIC_LIFETIME
};
@@ -410,7 +410,7 @@ fn string(&mut self, start: usize) -> &'a str {
&self.input[start..self.input.len()]
}

/// Parses an Argument structure, or what's contained within braces inside the format string
/// Parses an `Argument` structure, or what's contained within braces inside the format string.
fn argument(&mut self) -> Argument<'a> {
let pos = self.position();
let format = self.format();
@@ -464,7 +464,7 @@ fn position(&mut self) -> Option<Position> {
}

/// Parses a format specifier at the current position, returning all of the
/// relevant information in the FormatSpec struct.
/// relevant information in the `FormatSpec` struct.
fn format(&mut self) -> FormatSpec<'a> {
let mut spec = FormatSpec {
fill: None,
@@ -571,7 +571,7 @@ fn format(&mut self) -> FormatSpec<'a> {
spec
}

/// Parses a Count parameter at the current position. This does not check
/// Parses a `Count` parameter at the current position. This does not check
/// for 'CountIsNextParam' because that is only used in precision, not
/// width.
fn count(&mut self, start: usize) -> (Count, Option<InnerSpan>) {

@@ -988,10 +988,12 @@ fn lower_attr(&mut self, attr: &Attribute) -> Attribute {
// lower attributes (we use the AST version) there is nowhere to keep
// the `HirId`s. We don't actually need HIR version of attributes anyway.
Attribute {
item: AttrItem {
path: attr.path.clone(),
tokens: self.lower_token_stream(attr.tokens.clone()),
},
id: attr.id,
style: attr.style,
path: attr.path.clone(),
tokens: self.lower_token_stream(attr.tokens.clone()),
is_sugared_doc: attr.is_sugared_doc,
span: attr.span,
}

@@ -196,6 +196,11 @@ fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHas
}
}

impl_stable_hash_for!(struct ::syntax::ast::AttrItem {
path,
tokens,
});

impl<'a> HashStable<StableHashingContext<'a>> for ast::Attribute {
fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHasher) {
// Make sure that these have been filtered out.
@@ -203,19 +208,15 @@ fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHas
debug_assert!(!self.is_sugared_doc);

let ast::Attribute {
ref item,
id: _,
style,
ref path,
ref tokens,
is_sugared_doc: _,
span,
} = *self;

item.hash_stable(hcx, hasher);
style.hash_stable(hcx, hasher);
path.hash_stable(hcx, hasher);
for tt in tokens.trees() {
tt.hash_stable(hcx, hasher);
}
span.hash_stable(hcx, hasher);
}
}
@@ -23,7 +23,7 @@
use crate::ty::subst::{GenericArg, InternalSubsts, SubstsRef};
use crate::ty::{self, GenericParamDefKind, Ty, TyCtxt, InferConst};
use crate::ty::{FloatVid, IntVid, TyVid, ConstVid};
use crate::util::nodemap::FxHashMap;
use crate::util::nodemap::{FxHashMap, FxHashSet};

use errors::DiagnosticBuilder;
use rustc_data_structures::sync::Lrc;
@@ -155,6 +155,8 @@ pub struct InferCtxt<'a, 'tcx> {
/// avoid reporting the same error twice.
pub reported_trait_errors: RefCell<FxHashMap<Span, Vec<ty::Predicate<'tcx>>>>,

pub reported_closure_mismatch: RefCell<FxHashSet<(Span, Option<Span>)>>,

/// When an error occurs, we want to avoid reporting "derived"
/// errors that are due to this original failure. Normally, we
/// handle this with the `err_count_on_creation` count, which
@@ -538,6 +540,7 @@ pub fn enter<R>(&mut self, f: impl for<'a> FnOnce(InferCtxt<'a, 'tcx>) -> R) ->
selection_cache: Default::default(),
evaluation_cache: Default::default(),
reported_trait_errors: Default::default(),
reported_closure_mismatch: Default::default(),
tainted_by_errors_flag: Cell::new(false),
err_count_on_creation: tcx.sess.err_count(),
in_snapshot: Cell::new(false),
@@ -24,7 +24,7 @@
use crate::infer::{self, InferCtxt};
use crate::infer::type_variable::{TypeVariableOrigin, TypeVariableOriginKind};
use crate::session::DiagnosticMessageId;
use crate::ty::{self, AdtKind, ToPredicate, ToPolyTraitRef, Ty, TyCtxt, TypeFoldable};
use crate::ty::{self, AdtKind, DefIdTree, ToPredicate, ToPolyTraitRef, Ty, TyCtxt, TypeFoldable};
use crate::ty::GenericParamDefKind;
use crate::ty::error::ExpectedFound;
use crate::ty::fast_reject;
@@ -37,7 +37,7 @@
use std::fmt;
use syntax::ast;
use syntax::symbol::{sym, kw};
use syntax_pos::{DUMMY_SP, Span, ExpnKind};
use syntax_pos::{DUMMY_SP, Span, ExpnKind, MultiSpan};

impl<'a, 'tcx> InferCtxt<'a, 'tcx> {
pub fn report_fulfillment_errors(
@@ -550,7 +550,8 @@ pub fn report_overflow_error<T>(&self,
self.suggest_new_overflow_limit(&mut err);
}

self.note_obligation_cause(&mut err, obligation);
self.note_obligation_cause_code(&mut err, &obligation.predicate, &obligation.cause.code,
&mut vec![]);

err.emit();
self.tcx.sess.abort_if_errors();
@@ -885,6 +886,14 @@ pub fn report_selection_error(
self.tcx.hir().span_if_local(did)
).map(|sp| self.tcx.sess.source_map().def_span(sp)); // the sp could be an fn def

if self.reported_closure_mismatch.borrow().contains(&(span, found_span)) {
// We check closures twice, with obligations flowing in different directions,
// but we want to complain about them only once.
return;
}

self.reported_closure_mismatch.borrow_mut().insert((span, found_span));

let found = match found_trait_ref.skip_binder().substs.type_at(1).kind {
ty::Tuple(ref tys) => vec![ArgKind::empty(); tys.len()],
_ => vec![ArgKind::empty()],
@@ -940,7 +949,9 @@ pub fn report_selection_error(
bug!("overflow should be handled before the `report_selection_error` path");
}
};

self.note_obligation_cause(&mut err, obligation);

err.emit();
}
@@ -1604,15 +1615,165 @@ fn fold_ty(&mut self, ty: Ty<'tcx>) -> Ty<'tcx> {
})
}

fn note_obligation_cause<T>(&self,
err: &mut DiagnosticBuilder<'_>,
obligation: &Obligation<'tcx, T>)
where T: fmt::Display
{
self.note_obligation_cause_code(err,
&obligation.predicate,
&obligation.cause.code,
&mut vec![]);
fn note_obligation_cause(
&self,
err: &mut DiagnosticBuilder<'_>,
obligation: &PredicateObligation<'tcx>,
) {
// First, attempt to add note to this error with an async-await-specific
// message, and fall back to regular note otherwise.
if !self.note_obligation_cause_for_async_await(err, obligation) {
self.note_obligation_cause_code(err, &obligation.predicate, &obligation.cause.code,
&mut vec![]);
}
}

/// Adds an async-await specific note to the diagnostic:
///
/// ```ignore (diagnostic)
/// note: future does not implement `std::marker::Send` because this value is used across an
/// await
/// --> $DIR/issue-64130-non-send-future-diags.rs:15:5
/// |
/// LL | let g = x.lock().unwrap();
/// | - has type `std::sync::MutexGuard<'_, u32>`
/// LL | baz().await;
/// | ^^^^^^^^^^^ await occurs here, with `g` maybe used later
/// LL | }
/// | - `g` is later dropped here
/// ```
///
/// Returns `true` if an async-await specific note was added to the diagnostic.
fn note_obligation_cause_for_async_await(
&self,
err: &mut DiagnosticBuilder<'_>,
obligation: &PredicateObligation<'tcx>,
) -> bool {
debug!("note_obligation_cause_for_async_await: obligation.predicate={:?} \
obligation.cause.span={:?}", obligation.predicate, obligation.cause.span);
let source_map = self.tcx.sess.source_map();

// Look into the obligation predicate to determine the type in the generator which meant
// that the predicate was not satisfied.
let (trait_ref, target_ty) = match obligation.predicate {
ty::Predicate::Trait(trait_predicate) =>
(trait_predicate.skip_binder().trait_ref, trait_predicate.skip_binder().self_ty()),
_ => return false,
};
debug!("note_obligation_cause_for_async_await: target_ty={:?}", target_ty);

// Attempt to detect an async-await error by looking at the obligation causes, looking
// for only generators, generator witnesses, opaque types or `std::future::GenFuture` to
// be present.
//
// When a future does not implement a trait because of a captured type in one of the
// generators somewhere in the call stack, then the result is a chain of obligations.
// Given an `async fn` A that calls an `async fn` B which captures a non-send type and that
// future is passed as an argument to a function C which requires a `Send` type, then the
// chain looks something like this:
//
// - `BuiltinDerivedObligation` with a generator witness (B)
// - `BuiltinDerivedObligation` with a generator (B)
// - `BuiltinDerivedObligation` with `std::future::GenFuture` (B)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (B)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (B)
// - `BuiltinDerivedObligation` with a generator witness (A)
// - `BuiltinDerivedObligation` with a generator (A)
// - `BuiltinDerivedObligation` with `std::future::GenFuture` (A)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (A)
// - `BuiltinDerivedObligation` with `impl std::future::Future` (A)
// - `BindingObligation` with `impl_send` (`Send` requirement)
//
// The first obligations in the chain can be used to get the details of the type that is
// captured but the entire chain must be inspected to detect this case.
let mut generator = None;
let mut next_code = Some(&obligation.cause.code);
while let Some(code) = next_code {
debug!("note_obligation_cause_for_async_await: code={:?}", code);
match code {
ObligationCauseCode::BuiltinDerivedObligation(derived_obligation) |
ObligationCauseCode::ImplDerivedObligation(derived_obligation) => {
debug!("note_obligation_cause_for_async_await: self_ty.kind={:?}",
derived_obligation.parent_trait_ref.self_ty().kind);
match derived_obligation.parent_trait_ref.self_ty().kind {
ty::Adt(ty::AdtDef { did, .. }, ..) if
self.tcx.is_diagnostic_item(sym::gen_future, *did) => {},
ty::Generator(did, ..) => generator = generator.or(Some(did)),
ty::GeneratorWitness(_) | ty::Opaque(..) => {},
_ => return false,
}

next_code = Some(derived_obligation.parent_code.as_ref());
},
ObligationCauseCode::ItemObligation(_) | ObligationCauseCode::BindingObligation(..)
if generator.is_some() => break,
_ => return false,
}
}

let generator_did = generator.expect("can only reach this if there was a generator");

// Only continue to add a note if the generator is from an `async` function.
let parent_node = self.tcx.parent(generator_did)
.and_then(|parent_did| self.tcx.hir().get_if_local(parent_did));
debug!("note_obligation_cause_for_async_await: parent_node={:?}", parent_node);
if let Some(hir::Node::Item(hir::Item {
kind: hir::ItemKind::Fn(_, header, _, _),
..
})) = parent_node {
debug!("note_obligation_cause_for_async_await: header={:?}", header);
if header.asyncness != hir::IsAsync::Async {
return false;
}
}

let span = self.tcx.def_span(generator_did);
let tables = self.tcx.typeck_tables_of(generator_did);
debug!("note_obligation_cause_for_async_await: generator_did={:?} span={:?} ",
generator_did, span);

// Look for a type inside the generator interior that matches the target type to get
// a span.
let target_span = tables.generator_interior_types.iter()
.find(|ty::GeneratorInteriorTypeCause { ty, .. }| ty::TyS::same_type(*ty, target_ty))
.map(|ty::GeneratorInteriorTypeCause { span, scope_span, .. }|
(span, source_map.span_to_snippet(*span), scope_span));
if let Some((target_span, Ok(snippet), scope_span)) = target_span {
// Look at the last interior type to get a span for the `.await`.
let await_span = tables.generator_interior_types.iter().map(|i| i.span).last().unwrap();
let mut span = MultiSpan::from_span(await_span);
span.push_span_label(
await_span, format!("await occurs here, with `{}` maybe used later", snippet));

span.push_span_label(*target_span, format!("has type `{}`", target_ty));

// If available, use the scope span to annotate the drop location.
if let Some(scope_span) = scope_span {
span.push_span_label(
source_map.end_point(*scope_span),
format!("`{}` is later dropped here", snippet),
);
}

err.span_note(span, &format!(
"future does not implement `{}` as this value is used across an await",
trait_ref,
));

// Add a note for the item obligation that remains - normally a note pointing to the
// bound that introduced the obligation (e.g. `T: Send`).
debug!("note_obligation_cause_for_async_await: next_code={:?}", next_code);
self.note_obligation_cause_code(
err,
&obligation.predicate,
next_code.unwrap(),
&mut Vec::new(),
);

true
} else {
false
}
}

fn note_obligation_cause_code<T>(&self,
@@ -288,6 +288,34 @@ pub struct ResolvedOpaqueTy<'tcx> {
pub substs: SubstsRef<'tcx>,
}

/// Whenever a value may be live across a generator yield, the type of that value winds up in the
/// `GeneratorInteriorTypeCause` struct. This struct adds additional information about such
/// captured types that can be useful for diagnostics. In particular, it stores the span that
/// caused a given type to be recorded, along with the scope that enclosed the value (which can
/// be used to find the await that the value is live across).
///
/// For example:
///
/// ```ignore (pseudo-Rust)
/// async move {
/// let x: T = ...;
/// foo.await
/// ...
/// }
/// ```
///
/// Here, we would store the type `T`, the span of the value `x`, and the "scope-span" for
/// the scope that contains `x`.
#[derive(RustcEncodable, RustcDecodable, Clone, Debug, Eq, Hash, HashStable, PartialEq)]
pub struct GeneratorInteriorTypeCause<'tcx> {
/// Type of the captured binding.
pub ty: Ty<'tcx>,
/// Span of the binding that was captured.
pub span: Span,
/// Span of the scope of the captured binding.
pub scope_span: Option<Span>,
}

#[derive(RustcEncodable, RustcDecodable, Debug)]
pub struct TypeckTables<'tcx> {
/// The HirId::owner all ItemLocalIds in this table are relative to.
@@ -397,6 +425,10 @@ pub struct TypeckTables<'tcx> {
/// leading to the member of the struct or tuple that is used instead of the
/// entire variable.
pub upvar_list: ty::UpvarListMap,

/// Stores the type, span and optional scope span of all types
/// that are live across the yield of this generator (if a generator).
pub generator_interior_types: Vec<GeneratorInteriorTypeCause<'tcx>>,
}

impl<'tcx> TypeckTables<'tcx> {
@@ -422,6 +454,7 @@ pub fn empty(local_id_root: Option<DefId>) -> TypeckTables<'tcx> {
free_region_map: Default::default(),
concrete_opaque_types: Default::default(),
upvar_list: Default::default(),
generator_interior_types: Default::default(),
}
}

@@ -729,6 +762,7 @@ fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHas
ref free_region_map,
ref concrete_opaque_types,
ref upvar_list,
ref generator_interior_types,

} = *self;

@@ -773,6 +807,7 @@ fn hash_stable(&self, hcx: &mut StableHashingContext<'a>, hasher: &mut StableHas
free_region_map.hash_stable(hcx, hasher);
concrete_opaque_types.hash_stable(hcx, hasher);
upvar_list.hash_stable(hcx, hasher);
generator_interior_types.hash_stable(hcx, hasher);
})
}
}

@@ -75,7 +75,7 @@
pub use self::binding::BindingMode::*;

pub use self::context::{TyCtxt, FreeRegionInfo, AllArenas, tls, keep_local};
pub use self::context::{Lift, TypeckTables, CtxtInterners, GlobalCtxt};
pub use self::context::{Lift, GeneratorInteriorTypeCause, TypeckTables, CtxtInterners, GlobalCtxt};
pub use self::context::{
UserTypeAnnotationIndex, UserType, CanonicalUserType,
CanonicalUserTypeAnnotation, CanonicalUserTypeAnnotations, ResolvedOpaqueTy,
+237 -271
@@ -99,8 +99,8 @@ fn new(
// ```

let mut m = Margin {
whitespace_left: if whitespace_left >= 6 { whitespace_left - 6 } else { 0 },
span_left: if span_left >= 6 { span_left - 6 } else { 0 },
whitespace_left: whitespace_left.saturating_sub(6),
span_left: span_left.saturating_sub(6),
span_right: span_right + 6,
computed_left: 0,
computed_right: 0,
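The hunk above swaps manual underflow guards for `saturating_sub`, which clamps at zero instead of wrapping or panicking on `usize` underflow. This is not part of the commit; it is a minimal standalone sketch of the equivalence, with an illustrative closure standing in for the removed code:

```rust
fn main() {
    // The guard pattern removed by the diff.
    let manual = |x: usize| if x >= 6 { x - 6 } else { 0 };
    // `saturating_sub` computes the same value in one call.
    for x in [0usize, 5, 6, 7, 100] {
        assert_eq!(manual(x), x.saturating_sub(6));
    }
    println!("ok");
}
```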
@@ -125,7 +125,7 @@ fn was_cut_right(&self, line_len: usize) -> bool {
} else {
self.computed_right
};
right < line_len && line_len > self.computed_left + self.column_width
right < line_len && self.computed_left + self.column_width < line_len
}

fn compute(&mut self, max_line_len: usize) {
@@ -167,12 +167,10 @@ fn left(&self, line_len: usize) -> usize {
}

fn right(&self, line_len: usize) -> usize {
if max(line_len, self.computed_left) - self.computed_left <= self.column_width {
line_len
} else if self.computed_right > line_len {
if line_len.saturating_sub(self.computed_left) <= self.column_width {
line_len
} else {
self.computed_right
min(line_len, self.computed_right)
}
}
}
@@ -297,81 +295,82 @@ fn fix_multispan_in_std_macros(&self,
source_map: &Option<Lrc<SourceMapperDyn>>,
span: &mut MultiSpan,
always_backtrace: bool) -> bool {
let mut spans_updated = false;
let sm = match source_map {
Some(ref sm) => sm,
None => return false,
};

if let Some(ref sm) = source_map {
let mut before_after: Vec<(Span, Span)> = vec![];
let mut new_labels: Vec<(Span, String)> = vec![];
let mut before_after: Vec<(Span, Span)> = vec![];
let mut new_labels: Vec<(Span, String)> = vec![];

// First, find all the spans in <*macros> and point instead at their use site
for sp in span.primary_spans() {
if sp.is_dummy() {
// First, find all the spans in <*macros> and point instead at their use site
for sp in span.primary_spans() {
if sp.is_dummy() {
continue;
}
let call_sp = sm.call_span_if_macro(*sp);
if call_sp != *sp && !always_backtrace {
before_after.push((*sp, call_sp));
}
let backtrace_len = sp.macro_backtrace().len();
for (i, trace) in sp.macro_backtrace().iter().rev().enumerate() {
// Only show macro locations that are local
// and display them like a span_note
if trace.def_site_span.is_dummy() {
continue;
}
let call_sp = sm.call_span_if_macro(*sp);
if call_sp != *sp && !always_backtrace {
before_after.push((*sp, call_sp));
if always_backtrace {
new_labels.push((trace.def_site_span,
format!("in this expansion of `{}`{}",
trace.macro_decl_name,
if backtrace_len > 2 {
// if backtrace_len == 1 it'll be pointed
// at by "in this macro invocation"
format!(" (#{})", i + 1)
} else {
String::new()
})));
}
let backtrace_len = sp.macro_backtrace().len();
for (i, trace) in sp.macro_backtrace().iter().rev().enumerate() {
// Only show macro locations that are local
// and display them like a span_note
if trace.def_site_span.is_dummy() {
continue;
}
if always_backtrace {
new_labels.push((trace.def_site_span,
format!("in this expansion of `{}`{}",
trace.macro_decl_name,
if backtrace_len > 2 {
// if backtrace_len == 1 it'll be pointed
// at by "in this macro invocation"
format!(" (#{})", i + 1)
} else {
String::new()
})));
}
// Check to make sure we're not in any <*macros>
if !sm.span_to_filename(trace.def_site_span).is_macros() &&
!trace.macro_decl_name.starts_with("desugaring of ") &&
!trace.macro_decl_name.starts_with("#[") ||
always_backtrace {
new_labels.push((trace.call_site,
format!("in this macro invocation{}",
if backtrace_len > 2 && always_backtrace {
// only specify order when the macro
// backtrace is multiple levels deep
format!(" (#{})", i + 1)
} else {
String::new()
})));
if !always_backtrace {
break;
}
// Check to make sure we're not in any <*macros>
if !sm.span_to_filename(trace.def_site_span).is_macros() &&
!trace.macro_decl_name.starts_with("desugaring of ") &&
!trace.macro_decl_name.starts_with("#[") ||
always_backtrace {
new_labels.push((trace.call_site,
format!("in this macro invocation{}",
if backtrace_len > 2 && always_backtrace {
// only specify order when the macro
// backtrace is multiple levels deep
format!(" (#{})", i + 1)
} else {
String::new()
})));
if !always_backtrace {
break;
}
}
}
for (label_span, label_text) in new_labels {
span.push_span_label(label_span, label_text);
}
for (label_span, label_text) in new_labels {
span.push_span_label(label_span, label_text);
}
for sp_label in span.span_labels() {
if sp_label.span.is_dummy() {
continue;
}
for sp_label in span.span_labels() {
if sp_label.span.is_dummy() {
continue;
}
if sm.span_to_filename(sp_label.span.clone()).is_macros() &&
!always_backtrace
{
let v = sp_label.span.macro_backtrace();
if let Some(use_site) = v.last() {
before_after.push((sp_label.span.clone(), use_site.call_site.clone()));
}
if sm.span_to_filename(sp_label.span.clone()).is_macros() &&
!always_backtrace
{
let v = sp_label.span.macro_backtrace();
if let Some(use_site) = v.last() {
before_after.push((sp_label.span.clone(), use_site.call_site.clone()));
}
}
// After we have them, make sure we replace these 'bad' def sites with their use sites
for (before, after) in before_after {
span.replace(before, after);
spans_updated = true;
}
}
// After we have them, make sure we replace these 'bad' def sites with their use sites
let spans_updated = !before_after.is_empty();
for (before, after) in before_after {
span.replace(before, after);
}

spans_updated
@@ -593,9 +592,9 @@ fn render_source_line(

let left = margin.left(source_string.len()); // Left trim
// Account for unicode characters of width !=0 that were removed.
let left = source_string.chars().take(left).fold(0, |acc, ch| {
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1)
});
let left = source_string.chars().take(left)
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
.sum();

self.draw_line(
buffer,
@@ -623,18 +622,16 @@ fn render_source_line(
// 3 | |
// 4 | | }
// | |_^ test
if line.annotations.len() == 1 {
if let Some(ref ann) = line.annotations.get(0) {
if let AnnotationType::MultilineStart(depth) = ann.annotation_type {
if source_string.chars().take(ann.start_col).all(|c| c.is_whitespace()) {
let style = if ann.is_primary {
Style::UnderlinePrimary
} else {
Style::UnderlineSecondary
};
buffer.putc(line_offset, width_offset + depth - 1, '/', style);
return vec![(depth, style)];
}
if let [ann] = &line.annotations[..] {
if let AnnotationType::MultilineStart(depth) = ann.annotation_type {
if source_string.chars().take(ann.start_col).all(|c| c.is_whitespace()) {
let style = if ann.is_primary {
Style::UnderlinePrimary
} else {
Style::UnderlineSecondary
};
buffer.putc(line_offset, width_offset + depth - 1, '/', style);
return vec![(depth, style)];
}
}
}
@@ -763,11 +760,7 @@ fn render_source_line(
annotations_position.push((p, annotation));
for (j, next) in annotations.iter().enumerate() {
if j > i {
let l = if let Some(ref label) = next.label {
label.len() + 2
} else {
0
};
let l = next.label.as_ref().map_or(0, |label| label.len() + 2);
if (overlaps(next, annotation, l) // Do not allow two labels to be in the same
// line if they overlap including padding, to
// avoid situations like:
@@ -797,9 +790,7 @@ fn render_source_line(
}
}
}
if line_len < p {
line_len = p;
}
line_len = max(line_len, p);
}

if line_len != 0 {
@@ -941,17 +932,9 @@ fn render_source_line(
Style::LabelSecondary
};
let (pos, col) = if pos == 0 {
(pos + 1, if annotation.end_col + 1 > left {
annotation.end_col + 1 - left
} else {
0
})
(pos + 1, (annotation.end_col + 1).saturating_sub(left))
} else {
(pos + 2, if annotation.start_col > left {
annotation.start_col - left
} else {
0
})
(pos + 2, annotation.start_col.saturating_sub(left))
};
if let Some(ref label) = annotation.label {
buffer.puts(line_offset + pos, code_offset + col, &label, style);
@@ -966,9 +949,9 @@ fn render_source_line(
// | | |
// | | something about `foo`
// | something about `fn foo()`
annotations_position.sort_by(|a, b| {
// Decreasing order. When `a` and `b` are the same length, prefer `Primary`.
(a.1.len(), !a.1.is_primary).cmp(&(b.1.len(), !b.1.is_primary)).reverse()
annotations_position.sort_by_key(|(_, ann)| {
// Decreasing order. When annotations share the same length, prefer `Primary`.
(Reverse(ann.len()), ann.is_primary)
});
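The hunk above replaces a hand-written reversed comparator with `sort_by_key` over a `(Reverse(len), is_primary)` tuple. This sketch is not part of the commit; it uses plain `(len, is_primary)` tuples as stand-ins for the real annotation type to show the resulting order (decreasing length; on equal lengths, `false` sorts before `true`):

```rust
use std::cmp::Reverse;

fn main() {
    // Illustrative stand-ins for annotations: (length, is_primary).
    let mut anns = vec![(3, false), (5, false), (3, true), (1, true)];
    // Same key shape as the refactored code: `Reverse` flips the length
    // ordering, and the bool breaks ties.
    anns.sort_by_key(|&(len, is_primary)| (Reverse(len), is_primary));
    assert_eq!(anns, vec![(5, false), (3, false), (3, true), (1, true)]);
    println!("ok");
}
```

A tuple key with `Reverse` avoids the easy-to-misread `.cmp(..).reverse()` comparator while producing the same ordering.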

// Write the underlines.
@@ -991,11 +974,7 @@ fn render_source_line(
for p in annotation.start_col..annotation.end_col {
buffer.putc(
line_offset + 1,
if code_offset + p > left {
code_offset + p - left
} else {
0
},
(code_offset + p).saturating_sub(left),
underline,
style,
);
@@ -1018,40 +997,36 @@ fn render_source_line(
}

fn get_multispan_max_line_num(&mut self, msp: &MultiSpan) -> usize {
let sm = match self.sm {
Some(ref sm) => sm,
None => return 0,
};

let mut max = 0;
if let Some(ref sm) = self.sm {
for primary_span in msp.primary_spans() {
if !primary_span.is_dummy() {
let hi = sm.lookup_char_pos(primary_span.hi());
if hi.line > max {
max = hi.line;
}
}
for primary_span in msp.primary_spans() {
if !primary_span.is_dummy() {
let hi = sm.lookup_char_pos(primary_span.hi());
max = (hi.line).max(max);
}
if !self.short_message {
for span_label in msp.span_labels() {
if !span_label.span.is_dummy() {
let hi = sm.lookup_char_pos(span_label.span.hi());
if hi.line > max {
max = hi.line;
}
}
}
if !self.short_message {
for span_label in msp.span_labels() {
if !span_label.span.is_dummy() {
let hi = sm.lookup_char_pos(span_label.span.hi());
max = (hi.line).max(max);
}
}
}

max
}

fn get_max_line_num(&mut self, span: &MultiSpan, children: &[SubDiagnostic]) -> usize {

let primary = self.get_multispan_max_line_num(span);
let mut max = primary;

for sub in children {
let sub_result = self.get_multispan_max_line_num(&sub.span);
max = std::cmp::max(sub_result, max);
}
max
children.iter()
.map(|sub| self.get_multispan_max_line_num(&sub.span))
.max()
.unwrap_or(primary)
}
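The `get_max_line_num` refactor above folds the loop into `Iterator::max().unwrap_or(primary)`. One subtlety worth noting: the removed loop seeded the running maximum with `primary`, while the iterator form only falls back to `primary` when `children` is empty, so the two differ when every child value is smaller than `primary`. A standalone sketch, not part of the commit, with plain integers standing in for the span line numbers:

```rust
fn main() {
    let primary = 10usize;

    // Non-empty children: the iterator form ignores `primary` entirely.
    let children = vec![3usize, 42, 7];
    assert_eq!(children.iter().copied().max().unwrap_or(primary), 42);

    // Empty children: `max()` yields None, so `unwrap_or` supplies the fallback.
    let empty: Vec<usize> = vec![];
    assert_eq!(empty.iter().copied().max().unwrap_or(primary), 10);

    // The removed loop would have returned max(primary, children...),
    // which differs when all children are below `primary`.
    let small = vec![1usize, 2];
    assert_eq!(small.iter().copied().max().unwrap_or(primary), 2);
    println!("ok");
}
```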

/// Adds a left margin to every line but the first, given a padding length and the label being
@@ -1081,14 +1056,12 @@ fn msg_to_buffer(&self,
// `max_line_num_len`
let padding = " ".repeat(padding + label.len() + 5);

/// Returns `true` if `style`, or the override if present and the style is `NoStyle`.
fn style_or_override(style: Style, override_style: Option<Style>) -> Style {
if let Some(o) = override_style {
if style == Style::NoStyle {
return o;
}
/// Returns `override` if it is present and `style` is `NoStyle` or `style` otherwise
fn style_or_override(style: Style, override_: Option<Style>) -> Style {
match (style, override_) {
(Style::NoStyle, Some(override_)) => override_,
_ => style,
}
style
}

let mut line_number = 0;
@@ -1324,13 +1297,12 @@ fn emit_message_default(
for line in &annotated_file.lines {
max_line_len = max(max_line_len, annotated_file.file
.get_line(line.line_index - 1)
.map(|s| s.len())
.unwrap_or(0));
.map_or(0, |s| s.len()));
for ann in &line.annotations {
span_right_margin = max(span_right_margin, ann.start_col);
span_right_margin = max(span_right_margin, ann.end_col);
// FIXME: account for labels not in the same line
let label_right = ann.label.as_ref().map(|l| l.len() + 1).unwrap_or(0);
let label_right = ann.label.as_ref().map_or(0, |l| l.len() + 1);
label_right_margin = max(label_right_margin, ann.end_col + label_right);
}
}
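Several hunks above collapse `map(...).unwrap_or(default)` into the single call `map_or(default, ...)`; the two are equivalent for any `Option`. A minimal standalone sketch, not part of the commit, with an illustrative label value:

```rust
fn main() {
    // A present label: both forms add 1 to its length.
    let label: Option<&str> = Some("expected `u32`");
    assert_eq!(
        label.map(|l| l.len() + 1).unwrap_or(0),
        label.map_or(0, |l| l.len() + 1),
    );

    // A missing label: both forms yield the default.
    let none: Option<&str> = None;
    assert_eq!(none.map_or(0, |l| l.len() + 1), 0);
    println!("ok");
}
```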
@@ -1459,122 +1431,125 @@ fn emit_suggestion_default(
level: &Level,
max_line_num_len: usize,
) -> io::Result<()> {
if let Some(ref sm) = self.sm {
let mut buffer = StyledBuffer::new();
let sm = match self.sm {
Some(ref sm) => sm,
None => return Ok(())
};

// Render the suggestion message
let level_str = level.to_string();
if !level_str.is_empty() {
buffer.append(0, &level_str, Style::Level(level.clone()));
buffer.append(0, ": ", Style::HeaderMsg);
let mut buffer = StyledBuffer::new();

// Render the suggestion message
let level_str = level.to_string();
if !level_str.is_empty() {
buffer.append(0, &level_str, Style::Level(level.clone()));
buffer.append(0, ": ", Style::HeaderMsg);
}
self.msg_to_buffer(
&mut buffer,
&[(suggestion.msg.to_owned(), Style::NoStyle)],
max_line_num_len,
"suggestion",
Some(Style::HeaderMsg),
);

// Render the replacements for each suggestion
let suggestions = suggestion.splice_lines(&**sm);

let mut row_num = 2;
for &(ref complete, ref parts) in suggestions.iter().take(MAX_SUGGESTIONS) {
// Only show underline if the suggestion spans a single line and doesn't cover the
// entirety of the code output. If you have multiple replacements in the same line
// of code, show the underline.
let show_underline = !(parts.len() == 1
&& parts[0].snippet.trim() == complete.trim())
&& complete.lines().count() == 1;

let lines = sm.span_to_lines(parts[0].span).unwrap();

assert!(!lines.lines.is_empty());

let line_start = sm.lookup_char_pos(parts[0].span.lo()).line;
draw_col_separator_no_space(&mut buffer, 1, max_line_num_len + 1);
let mut line_pos = 0;
let mut lines = complete.lines();
for line in lines.by_ref().take(MAX_HIGHLIGHT_LINES) {
// Print the span column to avoid confusion
buffer.puts(row_num,
0,
&self.maybe_anonymized(line_start + line_pos),
Style::LineNumber);
// print the suggestion
draw_col_separator(&mut buffer, row_num, max_line_num_len + 1);
buffer.append(row_num, line, Style::NoStyle);
line_pos += 1;
row_num += 1;
}
self.msg_to_buffer(
&mut buffer,
&[(suggestion.msg.to_owned(), Style::NoStyle)],
max_line_num_len,
"suggestion",
Some(Style::HeaderMsg),
);

// Render the replacements for each suggestion
let suggestions = suggestion.splice_lines(&**sm);
// This offset and the ones below need to be signed to account for replacement code
// that is shorter than the original code.
let mut offset: isize = 0;
// Only show an underline in the suggestions if the suggestion is not the
// entirety of the code being shown and the displayed code is not multiline.
if show_underline {
draw_col_separator(&mut buffer, row_num, max_line_num_len + 1);
for part in parts {
let span_start_pos = sm.lookup_char_pos(part.span.lo()).col_display;
let span_end_pos = sm.lookup_char_pos(part.span.hi()).col_display;

let mut row_num = 2;
for &(ref complete, ref parts) in suggestions.iter().take(MAX_SUGGESTIONS) {
// Only show underline if the suggestion spans a single line and doesn't cover the
|
||||
// entirety of the code output. If you have multiple replacements in the same line
|
||||
// of code, show the underline.
|
||||
let show_underline = !(parts.len() == 1
|
||||
&& parts[0].snippet.trim() == complete.trim())
|
||||
&& complete.lines().count() == 1;
|
||||
// Do not underline the leading...
|
||||
let start = part.snippet.len()
|
||||
.saturating_sub(part.snippet.trim_start().len());
|
||||
// ...or trailing spaces. Account for substitutions containing unicode
|
||||
// characters.
|
||||
let sub_len: usize = part.snippet.trim().chars()
|
||||
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
|
||||
.sum();
|
||||
|
||||
let lines = sm.span_to_lines(parts[0].span).unwrap();
|
||||
|
||||
assert!(!lines.lines.is_empty());
|
||||
|
||||
let line_start = sm.lookup_char_pos(parts[0].span.lo()).line;
|
||||
draw_col_separator_no_space(&mut buffer, 1, max_line_num_len + 1);
|
||||
let mut line_pos = 0;
|
||||
let mut lines = complete.lines();
|
||||
for line in lines.by_ref().take(MAX_HIGHLIGHT_LINES) {
|
||||
// Print the span column to avoid confusion
|
||||
buffer.puts(row_num,
|
||||
0,
|
||||
&self.maybe_anonymized(line_start + line_pos),
|
||||
Style::LineNumber);
|
||||
// print the suggestion
|
||||
draw_col_separator(&mut buffer, row_num, max_line_num_len + 1);
|
||||
buffer.append(row_num, line, Style::NoStyle);
|
||||
line_pos += 1;
|
||||
row_num += 1;
|
||||
}
|
||||
|
||||
// This offset and the ones below need to be signed to account for replacement code
|
||||
// that is shorter than the original code.
|
||||
let mut offset: isize = 0;
|
||||
// Only show an underline in the suggestions if the suggestion is not the
|
||||
// entirety of the code being shown and the displayed code is not multiline.
|
||||
if show_underline {
|
||||
draw_col_separator(&mut buffer, row_num, max_line_num_len + 1);
|
||||
for part in parts {
|
||||
let span_start_pos = sm.lookup_char_pos(part.span.lo()).col_display;
|
||||
let span_end_pos = sm.lookup_char_pos(part.span.hi()).col_display;
|
||||
|
||||
// Do not underline the leading...
|
||||
let start = part.snippet.len()
|
||||
.saturating_sub(part.snippet.trim_start().len());
|
||||
// ...or trailing spaces. Account for substitutions containing unicode
|
||||
// characters.
|
||||
let sub_len = part.snippet.trim().chars().fold(0, |acc, ch| {
|
||||
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1)
|
||||
});
|
||||
|
||||
let underline_start = (span_start_pos + start) as isize + offset;
|
||||
let underline_end = (span_start_pos + start + sub_len) as isize + offset;
|
||||
for p in underline_start..underline_end {
|
||||
let underline_start = (span_start_pos + start) as isize + offset;
|
||||
let underline_end = (span_start_pos + start + sub_len) as isize + offset;
|
||||
for p in underline_start..underline_end {
|
||||
buffer.putc(row_num,
|
||||
max_line_num_len + 3 + p as usize,
|
||||
'^',
|
||||
Style::UnderlinePrimary);
|
||||
}
|
||||
// underline removals too
|
||||
if underline_start == underline_end {
|
||||
for p in underline_start-1..underline_start+1 {
|
||||
buffer.putc(row_num,
|
||||
max_line_num_len + 3 + p as usize,
|
||||
'^',
|
||||
Style::UnderlinePrimary);
|
||||
'-',
|
||||
Style::UnderlineSecondary);
|
||||
}
|
||||
// underline removals too
|
||||
if underline_start == underline_end {
|
||||
for p in underline_start-1..underline_start+1 {
|
||||
buffer.putc(row_num,
|
||||
max_line_num_len + 3 + p as usize,
|
||||
'-',
|
||||
Style::UnderlineSecondary);
|
||||
}
|
||||
}
|
||||
|
||||
// length of the code after substitution
|
||||
let full_sub_len = part.snippet.chars().fold(0, |acc, ch| {
|
||||
acc + unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1) as isize
|
||||
});
|
||||
|
||||
// length of the code to be substituted
|
||||
let snippet_len = span_end_pos as isize - span_start_pos as isize;
|
||||
// For multiple substitutions, use the position *after* the previous
|
||||
// substitutions have happened.
|
||||
offset += full_sub_len - snippet_len;
|
||||
}
|
||||
row_num += 1;
|
||||
}
|
||||
|
||||
// if we elided some lines, add an ellipsis
|
||||
if lines.next().is_some() {
|
||||
buffer.puts(row_num, max_line_num_len - 1, "...", Style::LineNumber);
|
||||
} else if !show_underline {
|
||||
draw_col_separator_no_space(&mut buffer, row_num, max_line_num_len + 1);
|
||||
row_num += 1;
|
||||
// length of the code after substitution
|
||||
let full_sub_len = part.snippet.chars()
|
||||
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
|
||||
.sum::<usize>() as isize;
|
||||
|
||||
// length of the code to be substituted
|
||||
let snippet_len = span_end_pos as isize - span_start_pos as isize;
|
||||
// For multiple substitutions, use the position *after* the previous
|
||||
// substitutions have happened.
|
||||
offset += full_sub_len - snippet_len;
|
||||
}
|
||||
row_num += 1;
|
||||
}
|
||||
if suggestions.len() > MAX_SUGGESTIONS {
|
||||
let msg = format!("and {} other candidates", suggestions.len() - MAX_SUGGESTIONS);
|
||||
buffer.puts(row_num, 0, &msg, Style::NoStyle);
|
||||
|
||||
// if we elided some lines, add an ellipsis
|
||||
if lines.next().is_some() {
|
||||
buffer.puts(row_num, max_line_num_len - 1, "...", Style::LineNumber);
|
||||
} else if !show_underline {
|
||||
draw_col_separator_no_space(&mut buffer, row_num, max_line_num_len + 1);
|
||||
row_num += 1;
|
||||
}
|
||||
emit_to_destination(&buffer.render(), level, &mut self.dst, self.short_message)?;
|
||||
}
|
||||
if suggestions.len() > MAX_SUGGESTIONS {
|
||||
let msg = format!("and {} other candidates", suggestions.len() - MAX_SUGGESTIONS);
|
||||
buffer.puts(row_num, 0, &msg, Style::NoStyle);
|
||||
}
|
||||
emit_to_destination(&buffer.render(), level, &mut self.dst, self.short_message)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
@@ -1732,7 +1707,7 @@ fn add_annotation_to_file(file_vec: &mut Vec<FileWithAnnotatedLines>,
|
||||
hi.col_display += 1;
|
||||
}
|
||||
|
||||
let ann_type = if lo.line != hi.line {
|
||||
if lo.line != hi.line {
|
||||
let ml = MultilineAnnotation {
|
||||
depth: 1,
|
||||
line_start: lo.line,
|
||||
@@ -1740,34 +1715,27 @@ fn add_annotation_to_file(file_vec: &mut Vec<FileWithAnnotatedLines>,
|
||||
start_col: lo.col_display,
|
||||
end_col: hi.col_display,
|
||||
is_primary: span_label.is_primary,
|
||||
label: span_label.label.clone(),
|
||||
label: span_label.label,
|
||||
overlaps_exactly: false,
|
||||
};
|
||||
multiline_annotations.push((lo.file.clone(), ml.clone()));
|
||||
AnnotationType::Multiline(ml)
|
||||
multiline_annotations.push((lo.file, ml));
|
||||
} else {
|
||||
AnnotationType::Singleline
|
||||
};
|
||||
let ann = Annotation {
|
||||
start_col: lo.col_display,
|
||||
end_col: hi.col_display,
|
||||
is_primary: span_label.is_primary,
|
||||
label: span_label.label.clone(),
|
||||
annotation_type: ann_type,
|
||||
};
|
||||
|
||||
if !ann.is_multiline() {
|
||||
let ann = Annotation {
|
||||
start_col: lo.col_display,
|
||||
end_col: hi.col_display,
|
||||
is_primary: span_label.is_primary,
|
||||
label: span_label.label,
|
||||
annotation_type: AnnotationType::Singleline,
|
||||
};
|
||||
add_annotation_to_file(&mut output, lo.file, lo.line, ann);
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// Find overlapping multiline annotations, put them at different depths
|
||||
multiline_annotations.sort_by_key(|&(_, ref ml)| (ml.line_start, ml.line_end));
|
||||
for item in multiline_annotations.clone() {
|
||||
let ann = item.1;
|
||||
for item in multiline_annotations.iter_mut() {
|
||||
let ref mut a = item.1;
|
||||
for (_, ann) in multiline_annotations.clone() {
|
||||
for (_, a) in multiline_annotations.iter_mut() {
|
||||
// Move all other multiline annotations overlapping with this one
|
||||
// one level to the right.
|
||||
if !(ann.same_span(a)) &&
|
||||
@@ -1784,9 +1752,7 @@ fn add_annotation_to_file(file_vec: &mut Vec<FileWithAnnotatedLines>,
|
||||
|
||||
let mut max_depth = 0; // max overlapping multiline spans
|
||||
for (file, ann) in multiline_annotations {
|
||||
if ann.depth > max_depth {
|
||||
max_depth = ann.depth;
|
||||
}
|
||||
max_depth = max(max_depth, ann.depth);
|
||||
let mut end_ann = ann.as_end();
|
||||
if !ann.overlaps_exactly {
|
||||
// avoid output like
|
||||
|
||||
@@ -247,7 +247,7 @@ pub fn register_plugins<'a>(
rustc_incremental::prepare_session_directory(sess, &crate_name, disambiguator);

if sess.opts.incremental.is_some() {
time(sess, "garbage collect incremental cache directory", || {
time(sess, "garbage-collect incremental cache directory", || {
if let Err(e) = rustc_incremental::garbage_collect_session_directories(sess) {
warn!(
"Error while trying to garbage collect incremental \
@@ -318,7 +318,7 @@ fn configure_and_expand_inner<'a>(
crate_loader: &'a mut CrateLoader<'a>,
plugin_info: PluginInfo,
) -> Result<(ast::Crate, Resolver<'a>)> {
time(sess, "pre ast expansion lint checks", || {
time(sess, "pre-AST-expansion lint checks", || {
lint::check_ast_crate(
sess,
&krate,
@@ -536,8 +536,8 @@ pub fn lower_to_hir(
dep_graph: &DepGraph,
krate: &ast::Crate,
) -> Result<hir::map::Forest> {
// Lower ast -> hir
let hir_forest = time(sess, "lowering ast -> hir", || {
// Lower AST to HIR.
let hir_forest = time(sess, "lowering AST -> HIR", || {
let hir_crate = lower_crate(sess, cstore, &dep_graph, &krate, resolver);

if sess.opts.debugging_opts.hir_stats {
@@ -757,7 +757,7 @@ pub fn prepare_outputs(
if !only_dep_info {
if let Some(ref dir) = compiler.output_dir {
if fs::create_dir_all(dir).is_err() {
sess.err("failed to find or create the directory specified by --out-dir");
sess.err("failed to find or create the directory specified by `--out-dir`");
return Err(ErrorReported);
}
}
@@ -830,8 +830,8 @@ pub fn create_global_ctxt(
let global_ctxt: Option<GlobalCtxt<'_>>;
let arenas = AllArenas::new();

// Construct the HIR map
let hir_map = time(sess, "indexing hir", || {
// Construct the HIR map.
let hir_map = time(sess, "indexing HIR", || {
hir::map::map_crate(sess, cstore, &mut hir_forest, &defs)
});

@@ -942,7 +942,7 @@ fn analysis(tcx: TyCtxt<'_>, cnum: CrateNum) -> Result<()> {
tcx.par_body_owners(|def_id| tcx.ensure().mir_borrowck(def_id));
});

time(sess, "dumping chalk-like clauses", || {
time(sess, "dumping Chalk-like clauses", || {
rustc_traits::lowering::dump_program_clauses(tcx);
});

@@ -1,4 +1,3 @@
#![feature(proc_macro_hygiene)]
#![allow(rustc::default_hash_types)]

#![recursion_limit="128"]

@@ -952,8 +952,8 @@ pub fn is_unsized(&self, ast_bounds: &[hir::GenericBound], span: Span) -> bool {
tcx.sess.span_warn(
span,
"default bound relaxed for a type parameter, but \
this does nothing because the given bound is not \
a default. Only `?Sized` is supported",
this does nothing because the given bound is not \
a default; only `?Sized` is supported",
);
}
}

@@ -14,7 +14,7 @@

struct InteriorVisitor<'a, 'tcx> {
fcx: &'a FnCtxt<'a, 'tcx>,
types: FxHashMap<Ty<'tcx>, usize>,
types: FxHashMap<ty::GeneratorInteriorTypeCause<'tcx>, usize>,
region_scope_tree: &'tcx region::ScopeTree,
expr_count: usize,
kind: hir::GeneratorKind,
@@ -83,7 +83,12 @@ fn record(&mut self,
} else {
// Map the type to the number of types added before it
let entries = self.types.len();
self.types.entry(&ty).or_insert(entries);
let scope_span = scope.map(|s| s.span(self.fcx.tcx, self.region_scope_tree));
self.types.entry(ty::GeneratorInteriorTypeCause {
span: source_span,
ty: &ty,
scope_span
}).or_insert(entries);
}
} else {
debug!("no type in expr = {:?}, count = {:?}, span = {:?}",
@@ -118,8 +123,12 @@ pub fn resolve_interior<'a, 'tcx>(
// Sort types by insertion order
types.sort_by_key(|t| t.1);

// Store the generator types and spans into the tables for this generator.
let interior_types = types.iter().cloned().map(|t| t.0).collect::<Vec<_>>();
visitor.fcx.inh.tables.borrow_mut().generator_interior_types = interior_types;

// Extract type components
let type_list = fcx.tcx.mk_type_list(types.into_iter().map(|t| t.0));
let type_list = fcx.tcx.mk_type_list(types.into_iter().map(|t| (t.0).ty));

// The types in the generator interior contain lifetimes local to the generator itself,
// which should not be exposed outside of the generator. Therefore, we replace these

@@ -631,26 +631,30 @@ fn suggest_use_candidates(&self,
}
}

fn suggest_valid_traits(&self,
err: &mut DiagnosticBuilder<'_>,
valid_out_of_scope_traits: Vec<DefId>) -> bool {
fn suggest_valid_traits(
&self,
err: &mut DiagnosticBuilder<'_>,
valid_out_of_scope_traits: Vec<DefId>,
) -> bool {
if !valid_out_of_scope_traits.is_empty() {
let mut candidates = valid_out_of_scope_traits;
candidates.sort();
candidates.dedup();
err.help("items from traits can only be used if the trait is in scope");
let msg = format!("the following {traits_are} implemented but not in scope, \
perhaps add a `use` for {one_of_them}:",
traits_are = if candidates.len() == 1 {
"trait is"
} else {
"traits are"
},
one_of_them = if candidates.len() == 1 {
"it"
} else {
"one of them"
});
let msg = format!(
"the following {traits_are} implemented but not in scope; \
perhaps add a `use` for {one_of_them}:",
traits_are = if candidates.len() == 1 {
"trait is"
} else {
"traits are"
},
one_of_them = if candidates.len() == 1 {
"it"
} else {
"one of them"
},
);

self.suggest_use_candidates(err, msg, candidates);
true

@@ -2364,7 +2364,8 @@ fn warn_if_unreachable(&self, id: hir::HirId, span: Span, kind: &str) {
// which diverges, that we are about to lint on. This gives suboptimal diagnostics.
// Instead, stop here so that the `if`- or `while`-expression's block is linted instead.
if !span.is_desugaring(DesugaringKind::CondTemporary) &&
!span.is_desugaring(DesugaringKind::Async)
!span.is_desugaring(DesugaringKind::Async) &&
!orig_span.is_desugaring(DesugaringKind::Await)
{
self.diverges.set(Diverges::WarnedAlways);

@@ -58,6 +58,7 @@ pub fn resolve_type_vars_in_body(&self, body: &'tcx hir::Body) -> &'tcx ty::Type
wbcx.visit_free_region_map();
wbcx.visit_user_provided_tys();
wbcx.visit_user_provided_sigs();
wbcx.visit_generator_interior_types();

let used_trait_imports = mem::replace(
&mut self.tables.borrow_mut().used_trait_imports,
@@ -430,6 +431,12 @@ fn visit_user_provided_sigs(&mut self) {
}
}

fn visit_generator_interior_types(&mut self) {
let fcx_tables = self.fcx.tables.borrow();
debug_assert_eq!(fcx_tables.local_id_root, self.tables.local_id_root);
self.tables.generator_interior_types = fcx_tables.generator_interior_types.clone();
}

fn visit_opaque_types(&mut self, span: Span) {
for (&def_id, opaque_defn) in self.fcx.opaque_types.borrow().iter() {
let hir_id = self.tcx().hir().as_local_hir_id(def_id).unwrap();

@@ -26,6 +26,7 @@ pub fn from_generator<T: Generator<Yield = ()>>(x: T) -> impl Future<Output = T:
#[doc(hidden)]
#[unstable(feature = "gen_future", issue = "50547")]
#[derive(Copy, Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash)]
#[cfg_attr(not(test), rustc_diagnostic_item = "gen_future")]
struct GenFuture<T: Generator<Yield = ()>>(T);

// We rely on the fact that async/await futures are immovable in order to create
+13 -2
@@ -2139,18 +2139,29 @@ fn decode<D: Decoder>(d: &mut D) -> Result<AttrId, D::Error> {
}
}

#[derive(Clone, RustcEncodable, RustcDecodable, Debug)]
pub struct AttrItem {
pub path: Path,
pub tokens: TokenStream,
}

/// Metadata associated with an item.
/// Doc-comments are promoted to attributes that have `is_sugared_doc = true`.
#[derive(Clone, RustcEncodable, RustcDecodable, Debug)]
pub struct Attribute {
pub item: AttrItem,
pub id: AttrId,
pub style: AttrStyle,
pub path: Path,
pub tokens: TokenStream,
pub is_sugared_doc: bool,
pub span: Span,
}

// Compatibility impl to avoid churn, consider removing.
impl std::ops::Deref for Attribute {
type Target = AttrItem;
fn deref(&self) -> &Self::Target { &self.item }
}

/// `TraitRef`s appear in impls.
///
/// Resolution maps each `TraitRef`'s `ref_id` to its defining trait; that's all

+18 -12
@@ -9,7 +9,7 @@
pub use crate::ast::Attribute;

use crate::ast;
use crate::ast::{AttrId, AttrStyle, Name, Ident, Path, PathSegment};
use crate::ast::{AttrItem, AttrId, AttrStyle, Name, Ident, Path, PathSegment};
use crate::ast::{MetaItem, MetaItemKind, NestedMetaItem};
use crate::ast::{Lit, LitKind, Expr, Item, Local, Stmt, StmtKind, GenericParam};
use crate::mut_visit::visit_clobber;
@@ -255,9 +255,8 @@ pub fn is_meta_item_list(&self) -> bool {
}
}

impl Attribute {
/// Extracts the `MetaItem` from inside this `Attribute`.
pub fn meta(&self) -> Option<MetaItem> {
impl AttrItem {
crate fn meta(&self, span: Span) -> Option<MetaItem> {
let mut tokens = self.tokens.trees().peekable();
Some(MetaItem {
path: self.path.clone(),
@@ -269,9 +268,16 @@ pub fn meta(&self) -> Option<MetaItem> {
} else {
return None;
},
span: self.span,
span,
})
}
}

impl Attribute {
/// Extracts the MetaItem from inside this Attribute.
pub fn meta(&self) -> Option<MetaItem> {
self.item.meta(self.span)
}

pub fn parse<'a, T, F>(&self, sess: &'a ParseSess, mut f: F) -> PResult<'a, T>
where F: FnMut(&mut Parser<'a>) -> PResult<'a, T>,
@@ -333,10 +339,9 @@ pub fn with_desugared_doc<T, F>(&self, f: F) -> T where
DUMMY_SP,
);
f(&Attribute {
item: AttrItem { path: meta.path, tokens: meta.kind.tokens(meta.span) },
id: self.id,
style: self.style,
path: meta.path,
tokens: meta.kind.tokens(meta.span),
is_sugared_doc: true,
span: self.span,
})
@@ -384,10 +389,9 @@ pub fn mk_nested_word_item(ident: Ident) -> NestedMetaItem {

pub fn mk_attr(style: AttrStyle, path: Path, tokens: TokenStream, span: Span) -> Attribute {
Attribute {
item: AttrItem { path, tokens },
id: mk_attr_id(),
style,
path,
tokens,
is_sugared_doc: false,
span,
}
@@ -408,10 +412,12 @@ pub fn mk_sugared_doc_attr(text: Symbol, span: Span) -> Attribute {
let lit_kind = LitKind::Str(text, ast::StrStyle::Cooked);
let lit = Lit::from_lit_kind(lit_kind, span);
Attribute {
item: AttrItem {
path: Path::from_ident(Ident::with_dummy_span(sym::doc).with_span_pos(span)),
tokens: MetaItemKind::NameValue(lit).tokens(span),
},
id: mk_attr_id(),
style,
path: Path::from_ident(Ident::with_dummy_span(sym::doc).with_span_pos(span)),
tokens: MetaItemKind::NameValue(lit).tokens(span),
is_sugared_doc: true,
span,
}
@@ -524,7 +530,7 @@ fn from_tokens<I>(tokens: &mut iter::Peekable<I>) -> Option<MetaItem>
}
Some(TokenTree::Token(Token { kind: token::Interpolated(nt), .. })) => match *nt {
token::Nonterminal::NtIdent(ident, _) => Path::from_ident(ident),
token::Nonterminal::NtMeta(ref meta) => return Some(meta.clone()),
token::Nonterminal::NtMeta(ref item) => return item.meta(item.path.span),
token::Nonterminal::NtPath(ref path) => path.clone(),
_ => return None,
},

@@ -122,8 +122,8 @@ fn process_cfg_attr(&mut self, attr: ast::Attribute) -> Vec<ast::Attribute> {

while !parser.check(&token::CloseDelim(token::Paren)) {
let lo = parser.token.span.lo();
let (path, tokens) = parser.parse_meta_item_unrestricted()?;
expanded_attrs.push((path, tokens, parser.prev_span.with_lo(lo)));
let item = parser.parse_attr_item()?;
expanded_attrs.push((item, parser.prev_span.with_lo(lo)));
parser.expect_one_of(&[token::Comma], &[token::CloseDelim(token::Paren)])?;
}

@@ -150,11 +150,10 @@ fn process_cfg_attr(&mut self, attr: ast::Attribute) -> Vec<ast::Attribute> {
// `cfg_attr` inside of another `cfg_attr`. E.g.
// `#[cfg_attr(false, cfg_attr(true, some_attr))]`.
expanded_attrs.into_iter()
.flat_map(|(path, tokens, span)| self.process_cfg_attr(ast::Attribute {
.flat_map(|(item, span)| self.process_cfg_attr(ast::Attribute {
item,
id: attr::mk_attr_id(),
style: attr.style,
path,
tokens,
is_sugared_doc: false,
span,
}))
+11
-22
@@ -1,4 +1,4 @@
|
||||
use crate::ast::{self, Block, Ident, LitKind, NodeId, PatKind, Path};
|
||||
use crate::ast::{self, AttrItem, Block, Ident, LitKind, NodeId, PatKind, Path};
|
||||
use crate::ast::{MacStmtStyle, StmtKind, ItemKind};
|
||||
use crate::attr::{self, HasAttrs};
|
||||
use crate::source_map::respan;
|
||||
@@ -555,15 +555,6 @@ fn fully_configure(&mut self, item: Annotatable) -> Annotatable {
|
||||
}
|
||||
|
||||
fn expand_invoc(&mut self, invoc: Invocation, ext: &SyntaxExtensionKind) -> AstFragment {
|
||||
let (fragment_kind, span) = (invoc.fragment_kind, invoc.span());
|
||||
if fragment_kind == AstFragmentKind::ForeignItems && !self.cx.ecfg.macros_in_extern() {
|
||||
if let SyntaxExtensionKind::NonMacroAttr { .. } = ext {} else {
|
||||
emit_feature_err(&self.cx.parse_sess, sym::macros_in_extern,
|
||||
span, GateIssue::Language,
|
||||
"macro invocations in `extern {}` blocks are experimental");
|
||||
}
|
||||
}
|
||||
|
||||
if self.cx.current_expansion.depth > self.cx.ecfg.recursion_limit {
|
||||
let expn_data = self.cx.current_expansion.id.expn_data();
|
||||
let suggested_limit = self.cx.ecfg.recursion_limit * 2;
|
||||
@@ -578,6 +569,7 @@ fn expand_invoc(&mut self, invoc: Invocation, ext: &SyntaxExtensionKind) -> AstF
|
||||
FatalError.raise();
|
||||
}
|
||||
|
||||
let (fragment_kind, span) = (invoc.fragment_kind, invoc.span());
|
||||
match invoc.kind {
|
||||
InvocationKind::Bang { mac, .. } => match ext {
|
||||
SyntaxExtensionKind::Bang(expander) => {
|
||||
@@ -625,9 +617,10 @@ fn expand_invoc(&mut self, invoc: Invocation, ext: &SyntaxExtensionKind) -> AstF
|
||||
| Annotatable::Variant(..)
|
||||
=> panic!("unexpected annotatable"),
|
||||
})), DUMMY_SP).into();
|
||||
let input = self.extract_proc_macro_attr_input(attr.tokens, span);
|
||||
let input = self.extract_proc_macro_attr_input(attr.item.tokens, span);
|
||||
let tok_result = expander.expand(self.cx, span, input, item_tok);
|
||||
let res = self.parse_ast_fragment(tok_result, fragment_kind, &attr.path, span);
|
||||
let res =
|
||||
self.parse_ast_fragment(tok_result, fragment_kind, &attr.item.path, span);
|
||||
self.gate_proc_macro_expansion(span, &res);
|
||||
res
|
||||
}
|
||||
@@ -757,14 +750,14 @@ fn visit_mac(&mut self, _mac: &'ast ast::Mac) {
|
||||
|
||||
fn gate_proc_macro_expansion_kind(&self, span: Span, kind: AstFragmentKind) {
|
||||
let kind = match kind {
|
||||
AstFragmentKind::Expr => "expressions",
|
||||
AstFragmentKind::Expr |
|
||||
AstFragmentKind::OptExpr => "expressions",
|
||||
AstFragmentKind::Pat => "patterns",
|
||||
AstFragmentKind::Ty => "types",
|
||||
AstFragmentKind::Stmts => "statements",
|
||||
AstFragmentKind::Items => return,
|
||||
AstFragmentKind::TraitItems => return,
|
||||
AstFragmentKind::ImplItems => return,
|
||||
AstFragmentKind::Ty |
|
||||
AstFragmentKind::Items |
|
||||
AstFragmentKind::TraitItems |
|
||||
AstFragmentKind::ImplItems |
|
||||
AstFragmentKind::ForeignItems => return,
|
||||
AstFragmentKind::Arms
|
||||
| AstFragmentKind::Fields
|
||||
@@ -1530,11 +1523,10 @@ fn visit_attribute(&mut self, at: &mut ast::Attribute) {
|
||||
|
||||
let meta = attr::mk_list_item(Ident::with_dummy_span(sym::doc), items);
|
||||
*at = attr::Attribute {
|
||||
item: AttrItem { path: meta.path, tokens: meta.kind.tokens(meta.span) },
|
||||
span: at.span,
|
||||
id: at.id,
|
||||
style: at.style,
|
||||
path: meta.path,
|
||||
tokens: meta.kind.tokens(meta.span),
|
||||
is_sugared_doc: false,
|
||||
};
|
||||
} else {
|
||||
@@ -1578,9 +1570,6 @@ pub fn default(crate_name: String) -> ExpansionConfig<'static> {
|
||||
}
|
||||
}
|
||||
|
||||
fn macros_in_extern(&self) -> bool {
|
||||
self.features.map_or(false, |features| features.macros_in_extern)
|
||||
}
|
||||
fn proc_macro_hygiene(&self) -> bool {
|
||||
self.features.map_or(false, |features| features.proc_macro_hygiene)
|
||||
}
|
||||
|
||||
@@ -924,7 +924,7 @@ fn parse_nt(p: &mut Parser<'_>, sp: Span, name: Symbol) -> Nonterminal {
|
||||
FatalError.raise()
|
||||
}
|
||||
sym::path => token::NtPath(panictry!(p.parse_path(PathStyle::Type))),
|
||||
sym::meta => token::NtMeta(panictry!(p.parse_meta_item())),
|
||||
sym::meta => token::NtMeta(panictry!(p.parse_attr_item())),
|
||||
sym::vis => token::NtVis(panictry!(p.parse_visibility(true))),
|
||||
sym::lifetime => if p.check_lifetime() {
|
||||
token::NtLifetime(p.expect_lifetime().ident)
|
||||
|
||||
@@ -245,6 +245,8 @@ macro_rules! declare_features {
|
||||
(accepted, bind_by_move_pattern_guards, "1.39.0", Some(15287), None),
|
||||
/// Allows attributes in formal function parameters.
|
||||
(accepted, param_attrs, "1.39.0", Some(60406), None),
|
||||
// Allows macro invocations in `extern {}` blocks.
|
||||
(accepted, macros_in_extern, "1.40.0", Some(49476), None),
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// feature-group-end: accepted features
|
||||
|
||||
@@ -402,9 +402,6 @@ pub fn set(&self, features: &mut Features, span: Span) {
|
||||
/// Allows infering `'static` outlives requirements (RFC 2093).
|
||||
(active, infer_static_outlives_requirements, "1.26.0", Some(54185), None),
|
||||
|
||||
/// Allows macro invocations in `extern {}` blocks.
|
||||
(active, macros_in_extern, "1.27.0", Some(49476), None),
|
||||
|
||||
/// Allows accessing fields of unions inside `const` functions.
|
||||
(active, const_fn_union, "1.27.0", Some(51909), None),
|
||||
|
||||
|
||||
@@ -550,7 +550,8 @@ pub fn noop_visit_local<T: MutVisitor>(local: &mut P<Local>, vis: &mut T) {
|
||||
}
|
||||
|
||||
pub fn noop_visit_attribute<T: MutVisitor>(attr: &mut Attribute, vis: &mut T) {
|
||||
let Attribute { id: _, style: _, path, tokens, is_sugared_doc: _, span } = attr;
|
||||
let Attribute { item: AttrItem { path, tokens }, id: _, style: _, is_sugared_doc: _, span }
|
||||
= attr;
|
||||
vis.visit_path(path);
|
||||
vis.visit_tts(tokens);
|
||||
vis.visit_span(span);
|
||||
@@ -681,7 +682,10 @@ pub fn noop_visit_interpolated<T: MutVisitor>(nt: &mut token::Nonterminal, vis:
|
||||
token::NtIdent(ident, _is_raw) => vis.visit_ident(ident),
|
||||
token::NtLifetime(ident) => vis.visit_ident(ident),
|
||||
token::NtLiteral(expr) => vis.visit_expr(expr),
|
||||
token::NtMeta(meta) => vis.visit_meta_item(meta),
|
||||
token::NtMeta(AttrItem { path, tokens }) => {
|
||||
vis.visit_path(path);
|
||||
vis.visit_tts(tokens);
|
||||
}
|
||||
token::NtPath(path) => vis.visit_path(path),
|
||||
token::NtTT(tt) => vis.visit_tt(tt),
|
||||
token::NtImplItem(item) =>
|
||||
|
||||
+20
-16
@@ -90,7 +90,7 @@ fn parse_attribute_with_inner_parse_policy(&mut self,
         debug!("parse_attribute_with_inner_parse_policy: inner_parse_policy={:?} self.token={:?}",
                inner_parse_policy,
                self.token);
-        let (span, path, tokens, style) = match self.token.kind {
+        let (span, item, style) = match self.token.kind {
             token::Pound => {
                 let lo = self.token.span;
                 self.bump();
@@ -107,7 +107,7 @@ fn parse_attribute_with_inner_parse_policy(&mut self,
                 };

                 self.expect(&token::OpenDelim(token::Bracket))?;
-                let (path, tokens) = self.parse_meta_item_unrestricted()?;
+                let item = self.parse_attr_item()?;
                 self.expect(&token::CloseDelim(token::Bracket))?;
                 let hi = self.prev_span;

@@ -142,7 +142,7 @@ fn parse_attribute_with_inner_parse_policy(&mut self,
                     }
                 }

-                (attr_sp, path, tokens, style)
+                (attr_sp, item, style)
             }
             _ => {
                 let token_str = self.this_token_to_string();
@@ -151,10 +151,9 @@ fn parse_attribute_with_inner_parse_policy(&mut self,
         };

         Ok(ast::Attribute {
+            item,
             id: attr::mk_attr_id(),
             style,
-            path,
-            tokens,
             is_sugared_doc: false,
             span,
         })
@@ -167,19 +166,19 @@ fn parse_attribute_with_inner_parse_policy(&mut self,
     /// PATH `[` TOKEN_STREAM `]`
     /// PATH `{` TOKEN_STREAM `}`
     /// PATH
-    /// PATH `=` TOKEN_TREE
+    /// PATH `=` UNSUFFIXED_LIT
     /// The delimiters or `=` are still put into the resulting token stream.
-    pub fn parse_meta_item_unrestricted(&mut self) -> PResult<'a, (ast::Path, TokenStream)> {
-        let meta = match self.token.kind {
+    pub fn parse_attr_item(&mut self) -> PResult<'a, ast::AttrItem> {
+        let item = match self.token.kind {
             token::Interpolated(ref nt) => match **nt {
-                Nonterminal::NtMeta(ref meta) => Some(meta.clone()),
+                Nonterminal::NtMeta(ref item) => Some(item.clone()),
                 _ => None,
             },
             _ => None,
         };
-        Ok(if let Some(meta) = meta {
+        Ok(if let Some(item) = item {
             self.bump();
-            (meta.path, meta.kind.tokens(meta.span))
+            item
         } else {
             let path = self.parse_path(PathStyle::Mod)?;
             let tokens = if self.check(&token::OpenDelim(DelimToken::Paren)) ||
@@ -206,7 +205,7 @@ pub fn parse_meta_item_unrestricted(&mut self) -> PResult<'a, (ast::Path, TokenS
         } else {
             TokenStream::empty()
         };
-        (path, tokens)
+        ast::AttrItem { path, tokens }
     })
 }

@@ -263,7 +262,7 @@ fn parse_unsuffixed_lit(&mut self) -> PResult<'a, ast::Lit> {

     /// Matches the following grammar (per RFC 1559).
     ///
-    /// meta_item : IDENT ( '=' UNSUFFIXED_LIT | '(' meta_item_inner? ')' )? ;
+    /// meta_item : PATH ( '=' UNSUFFIXED_LIT | '(' meta_item_inner? ')' )? ;
     /// meta_item_inner : (meta_item | UNSUFFIXED_LIT) (',' meta_item_inner)? ;
     pub fn parse_meta_item(&mut self) -> PResult<'a, ast::MetaItem> {
         let nt_meta = match self.token.kind {
@@ -274,9 +273,14 @@ pub fn parse_meta_item(&mut self) -> PResult<'a, ast::MetaItem> {
             _ => None,
         };

-        if let Some(meta) = nt_meta {
-            self.bump();
-            return Ok(meta);
+        if let Some(item) = nt_meta {
+            return match item.meta(item.path.span) {
+                Some(meta) => {
+                    self.bump();
+                    Ok(meta)
+                }
+                None => self.unexpected(),
+            }
         }

         let lo = self.token.span;

@@ -47,7 +47,7 @@ pub fn new(sess: &'a ParseSess,
                source_file: Lrc<syntax_pos::SourceFile>,
                override_span: Option<Span>) -> Self {
         if source_file.src.is_none() {
-            sess.span_diagnostic.bug(&format!("Cannot lex source_file without source: {}",
+            sess.span_diagnostic.bug(&format!("cannot lex `source_file` without source: {}",
                                               source_file.name));
         }

@@ -18,6 +18,8 @@
 /// `Expected` for function and lambda parameter patterns.
 pub(super) const PARAM_EXPECTED: Expected = Some("parameter name");

+const WHILE_PARSING_OR_MSG: &str = "while parsing this or-pattern starting here";
+
 /// Whether or not an or-pattern should be gated when occurring in the current context.
 #[derive(PartialEq)]
 pub enum GateOr { Yes, No }
@@ -40,7 +42,7 @@ pub fn parse_pat(&mut self, expected: Expected) -> PResult<'a, P<Pat>> {
     /// Corresponds to `top_pat` in RFC 2535 and allows or-pattern at the top level.
     pub(super) fn parse_top_pat(&mut self, gate_or: GateOr) -> PResult<'a, P<Pat>> {
         // Allow a '|' before the pats (RFCs 1925, 2530, and 2535).
-        let gated_leading_vert = self.eat_or_separator() && gate_or == GateOr::Yes;
+        let gated_leading_vert = self.eat_or_separator(None) && gate_or == GateOr::Yes;
         let leading_vert_span = self.prev_span;

         // Parse the possibly-or-pattern.
@@ -63,7 +65,7 @@ pub(super) fn parse_top_pat(&mut self, gate_or: GateOr) -> PResult<'a, P<Pat>> {
     /// Parse the pattern for a function or function pointer parameter.
     /// Special recovery is provided for or-patterns and leading `|`.
     pub(super) fn parse_fn_param_pat(&mut self) -> PResult<'a, P<Pat>> {
-        self.recover_leading_vert("not allowed in a parameter pattern");
+        self.recover_leading_vert(None, "not allowed in a parameter pattern");
         let pat = self.parse_pat_with_or(PARAM_EXPECTED, GateOr::No, RecoverComma::No)?;

         if let PatKind::Or(..) = &pat.kind {
@@ -90,7 +92,7 @@ fn parse_pat_with_or(
         gate_or: GateOr,
         rc: RecoverComma,
     ) -> PResult<'a, P<Pat>> {
-        // Parse the first pattern.
+        // Parse the first pattern (`p_0`).
         let first_pat = self.parse_pat(expected)?;
         self.maybe_recover_unexpected_comma(first_pat.span, rc)?;

@@ -100,11 +102,12 @@ fn parse_pat_with_or(
             return Ok(first_pat)
         }

+        // Parse the patterns `p_1 | ... | p_n` where `n > 0`.
         let lo = first_pat.span;
         let mut pats = vec![first_pat];
-        while self.eat_or_separator() {
+        while self.eat_or_separator(Some(lo)) {
             let pat = self.parse_pat(expected).map_err(|mut err| {
-                err.span_label(lo, "while parsing this or-pattern starting here");
+                err.span_label(lo, WHILE_PARSING_OR_MSG);
                 err
             })?;
             self.maybe_recover_unexpected_comma(pat.span, rc)?;
@@ -122,11 +125,15 @@ fn parse_pat_with_or(

     /// Eat the or-pattern `|` separator.
     /// If instead a `||` token is encountered, recover and pretend we parsed `|`.
-    fn eat_or_separator(&mut self) -> bool {
+    fn eat_or_separator(&mut self, lo: Option<Span>) -> bool {
+        if self.recover_trailing_vert(lo) {
+            return false;
+        }
+
         match self.token.kind {
             token::OrOr => {
                 // Found `||`; Recover and pretend we parsed `|`.
-                self.ban_unexpected_or_or();
+                self.ban_unexpected_or_or(lo);
                 self.bump();
                 true
             }
@@ -134,16 +141,49 @@ fn eat_or_separator(&mut self) -> bool {
         }
     }

+    /// Recover if `|` or `||` is the current token and we have one of the
+    /// tokens `=>`, `if`, `=`, `:`, `;`, `,`, `]`, `)`, or `}` ahead of us.
+    ///
+    /// These tokens all indicate that we reached the end of the or-pattern
+    /// list and can now reliably say that the `|` was an illegal trailing vert.
+    /// Note that there are more tokens such as `@` for which we know that the `|`
+    /// is an illegal parse. However, the user's intent is less clear in that case.
+    fn recover_trailing_vert(&mut self, lo: Option<Span>) -> bool {
+        let is_end_ahead = self.look_ahead(1, |token| match &token.kind {
+            token::FatArrow // e.g. `a | => 0,`.
+            | token::Ident(kw::If, false) // e.g. `a | if expr`.
+            | token::Eq // e.g. `let a | = 0`.
+            | token::Semi // e.g. `let a |;`.
+            | token::Colon // e.g. `let a | :`.
+            | token::Comma // e.g. `let (a |,)`.
+            | token::CloseDelim(token::Bracket) // e.g. `let [a | ]`.
+            | token::CloseDelim(token::Paren) // e.g. `let (a | )`.
+            | token::CloseDelim(token::Brace) => true, // e.g. `let A { f: a | }`.
+            _ => false,
+        });
+        match (is_end_ahead, &self.token.kind) {
+            (true, token::BinOp(token::Or)) | (true, token::OrOr) => {
+                self.ban_illegal_vert(lo, "trailing", "not allowed in an or-pattern");
+                self.bump();
+                true
+            }
+            _ => false,
+        }
+    }
+
     /// We have parsed `||` instead of `|`. Error and suggest `|` instead.
-    fn ban_unexpected_or_or(&mut self) {
-        self.struct_span_err(self.token.span, "unexpected token `||` after pattern")
-            .span_suggestion(
-                self.token.span,
-                "use a single `|` to separate multiple alternative patterns",
-                "|".to_owned(),
-                Applicability::MachineApplicable
-            )
-            .emit();
+    fn ban_unexpected_or_or(&mut self, lo: Option<Span>) {
+        let mut err = self.struct_span_err(self.token.span, "unexpected token `||` after pattern");
+        err.span_suggestion(
+            self.token.span,
+            "use a single `|` to separate multiple alternative patterns",
+            "|".to_owned(),
+            Applicability::MachineApplicable
+        );
+        if let Some(lo) = lo {
+            err.span_label(lo, WHILE_PARSING_OR_MSG);
+        }
+        err.emit();
     }

     /// Some special error handling for the "top-level" patterns in a match arm,
@@ -198,25 +238,38 @@ fn skip_pat_list(&mut self) -> PResult<'a, ()> {
     /// Recursive possibly-or-pattern parser with recovery for an erroneous leading `|`.
     /// See `parse_pat_with_or` for details on parsing or-patterns.
     fn parse_pat_with_or_inner(&mut self) -> PResult<'a, P<Pat>> {
-        self.recover_leading_vert("only allowed in a top-level pattern");
+        self.recover_leading_vert(None, "only allowed in a top-level pattern");
         self.parse_pat_with_or(None, GateOr::Yes, RecoverComma::No)
     }

     /// Recover if `|` or `||` is here.
     /// The user is thinking that a leading `|` is allowed in this position.
-    fn recover_leading_vert(&mut self, ctx: &str) {
+    fn recover_leading_vert(&mut self, lo: Option<Span>, ctx: &str) {
         if let token::BinOp(token::Or) | token::OrOr = self.token.kind {
-            let span = self.token.span;
-            let rm_msg = format!("remove the `{}`", pprust::token_to_string(&self.token));
-
-            self.struct_span_err(span, &format!("a leading `|` is {}", ctx))
-                .span_suggestion(span, &rm_msg, String::new(), Applicability::MachineApplicable)
-                .emit();
-
+            self.ban_illegal_vert(lo, "leading", ctx);
             self.bump();
         }
     }

+    /// A `|` or possibly `||` token shouldn't be here. Ban it.
+    fn ban_illegal_vert(&mut self, lo: Option<Span>, pos: &str, ctx: &str) {
+        let span = self.token.span;
+        let mut err = self.struct_span_err(span, &format!("a {} `|` is {}", pos, ctx));
+        err.span_suggestion(
+            span,
+            &format!("remove the `{}`", pprust::token_to_string(&self.token)),
+            String::new(),
+            Applicability::MachineApplicable,
+        );
+        if let Some(lo) = lo {
+            err.span_label(lo, WHILE_PARSING_OR_MSG);
+        }
+        if let token::OrOr = self.token.kind {
+            err.note("alternatives in or-patterns are separated with `|`, not `||`");
+        }
+        err.emit();
+    }
+
     /// Parses a pattern, with a setting whether modern range patterns (e.g., `a..=b`, `a..b` are
     /// allowed).
     fn parse_pat_with_range_pat(
@@ -259,7 +312,7 @@ fn parse_pat_with_range_pat(
             self.bump();
             self.parse_pat_range_to(RangeEnd::Included(RangeSyntax::DotDotDot), "...")?
         }
-        // At this point, token != &, &&, (, [
+        // At this point, token != `&`, `&&`, `(`, `[`, `..`, `..=`, or `...`.
         _ => if self.eat_keyword(kw::Underscore) {
             // Parse _
             PatKind::Wild

@@ -114,9 +114,9 @@ pub fn parse_path(&mut self, style: PathStyle) -> PResult<'a, Path> {
     pub fn parse_path_allowing_meta(&mut self, style: PathStyle) -> PResult<'a, Path> {
         let meta_ident = match self.token.kind {
             token::Interpolated(ref nt) => match **nt {
-                token::NtMeta(ref meta) => match meta.kind {
-                    ast::MetaItemKind::Word => Some(meta.path.clone()),
-                    _ => None,
+                token::NtMeta(ref item) => match item.tokens.is_empty() {
+                    true => Some(item.path.clone()),
+                    false => None,
                 },
                 _ => None,
             },

@@ -687,7 +687,7 @@ pub enum Nonterminal {
     NtLifetime(ast::Ident),
     NtLiteral(P<ast::Expr>),
     /// Stuff inside brackets for attributes
-    NtMeta(ast::MetaItem),
+    NtMeta(ast::AttrItem),
     NtPath(ast::Path),
     NtVis(ast::Visibility),
     NtTT(TokenTree),

@@ -324,7 +324,7 @@ fn token_to_string_ext(token: &Token, convert_dollar_crate: bool) -> String {
 crate fn nonterminal_to_string(nt: &Nonterminal) -> String {
     match *nt {
         token::NtExpr(ref e) => expr_to_string(e),
-        token::NtMeta(ref e) => meta_item_to_string(e),
+        token::NtMeta(ref e) => attr_item_to_string(e),
         token::NtTy(ref e) => ty_to_string(e),
         token::NtPath(ref e) => path_to_string(e),
         token::NtItem(ref e) => item_to_string(e),
@@ -412,8 +412,8 @@ pub fn meta_list_item_to_string(li: &ast::NestedMetaItem) -> String {
     to_string(|s| s.print_meta_list_item(li))
 }

-pub fn meta_item_to_string(mi: &ast::MetaItem) -> String {
-    to_string(|s| s.print_meta_item(mi))
+fn attr_item_to_string(ai: &ast::AttrItem) -> String {
+    to_string(|s| s.print_attr_item(ai, ai.path.span))
 }

 pub fn attribute_to_string(attr: &ast::Attribute) -> String {
@@ -629,26 +629,30 @@ fn print_attribute_inline(&mut self, attr: &ast::Attribute,
                 ast::AttrStyle::Inner => self.word("#!["),
                 ast::AttrStyle::Outer => self.word("#["),
             }
-            self.ibox(0);
-            match attr.tokens.trees().next() {
-                Some(TokenTree::Delimited(_, delim, tts)) => {
-                    self.print_mac_common(
-                        Some(MacHeader::Path(&attr.path)), false, None, delim, tts, true, attr.span
-                    );
-                }
-                tree => {
-                    self.print_path(&attr.path, false, 0);
-                    if tree.is_some() {
-                        self.space();
-                        self.print_tts(attr.tokens.clone(), true);
-                    }
-                }
-            }
-            self.end();
+            self.print_attr_item(&attr.item, attr.span);
             self.word("]");
         }
     }

+    fn print_attr_item(&mut self, item: &ast::AttrItem, span: Span) {
+        self.ibox(0);
+        match item.tokens.trees().next() {
+            Some(TokenTree::Delimited(_, delim, tts)) => {
+                self.print_mac_common(
+                    Some(MacHeader::Path(&item.path)), false, None, delim, tts, true, span
+                );
+            }
+            tree => {
+                self.print_path(&item.path, false, 0);
+                if tree.is_some() {
+                    self.space();
+                    self.print_tts(item.tokens.clone(), true);
+                }
+            }
+        }
+        self.end();
+    }
+
     fn print_meta_list_item(&mut self, item: &ast::NestedMetaItem) {
         match item {
             ast::NestedMetaItem::MetaItem(ref mi) => {

@@ -1,6 +1,6 @@
 //! Attributes injected into the crate root from command line using `-Z crate-attr`.

-use syntax::ast::{self, AttrStyle};
+use syntax::ast::{self, AttrItem, AttrStyle};
 use syntax::attr::mk_attr;
 use syntax::panictry;
 use syntax::parse::{self, token, ParseSess};
@@ -15,7 +15,7 @@ pub fn inject(mut krate: ast::Crate, parse_sess: &ParseSess, attrs: &[String]) -
     );

     let start_span = parser.token.span;
-    let (path, tokens) = panictry!(parser.parse_meta_item_unrestricted());
+    let AttrItem { path, tokens } = panictry!(parser.parse_attr_item());
     let end_span = parser.token.span;
     if parser.token != token::Eof {
         parse_sess.span_diagnostic

+15
-16
@@ -884,7 +884,7 @@ pub fn get_source(&self) -> Option<&str> {
 /// A single source in the `SourceMap`.
 #[derive(Clone)]
 pub struct SourceFile {
-    /// The name of the file that the source came from, source that doesn't
+    /// The name of the file that the source came from. Source that doesn't
     /// originate from files has names between angle brackets by convention
     /// (e.g., `<anon>`).
     pub name: FileName,
@@ -922,9 +922,9 @@ fn encode<S: Encoder>(&self, s: &mut S) -> Result<(), S::Error> {
             s.emit_struct_field("name", 0, |s| self.name.encode(s))?;
             s.emit_struct_field("name_was_remapped", 1, |s| self.name_was_remapped.encode(s))?;
             s.emit_struct_field("src_hash", 2, |s| self.src_hash.encode(s))?;
-            s.emit_struct_field("start_pos", 4, |s| self.start_pos.encode(s))?;
-            s.emit_struct_field("end_pos", 5, |s| self.end_pos.encode(s))?;
-            s.emit_struct_field("lines", 6, |s| {
+            s.emit_struct_field("start_pos", 3, |s| self.start_pos.encode(s))?;
+            s.emit_struct_field("end_pos", 4, |s| self.end_pos.encode(s))?;
+            s.emit_struct_field("lines", 5, |s| {
                 let lines = &self.lines[..];
                 // Store the length.
                 s.emit_u32(lines.len() as u32)?;
@@ -970,13 +970,13 @@ fn encode<S: Encoder>(&self, s: &mut S) -> Result<(), S::Error> {

                 Ok(())
             })?;
-            s.emit_struct_field("multibyte_chars", 7, |s| {
+            s.emit_struct_field("multibyte_chars", 6, |s| {
                 self.multibyte_chars.encode(s)
             })?;
-            s.emit_struct_field("non_narrow_chars", 8, |s| {
+            s.emit_struct_field("non_narrow_chars", 7, |s| {
                 self.non_narrow_chars.encode(s)
             })?;
-            s.emit_struct_field("name_hash", 9, |s| {
+            s.emit_struct_field("name_hash", 8, |s| {
                 self.name_hash.encode(s)
             })
         })
@@ -985,7 +985,6 @@ fn encode<S: Encoder>(&self, s: &mut S) -> Result<(), S::Error> {

 impl Decodable for SourceFile {
     fn decode<D: Decoder>(d: &mut D) -> Result<SourceFile, D::Error> {
-
         d.read_struct("SourceFile", 8, |d| {
             let name: FileName = d.read_struct_field("name", 0, |d| Decodable::decode(d))?;
             let name_was_remapped: bool =
@@ -993,9 +992,9 @@ fn decode<D: Decoder>(d: &mut D) -> Result<SourceFile, D::Error> {
             let src_hash: u128 =
                 d.read_struct_field("src_hash", 2, |d| Decodable::decode(d))?;
             let start_pos: BytePos =
-                d.read_struct_field("start_pos", 4, |d| Decodable::decode(d))?;
-            let end_pos: BytePos = d.read_struct_field("end_pos", 5, |d| Decodable::decode(d))?;
-            let lines: Vec<BytePos> = d.read_struct_field("lines", 6, |d| {
+                d.read_struct_field("start_pos", 3, |d| Decodable::decode(d))?;
+            let end_pos: BytePos = d.read_struct_field("end_pos", 4, |d| Decodable::decode(d))?;
+            let lines: Vec<BytePos> = d.read_struct_field("lines", 5, |d| {
                 let num_lines: u32 = Decodable::decode(d)?;
                 let mut lines = Vec::with_capacity(num_lines as usize);

@@ -1024,18 +1023,18 @@ fn decode<D: Decoder>(d: &mut D) -> Result<SourceFile, D::Error> {
                 Ok(lines)
             })?;
             let multibyte_chars: Vec<MultiByteChar> =
-                d.read_struct_field("multibyte_chars", 7, |d| Decodable::decode(d))?;
+                d.read_struct_field("multibyte_chars", 6, |d| Decodable::decode(d))?;
             let non_narrow_chars: Vec<NonNarrowChar> =
-                d.read_struct_field("non_narrow_chars", 8, |d| Decodable::decode(d))?;
+                d.read_struct_field("non_narrow_chars", 7, |d| Decodable::decode(d))?;
             let name_hash: u128 =
-                d.read_struct_field("name_hash", 9, |d| Decodable::decode(d))?;
+                d.read_struct_field("name_hash", 8, |d| Decodable::decode(d))?;
             Ok(SourceFile {
                 name,
                 name_was_remapped,
                 unmapped_path: None,
                 // `crate_of_origin` has to be set by the importer.
-                // This value matches up with rustc::hir::def_id::INVALID_CRATE.
-                // That constant is not available here unfortunately :(
+                // This value matches up with `rustc::hir::def_id::INVALID_CRATE`.
+                // That constant is not available here, unfortunately.
                 crate_of_origin: std::u32::MAX - 1,
                 start_pos,
                 end_pos,

@@ -1,30 +0,0 @@
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:26:5
-   |
-LL |     returns_isize!(rust_get_test_int);
-   |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:28:5
-   |
-LL |     takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
-   |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:30:5
-   |
-LL |     emits_nothing!();
-   |     ^^^^^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error: aborting due to 3 previous errors
-
-For more information about this error, try `rustc --explain E0658`.
@@ -1,112 +0,0 @@
-// force-host
-// no-prefer-dynamic
-
-// Proc macros commonly used by tests.
-// `panic`/`print` -> `panic_bang`/`print_bang` to avoid conflicts with standard macros.
-
-#![crate_type = "proc-macro"]
-
-extern crate proc_macro;
-use proc_macro::TokenStream;
-
-// Macro that return empty token stream.
-
-#[proc_macro]
-pub fn empty(_: TokenStream) -> TokenStream {
-    TokenStream::new()
-}
-
-#[proc_macro_attribute]
-pub fn empty_attr(_: TokenStream, _: TokenStream) -> TokenStream {
-    TokenStream::new()
-}
-
-#[proc_macro_derive(Empty, attributes(empty_helper))]
-pub fn empty_derive(_: TokenStream) -> TokenStream {
-    TokenStream::new()
-}
-
-// Macro that panics.
-
-#[proc_macro]
-pub fn panic_bang(_: TokenStream) -> TokenStream {
-    panic!("panic-bang");
-}
-
-#[proc_macro_attribute]
-pub fn panic_attr(_: TokenStream, _: TokenStream) -> TokenStream {
-    panic!("panic-attr");
-}
-
-#[proc_macro_derive(Panic, attributes(panic_helper))]
-pub fn panic_derive(_: TokenStream) -> TokenStream {
-    panic!("panic-derive");
-}
-
-// Macros that return the input stream.
-
-#[proc_macro]
-pub fn identity(input: TokenStream) -> TokenStream {
-    input
-}
-
-#[proc_macro_attribute]
-pub fn identity_attr(_: TokenStream, input: TokenStream) -> TokenStream {
-    input
-}
-
-#[proc_macro_derive(Identity, attributes(identity_helper))]
-pub fn identity_derive(input: TokenStream) -> TokenStream {
-    input
-}
-
-// Macros that iterate and re-collect the input stream.
-
-#[proc_macro]
-pub fn recollect(input: TokenStream) -> TokenStream {
-    input.into_iter().collect()
-}
-
-#[proc_macro_attribute]
-pub fn recollect_attr(_: TokenStream, input: TokenStream) -> TokenStream {
-    input.into_iter().collect()
-}
-
-#[proc_macro_derive(Recollect, attributes(recollect_helper))]
-pub fn recollect_derive(input: TokenStream) -> TokenStream {
-    input.into_iter().collect()
-}
-
-// Macros that print their input in the original and re-collected forms (if they differ).
-
-fn print_helper(input: TokenStream, kind: &str) -> TokenStream {
-    let input_display = format!("{}", input);
-    let input_debug = format!("{:#?}", input);
-    let recollected = input.into_iter().collect();
-    let recollected_display = format!("{}", recollected);
-    let recollected_debug = format!("{:#?}", recollected);
-    println!("PRINT-{} INPUT (DISPLAY): {}", kind, input_display);
-    if recollected_display != input_display {
-        println!("PRINT-{} RE-COLLECTED (DISPLAY): {}", kind, recollected_display);
-    }
-    println!("PRINT-{} INPUT (DEBUG): {}", kind, input_debug);
-    if recollected_debug != input_debug {
-        println!("PRINT-{} RE-COLLECTED (DEBUG): {}", kind, recollected_debug);
-    }
-    recollected
-}
-
-#[proc_macro]
-pub fn print_bang(input: TokenStream) -> TokenStream {
-    print_helper(input, "BANG")
-}
-
-#[proc_macro_attribute]
-pub fn print_attr(_: TokenStream, input: TokenStream) -> TokenStream {
-    print_helper(input, "ATTR")
-}
-
-#[proc_macro_derive(Print, attributes(print_helper))]
-pub fn print_derive(input: TokenStream) -> TokenStream {
-    print_helper(input, "DERIVE")
-}
@@ -1,30 +0,0 @@
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:14:5
-   |
-LL |     #[empty_attr]
-   |     ^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:18:5
-   |
-LL |     #[identity_attr]
-   |     ^^^^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error[E0658]: macro invocations in `extern {}` blocks are experimental
-  --> $DIR/macros-in-extern.rs:22:5
-   |
-LL |     identity!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
-   |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-   |
-   = note: for more information, see https://github.com/rust-lang/rust/issues/49476
-   = help: add `#![feature(macros_in_extern)]` to the crate attributes to enable
-
-error: aborting due to 3 previous errors
-
-For more information about this error, try `rustc --explain E0658`.
@@ -1,26 +1,15 @@
 fn main() {
     f1(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     f2(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     f3(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     f4(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     f5(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     g1(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     g2(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     g3(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     g4(|_: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     h1(|_: (), _: (), _: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
     h2(|_: (), _: (), _: (), _: ()| {}); //~ ERROR type mismatch
-    //~^ ERROR type mismatch
 }

 // Basic

@@ -10,18 +10,7 @@ LL | fn f1<F>(_: F) where F: Fn(&(), &()) {}
   | -- ------------ required by this bound in `f1`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:2:5
   |
LL | f1(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&(), &()) -> _`
...
LL | fn f1<F>(_: F) where F: Fn(&(), &()) {}
   | -- ------------ required by this bound in `f1`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:4:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:3:5
   |
LL | f2(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -34,17 +23,6 @@ LL | fn f2<F>(_: F) where F: for<'a> Fn(&'a (), &()) {}
error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:4:5
   |
LL | f2(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&'a (), &()) -> _`
...
LL | fn f2<F>(_: F) where F: for<'a> Fn(&'a (), &()) {}
   | -- --------------- required by this bound in `f2`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
   |
LL | f3(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
@@ -54,18 +32,7 @@ LL | fn f3<'a, F>(_: F) where F: Fn(&'a (), &()) {}
   | -- --------------- required by this bound in `f3`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
   |
LL | f3(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&(), &()) -> _`
...
LL | fn f3<'a, F>(_: F) where F: Fn(&'a (), &()) {}
   | -- --------------- required by this bound in `f3`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:5:5
   |
LL | f4(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -76,18 +43,7 @@ LL | fn f4<F>(_: F) where F: for<'r> Fn(&(), &'r ()) {}
   | -- ----------------------- required by this bound in `f4`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
   |
LL | f4(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&(), &'r ()) -> _`
...
LL | fn f4<F>(_: F) where F: for<'r> Fn(&(), &'r ()) {}
   | -- --------------- required by this bound in `f4`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:6:5
   |
LL | f5(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -98,18 +54,7 @@ LL | fn f5<F>(_: F) where F: for<'r> Fn(&'r (), &'r ()) {}
   | -- -------------------------- required by this bound in `f5`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
   |
LL | f5(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&'r (), &'r ()) -> _`
...
LL | fn f5<F>(_: F) where F: for<'r> Fn(&'r (), &'r ()) {}
   | -- ------------------ required by this bound in `f5`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:7:5
   |
LL | g1(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -120,18 +65,7 @@ LL | fn g1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>) {}
   | -- ------------------------- required by this bound in `g1`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
   |
LL | g1(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>) -> _`
...
LL | fn g1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>) {}
   | -- ------------------------- required by this bound in `g1`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:14:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:8:5
   |
LL | g2(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -142,18 +76,7 @@ LL | fn g2<F>(_: F) where F: Fn(&(), fn(&())) {}
   | -- ---------------- required by this bound in `g2`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:14:5
   |
LL | g2(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&(), for<'r> fn(&'r ())) -> _`
...
LL | fn g2<F>(_: F) where F: Fn(&(), fn(&())) {}
   | -- ---------------- required by this bound in `g2`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:16:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:9:5
   |
LL | g3(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -164,18 +87,7 @@ LL | fn g3<F>(_: F) where F: for<'s> Fn(&'s (), Box<dyn Fn(&())>) {}
   | -- ------------------------------------ required by this bound in `g3`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:16:5
   |
LL | g3(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
   | |
   | expected signature of `fn(&'s (), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>) -> _`
...
LL | fn g3<F>(_: F) where F: for<'s> Fn(&'s (), Box<dyn Fn(&())>) {}
   | -- ---------------------------- required by this bound in `g3`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:18:5
  --> $DIR/anonymous-higher-ranked-lifetime.rs:10:5
   |
LL | g4(|_: (), _: ()| {});
   | ^^ -------------- found signature of `fn((), ()) -> _`
@@ -186,18 +98,7 @@ LL | fn g4<F>(_: F) where F: Fn(&(), for<'r> fn(&'r ())) {}
   | -- --------------------------- required by this bound in `g4`

error[E0631]: type mismatch in closure arguments
  --> $DIR/anonymous-higher-ranked-lifetime.rs:18:5
   |
LL | g4(|_: (), _: ()| {});
|
||||
| ^^ -------------- found signature of `fn((), ()) -> _`
|
||||
| |
|
||||
| expected signature of `fn(&(), for<'r> fn(&'r ())) -> _`
|
||||
...
|
||||
LL | fn g4<F>(_: F) where F: Fn(&(), for<'r> fn(&'r ())) {}
|
||||
| -- --------------------------- required by this bound in `g4`
|
||||
|
||||
error[E0631]: type mismatch in closure arguments
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:20:5
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:11:5
|
||||
|
|
||||
LL | h1(|_: (), _: (), _: (), _: ()| {});
|
||||
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
|
||||
@@ -208,18 +109,7 @@ LL | fn h1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>, &(), fn(&(), &())) {}
|
||||
| -- -------------------------------------------- required by this bound in `h1`
|
||||
|
||||
error[E0631]: type mismatch in closure arguments
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:20:5
|
||||
|
|
||||
LL | h1(|_: (), _: (), _: (), _: ()| {});
|
||||
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
|
||||
| |
|
||||
| expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>, &(), for<'r, 's> fn(&'r (), &'s ())) -> _`
|
||||
...
|
||||
LL | fn h1<F>(_: F) where F: Fn(&(), Box<dyn Fn(&())>, &(), fn(&(), &())) {}
|
||||
| -- -------------------------------------------- required by this bound in `h1`
|
||||
|
||||
error[E0631]: type mismatch in closure arguments
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:22:5
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:12:5
|
||||
|
|
||||
LL | h2(|_: (), _: (), _: (), _: ()| {});
|
||||
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
|
||||
@@ -229,16 +119,5 @@ LL | h2(|_: (), _: (), _: (), _: ()| {});
|
||||
LL | fn h2<F>(_: F) where F: for<'t0> Fn(&(), Box<dyn Fn(&())>, &'t0 (), fn(&(), &())) {}
|
||||
| -- --------------------------------------------------------- required by this bound in `h2`
|
||||
|
||||
error[E0631]: type mismatch in closure arguments
|
||||
--> $DIR/anonymous-higher-ranked-lifetime.rs:22:5
|
||||
|
|
||||
LL | h2(|_: (), _: (), _: (), _: ()| {});
|
||||
| ^^ ---------------------------- found signature of `fn((), (), (), ()) -> _`
|
||||
| |
|
||||
| expected signature of `fn(&(), std::boxed::Box<(dyn for<'r> std::ops::Fn(&'r ()) + 'static)>, &'t0 (), for<'r, 's> fn(&'r (), &'s ())) -> _`
|
||||
...
|
||||
LL | fn h2<F>(_: F) where F: for<'t0> Fn(&(), Box<dyn Fn(&())>, &'t0 (), fn(&(), &())) {}
|
||||
| -- ------------------------------------------------ required by this bound in `h2`
|
||||
|
||||
error: aborting due to 22 previous errors
|
||||
error: aborting due to 11 previous errors
|
||||
|
||||
|
||||
@@ -9,9 +9,9 @@ LL | assert_send(local_dropped_before_await());
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::rc::Rc<()>`
= note: required because it appears within the type `impl std::fmt::Debug`
= note: required because it appears within the type `{impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `{impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:21:39: 26:2 {impl std::fmt::Debug, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`

@@ -26,9 +26,9 @@ LL | assert_send(non_send_temporary_in_match());
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::rc::Rc<()>`
= note: required because it appears within the type `impl std::fmt::Debug`
= note: required because it appears within the type `{fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `{fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:28:40: 37:2 {fn(impl std::fmt::Debug) -> std::option::Option<impl std::fmt::Debug> {std::option::Option::<impl std::fmt::Debug>::Some}, fn() -> impl std::fmt::Debug {non_send}, impl std::fmt::Debug, std::option::Option<impl std::fmt::Debug>, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`

@@ -45,9 +45,9 @@ LL | assert_send(non_sync_with_method_call());
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut dyn std::fmt::Write`
= note: required because it appears within the type `std::fmt::Formatter<'_>`
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut std::fmt::Formatter<'_>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`

@@ -68,9 +68,9 @@ LL | assert_send(non_sync_with_method_call());
= note: required because of the requirements on the impl of `std::marker::Send` for `std::slice::Iter<'_, std::fmt::ArgumentV1<'_>>`
= note: required because it appears within the type `std::fmt::Formatter<'_>`
= note: required because of the requirements on the impl of `std::marker::Send` for `&mut std::fmt::Formatter<'_>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, ()}]>`
= note: required because it appears within the type `for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}`
= note: required because it appears within the type `[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]`
= note: required because it appears within the type `std::future::GenFuture<[static generator@$DIR/async-fn-nonsend.rs:39:38: 45:2 for<'r, 's> {&'r mut std::fmt::Formatter<'s>, bool, bool, fn() -> impl std::future::Future {fut}, impl std::future::Future, impl std::future::Future, ()}]>`
= note: required because it appears within the type `impl std::future::Future`
= note: required because it appears within the type `impl std::future::Future`

@@ -0,0 +1,25 @@
// edition:2018

use std::sync::Mutex;

fn is_send<T: Send>(t: T) {
}

async fn foo() {
    bar(&Mutex::new(22)).await;
}

async fn bar(x: &Mutex<u32>) {
    let g = x.lock().unwrap();
    baz().await;
}

async fn baz() {
}

fn main() {
    is_send(foo());
    //~^ ERROR `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely [E0277]
}

@@ -0,0 +1,23 @@
error[E0277]: `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely
--> $DIR/issue-64130-non-send-future-diags.rs:23:5
|
LL | fn is_send<T: Send>(t: T) {
| ------- ---- required by this bound in `is_send`
...
LL | is_send(foo());
| ^^^^^^^ `std::sync::MutexGuard<'_, u32>` cannot be sent between threads safely
|
= help: within `impl std::future::Future`, the trait `std::marker::Send` is not implemented for `std::sync::MutexGuard<'_, u32>`
note: future does not implement `std::marker::Send` as this value is used across an await
--> $DIR/issue-64130-non-send-future-diags.rs:15:5
|
LL | let g = x.lock().unwrap();
| - has type `std::sync::MutexGuard<'_, u32>`
LL | baz().await;
| ^^^^^^^^^^^ await occurs here, with `g` maybe used later
LL | }
| - `g` is later dropped here

error: aborting due to previous error

For more information about this error, try `rustc --explain E0277`.

@@ -0,0 +1,12 @@
// edition:2018
#![deny(unreachable_code)]

async fn foo() {
    return; bar().await;
    //~^ ERROR unreachable statement
}

async fn bar() {
}

fn main() { }

@@ -0,0 +1,16 @@
error: unreachable statement
--> $DIR/unreachable-lint-1.rs:5:13
|
LL | return; bar().await;
| ------ ^^^^^^^^^^^^ unreachable statement
| |
| any code following this expression is unreachable
|
note: lint level defined here
--> $DIR/unreachable-lint-1.rs:2:9
|
LL | #![deny(unreachable_code)]
| ^^^^^^^^^^^^^^^^

error: aborting due to previous error

@@ -0,0 +1,13 @@
// check-pass
// edition:2018
#![deny(unreachable_code)]

async fn foo() {
    endless().await;
}

async fn endless() -> ! {
    loop {}
}

fn main() { }

@@ -57,7 +57,7 @@ fn main() {
// check that macro expanded code works

macro_rules! if_cfg {
($cfg:meta $ib:block else $eb:block) => {
($cfg:meta? $ib:block else $eb:block) => {
{
let r;
#[cfg($cfg)]
@@ -69,7 +69,7 @@ macro_rules! if_cfg {
}
}

let n = if_cfg!(unset {
let n = if_cfg!(unset? {
413
} else {
612

@@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&Lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use Lib::TheTrait;`

error: aborting due to previous error

@@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&Lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use Lib::TheTrait;`

error: aborting due to previous error

@@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&coherence_inherent_cc_lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use coherence_inherent_cc_lib::TheTrait;`

error: aborting due to previous error

@@ -5,7 +5,7 @@ LL | s.the_fn();
| ^^^^^^ method not found in `&coherence_inherent_cc_lib::TheStruct`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use coherence_inherent_cc_lib::TheTrait;`

error: aborting due to previous error

@@ -1,27 +0,0 @@
#![feature(decl_macro)]

macro_rules! returns_isize(
    ($ident:ident) => (
        fn $ident() -> isize;
    )
);

macro takes_u32_returns_u32($ident:ident) {
    fn $ident (arg: u32) -> u32;
}

macro_rules! emits_nothing(
    () => ()
);

#[link(name = "rust_test_helpers", kind = "static")]
extern {
    returns_isize!(rust_get_test_int);
    //~^ ERROR macro invocations in `extern {}` blocks are experimental
    takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
    //~^ ERROR macro invocations in `extern {}` blocks are experimental
    emits_nothing!();
    //~^ ERROR macro invocations in `extern {}` blocks are experimental
}

fn main() {}

@@ -1,30 +0,0 @@
error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:19:5
|
LL | returns_isize!(rust_get_test_int);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable

error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:21:5
|
LL | takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable

error[E0658]: macro invocations in `extern {}` blocks are experimental
--> $DIR/feature-gate-macros_in_extern.rs:23:5
|
LL | emits_nothing!();
| ^^^^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/49476
= help: add `#![feature(macros_in_extern)]` to the crate attributes to enable

error: aborting due to 3 previous errors

For more information about this error, try `rustc --explain E0658`.

@@ -25,7 +25,7 @@ LL | ().clone()
| ^^^^^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use std::clone::Clone;`

error: aborting due to 3 previous errors

@@ -8,7 +8,7 @@ LL | pub macro m() { ().f() }
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use foo::T;`

error: aborting due to previous error

@@ -5,7 +5,7 @@ LL | 1u32.method();
| ^^^^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following traits are implemented but not in scope, perhaps add a `use` for one of them:
help: the following traits are implemented but not in scope; perhaps add a `use` for one of them:
|
LL | use foo::Bar;
|
@@ -23,7 +23,7 @@ LL | std::rc::Rc::new(&mut Box::new(&1u32)).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&u32>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following traits are implemented but not in scope, perhaps add a `use` for one of them:
help: the following traits are implemented but not in scope; perhaps add a `use` for one of them:
|
LL | use foo::Bar;
|
@@ -41,7 +41,7 @@ LL | 'a'.method();
| ^^^^^^ method not found in `char`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::Bar;
|
@@ -61,7 +61,7 @@ LL | std::rc::Rc::new(&mut Box::new(&'a')).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&char>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::Bar;
|
@@ -73,7 +73,7 @@ LL | 1i32.method();
| ^^^^^^ method not found in `i32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use no_method_suggested_traits::foo::PubPub;
|
@@ -85,7 +85,7 @@ LL | std::rc::Rc::new(&mut Box::new(&1i32)).method();
| ^^^^^^ method not found in `std::rc::Rc<&mut std::boxed::Box<&i32>>`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use no_method_suggested_traits::foo::PubPub;
|

@@ -5,7 +5,7 @@ LL | b.foo();
| ^^^ method not found in `&b::B`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use a::A;`

error: aborting due to previous error

@@ -8,7 +8,7 @@ help: possible better candidate is found in another module, you can import it in
LL | use std::hash::Hash;
|

warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default. Only `?Sized` is supported
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default; only `?Sized` is supported
--> $DIR/issue-37534.rs:1:12
|
LL | struct Foo<T: ?Hash> { }

@@ -5,7 +5,7 @@ LL | Command::new("echo").arg("hello").exec();
| ^^^^ method not found in `&mut std::process::Command`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use std::os::unix::process::CommandExt;
|

@@ -5,7 +5,7 @@ LL | ().a();
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use xcrate_issue_43189_b::xcrate_issue_43189_a::A;
|

@@ -0,0 +1,6 @@
trait Foo {
    fn foo([a, b]: [i32; 2]) {}
    //~^ ERROR: patterns aren't allowed in methods without bodies
}

fn main() {}

@@ -0,0 +1,13 @@
error[E0642]: patterns aren't allowed in methods without bodies
--> $DIR/issue-50571.rs:2:12
|
LL | fn foo([a, b]: [i32; 2]) {}
| ^^^^^^
help: give this argument a name or use an underscore to ignore it
|
LL | fn foo(_: [i32; 2]) {}
| ^

error: aborting due to previous error

For more information about this error, try `rustc --explain E0642`.

@@ -0,0 +1,18 @@
pub trait Foo: Sized {
    const SIZE: usize;

    fn new(slice: &[u8; Foo::SIZE]) -> Self;
    //~^ ERROR: type annotations needed: cannot resolve `_: Foo`
}

pub struct Bar<T: ?Sized>(T);

impl Bar<[u8]> {
    const SIZE: usize = 32;

    fn new(slice: &[u8; Self::SIZE]) -> Self {
        Foo(Box::new(*slice)) //~ ERROR: expected function, found trait `Foo`
    }
}

fn main() {}

@@ -0,0 +1,19 @@
error[E0423]: expected function, found trait `Foo`
--> $DIR/issue-58022.rs:14:9
|
LL | Foo(Box::new(*slice))
| ^^^ not a function

error[E0283]: type annotations needed: cannot resolve `_: Foo`
--> $DIR/issue-58022.rs:4:25
|
LL | const SIZE: usize;
| ------------------ required by `Foo::SIZE`
LL |
LL | fn new(slice: &[u8; Foo::SIZE]) -> Self;
| ^^^^^^^^^

error: aborting due to 2 previous errors

Some errors have detailed explanations: E0283, E0423.
For more information about an error, try `rustc --explain E0283`.

@@ -0,0 +1,50 @@
use std::ops::Add;

trait Trait<T> {
    fn get(self) -> T;
}

struct Holder<T>(T);

impl<T> Trait<T> for Holder<T> {
    fn get(self) -> T {
        self.0
    }
}

enum Either<L, R> {
    Left(L),
    Right(R),
}

impl<L, R> Either<L, R> {
    fn converge<T>(self) -> T where L: Trait<T>, R: Trait<T> {
        match self {
            Either::Left(val) => val.get(),
            Either::Right(val) => val.get(),
        }
    }
}

fn add_generic<A: Add<B>, B>(lhs: A, rhs: B) -> Either<
    impl Trait<<A as Add<B>>::Output>,
    impl Trait<<A as Add<B>>::Output>
> {
    if true {
        Either::Left(Holder(lhs + rhs))
    } else {
        Either::Right(Holder(lhs + rhs))
    }
}

fn add_one(
    value: u32,
) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
    //~^ ERROR: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>`
    //~| ERROR: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>`
    add_generic(value, 1u32)
}

pub fn main() {
    add_one(3).converge();
}

@@ -0,0 +1,19 @@
error[E0277]: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>` is not satisfied
--> $DIR/issue-58344.rs:42:13
|
LL | ) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `Trait<u32>` is not implemented for `impl Trait<<u32 as std::ops::Add>::Output>`
|
= note: the return type of a function must have a statically known size

error[E0277]: the trait bound `impl Trait<<u32 as std::ops::Add>::Output>: Trait<u32>` is not satisfied
--> $DIR/issue-58344.rs:42:52
|
LL | ) -> Either<impl Trait<<u32 as Add<u32>>::Output>, impl Trait<<u32 as Add<u32>>::Output>> {
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `Trait<u32>` is not implemented for `impl Trait<<u32 as std::ops::Add>::Output>`
|
= note: the return type of a function must have a statically known size

error: aborting due to 2 previous errors

For more information about this error, try `rustc --explain E0277`.

@@ -1,5 +1,3 @@
#![feature(macros_in_extern)]

macro_rules! m {
() => {
let //~ ERROR expected

@@ -1,5 +1,5 @@
error: expected one of `crate`, `fn`, `pub`, `static`, or `type`, found `let`
--> $DIR/issue-54441.rs:5:9
--> $DIR/issue-54441.rs:3:9
|
LL | let
| ^^^ unexpected token

@@ -252,12 +252,6 @@ macro_rules! test_path {
test_path!(std::u8,);
test_path!(any, super, super::super::self::path, X<Y>::Z<'a, T=U>);

macro_rules! test_meta_block {
($($m:meta)* $b:block) => {};
}

test_meta_block!(windows {});

macro_rules! test_lifetime {
(1. $($l:lifetime)* $($b:block)*) => {};
(2. $($b:block)* $($l:lifetime)*) => {};

@@ -0,0 +1,11 @@
// check-pass

macro_rules! check { ($meta:meta) => () }

check!(meta(a b c d));
check!(meta[a b c d]);
check!(meta { a b c d });
check!(meta);
check!(meta = 0);

fn main() {}

@@ -1,30 +0,0 @@
// run-pass
// ignore-wasm32

#![feature(decl_macro, macros_in_extern)]

macro_rules! returns_isize(
($ident:ident) => (
fn $ident() -> isize;
)
);

macro takes_u32_returns_u32($ident:ident) {
fn $ident (arg: u32) -> u32;
}

macro_rules! emits_nothing(
() => ()
);

fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 1isize);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEFu32);
}

#[link(name = "rust_test_helpers", kind = "static")]
extern {
returns_isize!(rust_get_test_int);
takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
emits_nothing!();
}
+17 -4
@@ -1,3 +1,4 @@
// run-pass
// ignore-wasm32

#![feature(decl_macro)]
@@ -16,17 +17,29 @@ macro_rules! emits_nothing(
() => ()
);

macro_rules! emits_multiple(
() => {
fn f1() -> u32;
fn f2() -> u32;
}
);

mod defs {
#[no_mangle] extern fn f1() -> u32 { 1 }
#[no_mangle] extern fn f2() -> u32 { 2 }
}

fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 0isize);
assert_eq!(unsafe { rust_get_test_int() }, 1);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEFu32);
assert_eq!(unsafe { f1() }, 1);
assert_eq!(unsafe { f2() }, 2);
}

#[link(name = "rust_test_helpers", kind = "static")]
extern {
returns_isize!(rust_get_test_int);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
takes_u32_returns_u32!(rust_dbg_extern_identity_u32);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
emits_nothing!();
//~^ ERROR macro invocations in `extern {}` blocks are experimental
emits_multiple!();
}
@@ -34,7 +34,7 @@ error[E0203]: type parameter has more than one relaxed default bound, only one i
LL | struct S5<T>(*const T) where T: ?Trait<'static> + ?Sized;
| ^

warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default. Only `?Sized` is supported
warning: default bound relaxed for a type parameter, but this does nothing because the given bound is not a default; only `?Sized` is supported
--> $DIR/maybe-bounds-where.rs:15:11
|
LL | struct S5<T>(*const T) where T: ?Trait<'static> + ?Sized;
@@ -7,5 +7,4 @@ fn main() {
once::<&str>("str").fuse().filter(|a: &str| true).count();
//~^ ERROR no method named `count`
//~| ERROR type mismatch in closure arguments
//~| ERROR type mismatch in closure arguments
}
@@ -16,14 +16,6 @@ LL | once::<&str>("str").fuse().filter(|a: &str| true).count();
| |
| expected signature of `for<'r> fn(&'r &str) -> _`

error[E0631]: type mismatch in closure arguments
--> $DIR/issue-36053-2.rs:7:32
|
LL | once::<&str>("str").fuse().filter(|a: &str| true).count();
| ^^^^^^ -------------- found signature of `for<'r> fn(&'r str) -> _`
| |
| expected signature of `fn(&&str) -> _`

error: aborting due to 3 previous errors
error: aborting due to 2 previous errors

For more information about this error, try `rustc --explain E0599`.
@@ -0,0 +1,15 @@
// In this regression test we check that a trailing `|` in an or-pattern just
// before the `if` token of a `match` guard will receive parser recovery with
// an appropriate error message.

enum E { A, B }

fn main() {
match E::A {
E::A |
E::B | //~ ERROR a trailing `|` is not allowed in an or-pattern
if true => {
let recovery_witness: bool = 0; //~ ERROR mismatched types
}
}
}
@@ -0,0 +1,20 @@
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/issue-64879-trailing-before-guard.rs:10:14
|
LL | E::A |
| ---- while parsing this or-pattern starting here
LL | E::B |
| ^ help: remove the `|`

error[E0308]: mismatched types
--> $DIR/issue-64879-trailing-before-guard.rs:12:42
|
LL | let recovery_witness: bool = 0;
| ^ expected bool, found integer
|
= note: expected type `bool`
found type `{integer}`

error: aborting due to 2 previous errors

For more information about this error, try `rustc --explain E0308`.
@@ -2,37 +2,49 @@ error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:8:15
|
LL | 1 | 2 || 3 => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:13:16
|
LL | (1 | 2 || 3) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:18:16
|
LL | (1 | 2 || 3,) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:25:18
|
LL | TS(1 | 2 || 3) => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:32:23
|
LL | NS { f: 1 | 2 || 3 } => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:37:16
|
LL | [1 | 2 || 3] => (),
| ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/multiple-pattern-typo.rs:42:9
@@ -51,24 +51,32 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let ( || A | B) = E::A;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:48:11
|
LL | let [ || A | B ] = [E::A];
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:49:13
|
LL | let TS( || A | B );
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/or-patterns-syntactic-fail.rs:50:17
|
LL | let NS { f: || A | B };
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: no rules expected the token `|`
--> $DIR/or-patterns-syntactic-fail.rs:14:15
@@ -1,4 +1,4 @@
// Test the suggestion to remove a leading `|`.
// Test the suggestion to remove a leading, or trailing `|`.

// run-rustfix

@@ -8,7 +8,7 @@
fn main() {}

#[cfg(FALSE)]
fn leading_vert() {
fn leading() {
fn fun1( A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
fn fun2( A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
let ( A): E; //~ ERROR a leading `|` is only allowed in a top-level pattern
@@ -21,3 +21,26 @@ fn leading_vert() {
let NS { f: A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
let NS { f: A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
}

#[cfg(FALSE)]
fn trailing() {
let ( A ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let (a ,): (E,); //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let [ A | B ]: [E; 1]; //~ ERROR a trailing `|` is not allowed in an or-pattern
let S { f: B }; //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B ): E; //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
match A {
A => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A | B => {} //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
| A | B => {}
//~^ ERROR a trailing `|` is not allowed in an or-pattern
}

let a : u8 = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a ; //~ ERROR a trailing `|` is not allowed in an or-pattern
}
@@ -1,4 +1,4 @@
// Test the suggestion to remove a leading `|`.
// Test the suggestion to remove a leading, or trailing `|`.

// run-rustfix

@@ -8,7 +8,7 @@
fn main() {}

#[cfg(FALSE)]
fn leading_vert() {
fn leading() {
fn fun1( | A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
fn fun2( || A: E) {} //~ ERROR a leading `|` is not allowed in a parameter pattern
let ( | A): E; //~ ERROR a leading `|` is only allowed in a top-level pattern
@@ -21,3 +21,26 @@ fn fun2( || A: E) {} //~ ERROR a leading `|` is not allowed in a parameter patte
let NS { f: | A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
let NS { f: || A }: NS; //~ ERROR a leading `|` is only allowed in a top-level pattern
}

#[cfg(FALSE)]
fn trailing() {
let ( A | ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let (a |,): (E,); //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A | B | ): E; //~ ERROR a trailing `|` is not allowed in an or-pattern
let [ A | B | ]: [E; 1]; //~ ERROR a trailing `|` is not allowed in an or-pattern
let S { f: B | }; //~ ERROR a trailing `|` is not allowed in an or-pattern
let ( A || B | ): E; //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
match A {
A | => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A || => {} //~ ERROR a trailing `|` is not allowed in an or-pattern
A || B | => {} //~ ERROR unexpected token `||` after pattern
//~^ ERROR a trailing `|` is not allowed in an or-pattern
| A | B | => {}
//~^ ERROR a trailing `|` is not allowed in an or-pattern
}

let a | : u8 = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a | = 0; //~ ERROR a trailing `|` is not allowed in an or-pattern
let a | ; //~ ERROR a trailing `|` is not allowed in an or-pattern
}
@@ -9,6 +9,8 @@ error: a leading `|` is not allowed in a parameter pattern
|
LL | fn fun2( || A: E) {}
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:14:11
@@ -21,6 +23,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let ( || A): (E);
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:16:11
@@ -39,6 +43,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let [ || A ]: [E; 1];
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:19:13
@@ -51,6 +57,8 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let TS( || A ): TS;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: a leading `|` is only allowed in a top-level pattern
--> $DIR/remove-leading-vert.rs:21:17
@@ -63,6 +71,130 @@ error: a leading `|` is only allowed in a top-level pattern
|
LL | let NS { f: || A }: NS;
| ^^ help: remove the `||`
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: aborting due to 11 previous errors
error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:27:13
|
LL | let ( A | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:28:12
|
LL | let (a |,): (E,);
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:29:17
|
LL | let ( A | B | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:30:17
|
LL | let [ A | B | ]: [E; 1];
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:31:18
|
LL | let S { f: B | };
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: unexpected token `||` after pattern
--> $DIR/remove-leading-vert.rs:32:13
|
LL | let ( A || B | ): E;
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:32:18
|
LL | let ( A || B | ): E;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:35:11
|
LL | A | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:36:11
|
LL | A || => {}
| - ^^ help: remove the `||`
| |
| while parsing this or-pattern starting here
|
= note: alternatives in or-patterns are separated with `|`, not `||`

error: unexpected token `||` after pattern
--> $DIR/remove-leading-vert.rs:37:11
|
LL | A || B | => {}
| - ^^ help: use a single `|` to separate multiple alternative patterns: `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:37:16
|
LL | A || B | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:39:17
|
LL | | A | B | => {}
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:43:11
|
LL | let a | : u8 = 0;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:44:11
|
LL | let a | = 0;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: a trailing `|` is not allowed in an or-pattern
--> $DIR/remove-leading-vert.rs:45:11
|
LL | let a | ;
| - ^ help: remove the `|`
| |
| while parsing this or-pattern starting here

error: aborting due to 26 previous errors
@@ -1,26 +0,0 @@
// force-host
// no-prefer-dynamic

#![crate_type = "proc-macro"]

extern crate proc_macro;

use proc_macro::TokenStream;

#[proc_macro_attribute]
pub fn nop_attr(_attr: TokenStream, input: TokenStream) -> TokenStream {
assert!(_attr.to_string().is_empty());
input
}

#[proc_macro_attribute]
pub fn no_output(_attr: TokenStream, _input: TokenStream) -> TokenStream {
assert!(_attr.to_string().is_empty());
assert!(!_input.to_string().is_empty());
"".parse().unwrap()
}

#[proc_macro]
pub fn emit_input(input: TokenStream) -> TokenStream {
input
}
@@ -7,8 +7,6 @@
// normalize-stdout-test "bytes\([^0]\w*\.\.(\w+)\)" -> "bytes(LO..$1)"
// normalize-stdout-test "bytes\((\w+)\.\.[^0]\w*\)" -> "bytes($1..HI)"

#![feature(proc_macro_hygiene)]

#[macro_use]
extern crate test_macros;
extern crate dollar_crate_external;

@@ -1,7 +1,5 @@
// aux-build:lifetimes.rs

#![feature(proc_macro_hygiene)]

extern crate lifetimes;

use lifetimes::*;
@@ -1,5 +1,5 @@
error: expected type, found `'`
--> $DIR/lifetimes.rs:9:10
--> $DIR/lifetimes.rs:7:10
|
LL | type A = single_quote_alone!();
| ^^^^^^^^^^^^^^^^^^^^^ this macro call doesn't expand to a type
@@ -0,0 +1,6 @@
extern {
#[derive(Copy)] //~ ERROR `derive` may only be applied to structs, enums and unions
fn f();
}

fn main() {}
@@ -0,0 +1,8 @@
error: `derive` may only be applied to structs, enums and unions
--> $DIR/macros-in-extern-derive.rs:2:5
|
LL | #[derive(Copy)]
| ^^^^^^^^^^^^^^^

error: aborting due to previous error
@@ -1,25 +0,0 @@
// run-pass
// aux-build:test-macros-rpass.rs
// ignore-wasm32

#![feature(macros_in_extern)]

extern crate test_macros_rpass as test_macros;

use test_macros::{nop_attr, no_output, emit_input};

fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 1isize);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEF);
}

#[link(name = "rust_test_helpers", kind = "static")]
extern {
#[no_output]
fn some_definitely_unknown_symbol_which_should_be_removed();

#[nop_attr]
fn rust_get_test_int() -> isize;

emit_input!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
}
+2 -4
@@ -1,3 +1,4 @@
// run-pass
// aux-build:test-macros.rs
// ignore-wasm32

@@ -5,20 +6,17 @@
extern crate test_macros;

fn main() {
assert_eq!(unsafe { rust_get_test_int() }, 0isize);
assert_eq!(unsafe { rust_get_test_int() }, 1);
assert_eq!(unsafe { rust_dbg_extern_identity_u32(0xDEADBEEF) }, 0xDEADBEEF);
}

#[link(name = "rust_test_helpers", kind = "static")]
extern {
#[empty_attr]
//~^ ERROR macro invocations in `extern {}` blocks are experimental
fn some_definitely_unknown_symbol_which_should_be_removed();

#[identity_attr]
//~^ ERROR macro invocations in `extern {}` blocks are experimental
fn rust_get_test_int() -> isize;

identity!(fn rust_dbg_extern_identity_u32(arg: u32) -> u32;);
//~^ ERROR macro invocations in `extern {}` blocks are experimental
}
@@ -0,0 +1,11 @@
// check-pass
// aux-build:test-macros.rs

#[macro_use]
extern crate test_macros;

const C: identity!(u8) = 10;

fn main() {
let c: u8 = C;
}
@@ -50,7 +50,6 @@ fn attrs() {
}

fn main() {
let _x: identity!(u32) = 3; //~ ERROR: procedural macros cannot be expanded to types
if let identity!(Some(_x)) = Some(3) {}
//~^ ERROR: procedural macros cannot be expanded to patterns
@@ -94,17 +94,8 @@ LL | let _x = #[identity_attr] println!();
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to types
--> $DIR/proc-macro-gates.rs:53:13
|
LL | let _x: identity!(u32) = 3;
| ^^^^^^^^^^^^^^
|
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to patterns
--> $DIR/proc-macro-gates.rs:54:12
--> $DIR/proc-macro-gates.rs:53:12
|
LL | if let identity!(Some(_x)) = Some(3) {}
| ^^^^^^^^^^^^^^^^^^^
@@ -113,7 +104,7 @@ LL | if let identity!(Some(_x)) = Some(3) {}
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to statements
--> $DIR/proc-macro-gates.rs:57:5
--> $DIR/proc-macro-gates.rs:56:5
|
LL | empty!(struct S;);
| ^^^^^^^^^^^^^^^^^^
@@ -122,7 +113,7 @@ LL | empty!(struct S;);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to statements
--> $DIR/proc-macro-gates.rs:58:5
--> $DIR/proc-macro-gates.rs:57:5
|
LL | empty!(let _x = 3;);
| ^^^^^^^^^^^^^^^^^^^^
@@ -131,7 +122,7 @@ LL | empty!(let _x = 3;);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to expressions
--> $DIR/proc-macro-gates.rs:60:14
--> $DIR/proc-macro-gates.rs:59:14
|
LL | let _x = identity!(3);
| ^^^^^^^^^^^^
@@ -140,7 +131,7 @@ LL | let _x = identity!(3);
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error[E0658]: procedural macros cannot be expanded to expressions
--> $DIR/proc-macro-gates.rs:61:15
--> $DIR/proc-macro-gates.rs:60:15
|
LL | let _x = [empty!(3)];
| ^^^^^^^^^
@@ -148,6 +139,6 @@ LL | let _x = [empty!(3)];
= note: for more information, see https://github.com/rust-lang/rust/issues/54727
= help: add `#![feature(proc_macro_hygiene)]` to the crate attributes to enable

error: aborting due to 17 previous errors
error: aborting due to 16 previous errors

For more information about this error, try `rustc --explain E0658`.
@@ -5,7 +5,7 @@ LL | x.foobar();
| ^^^^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
= note: the following trait is implemented but not in scope, perhaps add a `use` for it:
= note: the following trait is implemented but not in scope; perhaps add a `use` for it:
`use crate::foo::foobar::Foobar;`

error[E0599]: no method named `bar` found for type `u32` in the current scope
@@ -15,7 +15,7 @@ LL | x.bar();
| ^^^ method not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use crate::foo::Bar;
|
@@ -33,7 +33,7 @@ LL | let y = u32::from_str("33");
| ^^^^^^^^ function or associated item not found in `u32`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use std::str::FromStr;
|

@@ -5,7 +5,7 @@ LL | ().f()
| ^ method not found in `()`
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope, perhaps add a `use` for it:
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
LL | use foo::T;
|
Some files were not shown because too many files have changed in this diff